1

Distributed Random Set Theoretic Soft/Hard Data Fusion

Khaleghi, Bahador January 2012
Research on multisensor data fusion aims at providing the enabling technology to combine information from several sources in order to form a unified picture. The literature on fusion of conventional data provided by non-human (hard) sensors is vast and well-established. In comparison to conventional fusion systems, where input data are generated by calibrated electronic sensor systems with well-defined characteristics, research on soft data fusion considers combining human-based data expressed preferably in unconstrained natural language form. Fusion of soft and hard data is even more challenging, yet necessary in some applications, and has received little attention in the past. Being a rather new area of research, soft/hard data fusion is still in a fledgling stage, with even its challenging problems yet to be adequately defined and explored. This dissertation develops a framework to enable fusion of both soft and hard data with Random Set (RS) theory as the underlying mathematical foundation. Random set theory is an emerging theory within the data fusion community that, due to its powerful representational and computational capabilities, is gaining more and more attention among data fusion researchers. Motivated by the unique characteristics of random set theory and the main challenge of soft/hard data fusion systems, i.e. the need for a unifying framework capable of processing both unconventional soft data and conventional hard data, this dissertation argues in favor of a random set theoretic approach as the first step towards realizing a soft/hard data fusion framework. Several challenging problems related to soft/hard fusion systems are addressed in the proposed framework. First, an extension of the well-known Kalman filter within random set theory, called the Kalman evidential filter (KEF), is adopted as a common data processing framework for both soft and hard data. Second, a novel ontology (syntax + semantics) is developed to allow for modeling soft (human-generated) data, assuming target tracking as the application. Third, as soft/hard data fusion is mostly aimed at large networks of information processing, a new approach is proposed to enable distributed estimation of soft, as well as hard, data, addressing the scalability requirement of such fusion systems. Fourth, a method for modeling trust in the human agents is developed, which enables the fusion system to protect itself from erroneous/misleading soft data by discounting such data on the fly. Fifth, leveraging recent developments in the RS theoretic data fusion literature, a novel soft data association algorithm is developed and deployed to extend the proposed target tracking framework to the multi-target tracking case. Finally, the multi-target tracking framework is complemented by introducing a distributed classification approach applicable to target classes described with soft human-generated data. In addition, this dissertation presents a novel data-centric taxonomy of data fusion methodologies. In particular, several categories of fusion algorithms have been identified and discussed based on the data-related challenging aspect(s) addressed. It is intended to provide the reader with a generic and comprehensive view of the contemporary data fusion literature, which could also serve as a reference for data fusion practitioners by providing them with conducive design guidelines, in terms of algorithm choice, regarding the specific data-related challenges expected in a given application.
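For readers unfamiliar with the discounting idea mentioned above, the sketch below (Python) shows the generic Shafer reliability-discounting rule on which this kind of on-the-fly down-weighting of soft reports is usually based. It illustrates only the general mechanism, not the dissertation's KEF-based implementation, and the frame, report and trust value are invented.

# Generic Shafer discounting of a basic belief assignment by a trust factor alpha;
# the weight removed from each focal set is transferred to the full frame (ignorance).
def discount(mass, alpha, frame):
    theta = frozenset(frame)
    out = {}
    for focal, m in mass.items():
        out[focal] = out.get(focal, 0.0) + alpha * m
    out[theta] = out.get(theta, 0.0) + (1.0 - alpha)
    return out

# A soft report "the target is most likely in region A" from an agent trusted at 0.7.
frame = {"A", "B", "C"}
report = {frozenset({"A"}): 0.8, frozenset({"A", "B"}): 0.2}
print(discount(report, alpha=0.7, frame=frame))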
2

Sharp Concentration of Hitting Size for Random Set Systems

D. Jamieson, Jessie, Godbole, Anant, Jamieson, William, Petito, Lucia 01 May 2015
Consider the random set system (Formula presented.), where (Formula presented.) and A_j is selected with probability p = p_n. A set H ⊆ [n] is said to be a hitting set for (Formula presented.). The second moment method is used to exhibit the sharp concentration of the minimal size of H for a variety of values of p.
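A small Monte Carlo sketch (Python) of the concentration phenomenon described above. The random model used here, K subsets of [n] with each element included independently with probability p, is an assumption made for illustration and need not match the formulas elided from the abstract.

import itertools, random, collections

def min_hitting_set_size(sets, n):
    # brute-force smallest k such that some k-subset of [n] meets every set (fine for small n)
    for k in range(n + 1):
        for H in itertools.combinations(range(n), k):
            Hs = set(H)
            if all(Hs & A for A in sets):
                return k
    return n

def trial(n=12, K=20, p=0.3):
    sets = []
    for _ in range(K):
        A = {i for i in range(n) if random.random() < p}
        if A:                       # empty sets cannot be hit; skip them
            sets.append(A)
    return min_hitting_set_size(sets, n)

counts = collections.Counter(trial() for _ in range(200))
print(sorted(counts.items()))       # the minimal hitting size clusters on one or two values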
3

Náhodné uzavřené množiny a procesy částic / Random closed sets and particle processes

Stroganov, Vladimír January 2014
In this thesis we are concerned with the representation of random closed sets in R^d whose values are concentrated on a space U_X of locally finite unions of sets from a given class X ⊂ F. We examine the existence of representations by particle processes on the same space X that preserve invariance under those rigid motions to which the initial random set was invariant. We discuss the existence of such representations for selected practically applicable spaces X: we go through the known results for convex sets and introduce new proofs for the cases of sets with positive reach and of smooth k-dimensional submanifolds. Besides that, we present a series of general results related to the representation of random U_X sets.
4

Náhodné měřitelné množiny a procesy částic / Random measurable sets and particle processes

Jurčo, Adam January 2021
In this thesis we deal with particle processes on more general spaces. First we introduce the space of Lebesgue measurable sets, represented by indicator functions, with the topology given by L^1_loc convergence. We then explore the topological properties of this space and of its subspaces of sets of finite and locally finite perimeter. As these spaces do not satisfy the usual topological assumptions needed for the construction of point processes, we use another approach based on measure-theoretic assumptions. This allows us to define point processes given by finite-dimensional distributions on measurable subsets of the space of Lebesgue-measurable sets. We then derive a formula for the volume fraction of a Boolean process defined in this more general setting. Further, we introduce a Boolean process with particles of finite perimeter and derive a formula for its specific perimeter.
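As a point of reference for the volume-fraction result mentioned above, the classical formula for a stationary Boolean model, p = 1 − exp(−λ E|grain|), can be checked with a short simulation (Python). The disc-shaped grains and parameter values below are illustrative assumptions, not taken from the thesis.

import math
import numpy as np

lam, r, L = 0.2, 1.0, 20.0                 # germ intensity, disc radius, observation window size
rng = np.random.default_rng(0)

def covered_fraction():
    lo, hi = -r, L + r                     # enlarged germ window avoids edge effects
    n = rng.poisson(lam * (hi - lo) ** 2)
    germs = rng.uniform(lo, hi, size=(n, 2))
    xs = np.linspace(0.5, L - 0.5, 40)
    X, Y = np.meshgrid(xs, xs)
    pts = np.column_stack([X.ravel(), Y.ravel()])
    d2 = ((pts[:, None, :] - germs[None, :, :]) ** 2).sum(-1)
    return (d2.min(axis=1) <= r * r).mean()

est = np.mean([covered_fraction() for _ in range(50)])
print(est, 1 - math.exp(-lam * math.pi * r ** 2))   # simulated vs theoretical fraction (about 0.47)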
5

Modeling Wireless Networks for Rate Control

Ripplinger, David C. 22 July 2011
Congestion control algorithms for wireless networks are often designed based on a model of the wireless network and its corresponding network utility maximization (NUM) problem. The NUM problem is important to researchers and industry because the wireless medium is a scarce resource, and currently operating protocols such as 802.11 often result in extremely unfair allocation of data rates. The NUM approach offers a systematic framework to build rate control protocols that guarantee fair, optimal rates. However, classical models used with the NUM approach do not incorporate partial carrier sensing and interference, which can lead to significantly suboptimal performance when actually deployed. We quantify the potential performance loss of the classical controllers by developing a new model for wireless networks, called the first-principles model, that accounts for partial carrier sensing and interference. The first-principles model reduces to the classical models precisely when these partial effects are ignored. Because the classical models can only describe a subset of the topologies described by the first-principles model, the score for the first-principles model gives an upper bound on the performance of the others. This gives us a systematic tool to determine when the classical controllers perform well and when they do not. We construct several representative topologies and report numerical results on the scores obtained by each controller and the first-principles optimal score.
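A minimal instance of the NUM formulation discussed above, maximizing the sum of logarithmic (proportionally fair) utilities subject to link-capacity constraints, can be solved in a few lines of Python. The three-flow, two-link topology is an invented toy case, not one of the thesis's representative topologies.

import numpy as np
from scipy.optimize import minimize

A = np.array([[1, 1, 0],      # link 1 carries flows 0 and 1
              [1, 0, 1]])     # link 2 carries flows 0 and 2
c = np.array([1.0, 1.0])      # link capacities

objective = lambda x: -np.sum(np.log(x))                       # maximize sum of log utilities
constraints = [{"type": "ineq", "fun": lambda x: c - A @ x}]   # capacities must not be exceeded
res = minimize(objective, x0=np.full(3, 0.1), bounds=[(1e-6, None)] * 3,
               constraints=constraints, method="SLSQP")
print(np.round(res.x, 3))     # about [0.333, 0.667, 0.667]: the two-link flow gets the smaller rate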
6

Estimation of the probability and uncertainty of undesirable events in large-scale systems / Estimation de la probabilité et l'incertitude des événements indésirables des grands systèmes

Hou, Yunhui 31 March 2016
Our research objective is to build frameworks representing both aleatory and epistemic uncertainties, based on probabilistic and uncertainty-theoretic approaches, to compare these methods, and to find the proper applications for them in large-scale systems with rare events. In this thesis, an asymptotic normality method is proposed with Monte Carlo simulation for binary systems, as well as a semi-Markov model for dynamic multistate systems. We also apply random set theory as a basic model to evaluate system reliability and other performance indices of binary and multistate systems with a bootstrap technique.
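A minimal sketch (Python) of the Monte Carlo / asymptotic-normality idea for a binary system: estimate a small failure probability and attach a normal-approximation confidence interval. The 2-out-of-3 system and the component failure probability are invented for illustration.

import math, random

def system_fails(q=0.01):
    # binary 2-out-of-3 system: it fails when at least two of its three components fail
    return sum(random.random() < q for _ in range(3)) >= 2

N = 200_000
p_hat = sum(system_fails() for _ in range(N)) / N
se = math.sqrt(p_hat * (1 - p_hat) / N)                  # asymptotic-normality standard error
print(f"estimate {p_hat:.2e}, 95% CI [{p_hat - 1.96*se:.2e}, {p_hat + 1.96*se:.2e}]")
print(f"exact    {3 * 0.01**2 * 0.99 + 0.01**3:.2e}")    # analytical value for comparison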
7

Modelling of input data uncertainty based on random set theory for evaluation of the financial feasibility for hydropower projects / Modellierung unscharfer Eingabeparameter zur Wirtschaftlichkeitsuntersuchung von Wasserkraftprojekten basierend auf Random Set Theorie

Beisler, Matthias Werner 24 August 2011
The design of hydropower projects requires a comprehensive planning process in order to achieve the objective of maximising exploitation of the existing hydropower potential as well as the future revenues of the plant. For this purpose, and to satisfy the approval requirements for a complex hydropower development, it is imperative at the planning stage that the conceptual development contemplates a wide range of influencing design factors and ensures appropriate consideration of all related aspects. Since the majority of technical and economical parameters required for the detailed and final design cannot be precisely determined at early planning stages, crucial design parameters such as design discharge and hydraulic head have to be examined through an extensive optimisation process. One disadvantage inherent to commonly used deterministic analysis is the lack of objectivity in the selection of input parameters. Moreover, it cannot be ensured that the entire existing parameter ranges and all possible parameter combinations are covered. Probabilistic methods utilise discrete probability distributions or parameter input ranges to cover the entire range of uncertainties resulting from an information deficit during the planning phase and integrate them into the optimisation by means of an alternative calculation method. The investigated method assists with the mathematical assessment and integration of uncertainties into the rational economic appraisal of complex infrastructure projects. The assessment includes an exemplary verification of the extent to which Random Set Theory can be utilised for the determination of input parameters relevant for the optimisation of hydropower projects, and evaluates possible improvements with respect to the accuracy and suitability of the calculated results.
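A minimal sketch (Python) of random set (Dempster-Shafer style) propagation of interval-valued inputs through a financial model, of the general kind investigated in the thesis. The net-present-value model, focal intervals and masses below are illustrative assumptions only.

from itertools import product

def npv(annual_revenue, rate, invest, years=30):
    # present value of a constant annual revenue over `years`, minus the initial investment
    return -invest + annual_revenue * (1 - (1 + rate) ** -years) / rate

# each input is a random set: a list of (interval, probability mass) focal elements
revenue = [((8.0, 10.0), 0.6), ((6.0, 12.0), 0.4)]     # M EUR per year
rate    = [((0.05, 0.07), 0.7), ((0.04, 0.09), 0.3)]   # discount rate
invest  = [((70.0, 90.0), 1.0)]                        # M EUR

focal_npv = []
for (r, mr), (i, mi), (c, mc) in product(revenue, rate, invest):
    lo = npv(r[0], i[1], c[1])                         # NPV is increasing in revenue,
    hi = npv(r[1], i[0], c[0])                         # decreasing in rate and investment
    focal_npv.append(((lo, hi), mr * mi * mc))

bel = sum(m for (lo, hi), m in focal_npv if lo > 0)    # certainly profitable
pl  = sum(m for (lo, hi), m in focal_npv if hi > 0)    # possibly profitable
print(f"Bel(NPV > 0) = {bel:.2f}, Pl(NPV > 0) = {pl:.2f}")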
8

Non-deterministic analysis of slope stability based on numerical simulation

Shen, Hong 02 October 2012
In geotechnical engineering, uncertainties such as the variability and uncertainty inherent in geotechnical properties have attracted more and more attention from researchers and engineers. They have found that a single "Factor of Safety" calculated by traditional deterministic analysis methods cannot represent slope stability exactly. Recently, in order to provide a more rational mathematical framework to incorporate different types of uncertainties in the slope stability estimation, reliability analyses and non-deterministic methods, which include probabilistic and non-probabilistic (imprecise) methods, have been applied widely. In short, slope non-deterministic analysis combines probabilistic or non-probabilistic analysis with deterministic slope stability analysis. It cannot be regarded as a completely new slope stability analysis method, but rather as an extension of deterministic slope analysis. The slope failure probability calculated by non-deterministic analysis is a kind of complement to the safety factor. Therefore, the accuracy of non-deterministic analysis depends not only on selecting a suitable probabilistic or non-probabilistic analysis method, but also on adopting a rigorous deterministic analysis method or geological model. In this thesis, reliability concepts are reviewed first, and some typical non-deterministic methods, including Monte Carlo Simulation (MCS), the First Order Reliability Method (FORM), the Point Estimate Method (PEM) and Random Set Theory (RST), are described and successfully applied to slope stability analysis based on a numerical simulation method, the Strength Reduction Method (SRM). All of the analyses have been performed in the commercial finite difference code FLAC and the distinct element code UDEC.

First of all, as the foundation of slope reliability analysis, the deterministic numerical simulation method has been improved. This method has a higher accuracy than conventional limit equilibrium methods, because the constitutive relationship of the soil is considered and fewer assumptions on the boundary conditions of the slope model are necessary. However, the construction of slope numerical models, particularly large and complicated ones, has always been very difficult and has become an obstacle to the application of numerical simulation methods. In this study, the excellent spatial analysis functionality of the Geographic Information System (GIS) technique has been introduced to assist numerical modeling of the slope. In the modeling process, the topographic map of the slope is gridded using GIS software, and the GIS data are then transferred into FLAC through the program's built-in language FISH. Finally, the feasibility and high efficiency of this technique are illustrated through a case study, the Xuecheng slope, for which both 2D and 3D models have been investigated.

Subsequently, the three most widely used probabilistic analysis methods, Monte Carlo Simulation, the First Order Reliability Method and the Point Estimate Method, have been studied in combination with the Strength Reduction Method. Monte Carlo Simulation, which requires thousands of repeated deterministic analyses, is the most accurate probabilistic method. However, it is too time-consuming for practical applications, especially when combined with a numerical simulation method. To reduce the computational effort, a simplified Monte Carlo Simulation-Strength Reduction Method (MCS-SRM) has been developed in this study. This method first estimates the probable failure of the slope and calculates the mean value of the safety factor from the soil parameters, and then calculates the variance of the safety factor and the reliability of the slope according to an assumed probability density function of the safety factor. Case studies have confirmed that this method can reduce the computation time by about four fifths compared with the traditional MCS-SRM, while maintaining almost the same accuracy. The First Order Reliability Method is an approximate method based on the Taylor series expansion of the performance function. A closed-form solution for the partial derivatives of the performance function is needed to calculate the mean and standard deviation of the safety factor. However, there is no explicit performance function in a numerical simulation method, so in this study the derivatives have been replaced with equivalent difference quotients and evaluated approximately. The Point Estimate Method is also an approximate method, involving even fewer calculations than FORM. In the present study, it has been integrated with the Strength Reduction Method directly.

Another important observation concerns the correlation between the soil parameters cohesion and friction angle. Some authors have found a negative correlation between the cohesion and friction angle of soil on the basis of experimental data. However, few probabilistic slope studies in the literature consider this negative correlation between soil parameters. In this thesis, the influence of this correlation on the slope probability of failure has been investigated based on the numerical simulation method. It was found that considering a negative correlation between the cohesion and friction angle of the soil reduces the variability of the safety factor and the failure probability of the slope, thus increasing the reliability of the results.

Besides being inter-correlated, soil parameters are also auto-correlated in space, which is described as spatial variability. Because knowledge of this characteristic is rather limited in the literature, it is ignored by most researchers and engineers in geotechnical engineering. In this thesis, the random field method has been introduced into slope numerical simulation to model the spatial variability structure, and a numerical procedure for probabilistic slope stability analysis based on Monte Carlo simulation is presented. Soil properties such as cohesion and friction angle were discretized into continuous random fields based on the local averaging method. In the case study, both stationary and non-stationary random fields have been investigated, and the influence of spatial variability and of the averaging domain on the convergence of the numerical simulation and on the probability of failure was studied.

In a rock medium, the structural faces have a very important influence on slope stability, and the rock mass can be modeled in the distinct element method as a combination of rigid or deformable blocks with joints. Therefore, many more input parameters, such as the strength of the joints, are required for the rock slope model, which increases the uncertainty of the numerical results. Furthermore, because of the limitations of current laboratory and in-situ tests, exact values of the geotechnical parameters of rock material are often lacking, let alone the probability distributions of these variables. Most of the time, engineers can only estimate intervals for these variables from limited tests or expert experience. In this study, to assess the reliability of rock slopes, a Random Set Distinct Element Method (RS-DEM) has been developed by coupling Random Set Theory with the Distinct Element Method, and applied to a rock slope in Sichuan Province, China.
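As a toy illustration of the reported effect of a negative cohesion/friction-angle correlation, the following Monte Carlo sketch (Python) uses the analytical infinite-slope factor of safety instead of the thesis's FLAC/UDEC strength-reduction models; the parameter values and the assumed normal distributions are illustrative only.

import numpy as np

gamma, z, beta = 19.0, 5.0, np.radians(30.0)     # unit weight (kN/m3), failure depth (m), slope angle
mean = np.array([10.0, 30.0])                    # mean cohesion (kPa) and friction angle (deg)
sd   = np.array([3.0, 4.0])                      # standard deviations

def failure_stats(rho, n=200_000, seed=0):
    rng = np.random.default_rng(seed)
    cov = np.diag(sd) @ np.array([[1.0, rho], [rho, 1.0]]) @ np.diag(sd)
    c, phi = rng.multivariate_normal(mean, cov, size=n).T
    c, phi = np.clip(c, 0.1, None), np.clip(phi, 1.0, None)      # keep parameters physical
    # infinite-slope factor of safety for a dry cohesive-frictional slope
    fs = c / (gamma * z * np.sin(beta) * np.cos(beta)) + np.tan(np.radians(phi)) / np.tan(beta)
    return fs.std(), (fs < 1.0).mean()

for rho in (0.0, -0.5):
    std, pf = failure_stats(rho)
    print(f"rho = {rho:+.1f}: std(FS) = {std:.3f}, P(FS < 1) = {pf:.4f}")
# the negatively correlated case shows a smaller spread of FS and a lower failure probability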
