1

Distributed Random Set Theoretic Soft/Hard Data Fusion

Khaleghi, Bahador January 2012 (has links)
Research on multisensor data fusion aims at providing the enabling technology to combine information from several sources in order to form a unified picture. The literature on fusion of conventional data provided by non-human (hard) sensors is vast and well-established. In comparison to conventional fusion systems, where input data are generated by calibrated electronic sensor systems with well-defined characteristics, research on soft data fusion considers combining human-based data expressed preferably in unconstrained natural language form. Fusion of soft and hard data is even more challenging, yet necessary in some applications, and has received little attention in the past. Being a rather new area of research, soft/hard data fusion is still in a fledgling stage, with even its challenging problems yet to be adequately defined and explored. This dissertation develops a framework to enable fusion of both soft and hard data, with Random Set (RS) theory as the underlying mathematical foundation. Random set theory is an emerging theory within the data fusion community that, due to its powerful representational and computational capabilities, is gaining increasing attention among data fusion researchers. Motivated by the unique characteristics of random set theory and the main challenge of soft/hard data fusion systems, i.e. the need for a unifying framework capable of processing both unconventional soft data and conventional hard data, this dissertation argues in favor of a random set theoretic approach as the first step towards realizing a soft/hard data fusion framework. Several challenging problems related to soft/hard fusion systems are addressed in the proposed framework. First, an extension of the well-known Kalman filter within random set theory, called the Kalman evidential filter (KEF), is adopted as a common data processing framework for both soft and hard data. Second, a novel ontology (syntax + semantics) is developed to allow modeling of soft (human-generated) data, assuming target tracking as the application. Third, as soft/hard data fusion is mostly aimed at large networks of information processing, a new approach is proposed to enable distributed estimation of soft as well as hard data, addressing the scalability requirement of such fusion systems. Fourth, a method for modeling trust in human agents is developed, which enables the fusion system to protect itself from erroneous or misleading soft data by discounting such data on the fly. Fifth, leveraging recent developments in the RS theoretic data fusion literature, a novel soft data association algorithm is developed and deployed to extend the proposed target tracking framework to the multi-target case. Finally, the multi-target tracking framework is complemented by introducing a distributed classification approach applicable to target classes described with soft, human-generated data. In addition, this dissertation presents a novel data-centric taxonomy of data fusion methodologies. In particular, several categories of fusion algorithms are identified and discussed based on the data-related challenges they address. It is intended to provide the reader with a generic and comprehensive view of the contemporary data fusion literature, and could also serve as a reference for data fusion practitioners by offering design guidelines, in terms of algorithm choice, for the specific data-related challenges expected in a given application.
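As a rough illustration of the trust-discounting idea mentioned in the abstract, the minimal Python sketch below applies classical Shafer discounting to a human report expressed as a belief mass function and then fuses it with a hard-sensor report via Dempster's rule. The frame of discernment, the trust factor and the mass values are invented for illustration; the dissertation's Kalman evidential filter and random set machinery are considerably richer than this.

```python
# Minimal sketch (not the dissertation's KEF): reliability discounting of a soft
# (human) report expressed as a belief mass function, followed by Dempster's rule
# to fuse it with a hard-sensor mass function. Frame and numbers are illustrative.

from itertools import product

FRAME = frozenset({"car", "truck"})          # toy frame of discernment

def discount(mass, alpha):
    """Shafer discounting: scale all focal masses by the trust factor alpha,
    moving the remaining mass onto total ignorance (the full frame)."""
    out = {A: alpha * m for A, m in mass.items() if A != FRAME}
    out[FRAME] = 1.0 - alpha + alpha * mass.get(FRAME, 0.0)
    return out

def dempster(m1, m2):
    """Dempster's rule of combination on set-valued focal elements."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + a * b
        else:
            conflict += a * b
    return {C: v / (1.0 - conflict) for C, v in combined.items()}

# A soft report "it is a car", held with limited trust, fused with a
# hard-sensor report that slightly favours "truck".
soft = {frozenset({"car"}): 0.9, FRAME: 0.1}
hard = {frozenset({"car"}): 0.4, frozenset({"truck"}): 0.5, FRAME: 0.1}

fused = dempster(discount(soft, alpha=0.6), hard)
for focal, m in fused.items():
    print(set(focal), round(m, 3))
```

Lowering the trust factor alpha pushes the soft report's mass towards ignorance, so an unreliable human source has correspondingly less influence on the fused result.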
2

Non-deterministic analysis of slope stability based on numerical simulation

Shen, Hong 02 October 2012 (has links) (PDF)
In geotechnical engineering, uncertainties such as the variability inherent in geotechnical properties have attracted increasing attention from researchers and engineers, who have found that a single "Factor of Safety" calculated by traditional deterministic analysis methods cannot represent slope stability exactly. Recently, in order to provide a more rational mathematical framework for incorporating different types of uncertainties into the slope stability estimation, reliability analyses and non-deterministic methods, which include probabilistic and non-probabilistic (imprecise) methods, have been applied widely. In short, slope non-deterministic analysis combines a probabilistic or non-probabilistic analysis with the deterministic slope stability analysis. It cannot be regarded as a completely new slope stability analysis method, but rather as an extension of the deterministic analysis, and the slope failure probability it yields complements the safety factor. Therefore, the accuracy of a non-deterministic analysis depends not only on selecting a suitable probabilistic or non-probabilistic analysis method, but also on adopting a rigorous deterministic analysis method and geological model. In this thesis, reliability concepts are reviewed first, and some typical non-deterministic methods, including Monte Carlo Simulation (MCS), the First Order Reliability Method (FORM), the Point Estimate Method (PEM) and Random Set Theory (RST), are described and successfully applied to slope stability analysis based on a numerical simulation method, the Strength Reduction Method (SRM). All of the analyses have been performed in the commercial finite difference code FLAC and the distinct element code UDEC. First, as the foundation of slope reliability analysis, the deterministic numerical simulation method has been improved. It offers higher accuracy than conventional limit equilibrium methods, because the constitutive relationship of the soil is considered and fewer assumptions about the boundary conditions of the slope model are necessary. However, the construction of slope numerical models, particularly large and complicated ones, has always been difficult and has become an obstacle to the application of numerical simulation. In this study, the spatial analysis capabilities of Geographic Information System (GIS) software are used to support numerical modeling of the slope: the topographic map of the slope is gridded in the GIS, and the GIS data are then transferred into FLAC through the program's built-in language FISH. The feasibility and efficiency of this technique are illustrated through a case study of the Xuecheng slope, for which both 2D and 3D models have been investigated. Subsequently, the three most widely used probabilistic analysis methods, Monte Carlo Simulation, the First Order Reliability Method and the Point Estimate Method, applied together with the Strength Reduction Method, have been studied. Monte Carlo Simulation, which requires thousands of repeated deterministic analyses, is the most accurate probabilistic method, but it is too time-consuming for practical applications, especially when combined with a numerical simulation method. To reduce the computational effort, a simplified Monte Carlo Simulation-Strength Reduction Method (MCS-SRM) has been developed in this study.
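To make the Monte Carlo idea concrete, the minimal sketch below runs a plain Monte Carlo slope analysis. Since the FLAC-based Strength Reduction Method cannot be reproduced here, a closed-form infinite-slope factor of safety stands in for the deterministic step, and all geometry and soil statistics are hypothetical.

```python
# Minimal Monte Carlo sketch of a probabilistic slope analysis. The thesis drives
# FLAC's Strength Reduction Method for every realisation; here a closed-form
# infinite-slope factor of safety stands in for that deterministic step.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical slope geometry and soil statistics
beta  = np.radians(30.0)        # slope angle
gamma = 19.0                    # soil unit weight (kN/m^3)
z     = 5.0                     # depth of the slip surface (m)
c_mean, c_std     = 10.0, 3.0   # cohesion (kPa)
phi_mean, phi_std = 28.0, 3.0   # friction angle (deg)

def lognormal_params(mean, std):
    """Convert a target mean/std into the underlying normal's (mu, sigma)."""
    sigma2 = np.log(1.0 + (std / mean) ** 2)
    return np.log(mean) - 0.5 * sigma2, np.sqrt(sigma2)

def factor_of_safety(c, phi_deg):
    """Infinite-slope FS for a dry slope (stand-in for the SRM analysis)."""
    phi = np.radians(phi_deg)
    return (c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)) / (
        gamma * z * np.sin(beta) * np.cos(beta))

n = 100_000
c   = rng.lognormal(*lognormal_params(c_mean, c_std), n)   # keeps cohesion positive
phi = rng.normal(phi_mean, phi_std, n)

fs = factor_of_safety(c, phi)
print("mean FS   :", round(float(fs.mean()), 3))
print("std of FS :", round(float(fs.std()), 3))
print("P(FS < 1) :", round(float((fs < 1.0).mean()), 4))
```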
This method first estimates the probable failure of the slope and calculates the mean value of the safety factor from the mean soil parameters, and then derives the variance of the safety factor and the reliability of the slope from an assumed probability density function of the safety factor. Case studies have confirmed that this method can reduce the computation time by about four fifths compared with the traditional MCS-SRM while maintaining almost the same accuracy. The First Order Reliability Method is an approximate method based on the Taylor series expansion of the performance function. A closed-form solution of the partial derivatives of the performance function is normally required to calculate the mean and standard deviation of the safety factor; however, there is no explicit performance function in a numerical simulation, so in this study the derivative expressions have been replaced with equivalent difference quotients that approximate the differential quotients. The Point Estimate Method is also an approximate method, involving even fewer calculations than FORM, and in the present study it has been integrated with the Strength Reduction Method directly. Another important issue concerns the correlation between the soil parameters cohesion and friction angle. Some authors have found a negative correlation between the cohesion and friction angle of soil on the basis of experimental data, yet few probabilistic slope studies in the literature consider this negative correlation. In this thesis, the influence of this correlation on the slope probability of failure has been investigated based on the numerical simulation method. It was found that accounting for a negative correlation between cohesion and friction angle reduces the variability of the safety factor and the failure probability of the slope, thus increasing the reliability of the results. Besides being cross-correlated, soil parameters are also auto-correlated in space, which is described as spatial variability. Because knowledge of this characteristic is rather limited in the literature, it is ignored by most researchers and engineers in geotechnical engineering. In this thesis, the random field method has been introduced into the slope numerical simulation to represent the spatial variability structure, and a numerical procedure for probabilistic slope stability analysis based on Monte Carlo simulation is presented. Soil properties such as cohesion and friction angle were discretized into random fields using the local averaging method. In the case study, both stationary and non-stationary random fields have been investigated, and the influence of spatial variability and of the averaging domain on the convergence of the numerical simulation and on the probability of failure was studied. In rock media, structural faces have a very important influence on slope stability, and in the distinct element method the rock mass can be modeled as an assembly of rigid or deformable blocks separated by joints. Many more input parameters, such as joint strengths, are therefore required for a rock slope model, which increases the uncertainty of the numerical results. Furthermore, because of the limitations of current laboratory and in-situ tests, exact values of the geotechnical parameters of rock material, and even the probability distributions of these variables, are often lacking; most of the time, engineers can only estimate intervals for these variables from limited tests or expert experience.
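The sketch below illustrates, under the same hypothetical infinite-slope stand-in, how derivative expressions can be replaced by central difference quotients in a first-order (mean-value) estimate of the safety-factor statistics, and how a negative correlation between cohesion and friction angle reduces the resulting variance. It is not the thesis's FORM implementation, and the correlation coefficient is assumed.

```python
# Sketch of first-order (mean-value) uncertainty propagation with the partial
# derivatives of FS replaced by central difference quotients, in the spirit of the
# thesis's treatment of FORM where no explicit performance function exists.

import numpy as np

beta, gamma, z = np.radians(30.0), 19.0, 5.0    # hypothetical slope as before

def factor_of_safety(c, phi_deg):
    phi = np.radians(phi_deg)
    return (c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)) / (
        gamma * z * np.sin(beta) * np.cos(beta))

def numerical_gradient(f, x, rel_step=1e-3):
    """Central-difference approximation of df/dx_i at the point x."""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        h = rel_step * max(abs(x[i]), 1.0)
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        grad[i] = (f(xp) - f(xm)) / (2.0 * h)
    return grad

means  = np.array([10.0, 28.0])     # mean cohesion (kPa) and friction angle (deg)
sigmas = np.array([3.0, 3.0])       # standard deviations

fs_at = lambda x: factor_of_safety(x[0], x[1])
grad = numerical_gradient(fs_at, means)

for rho in (0.0, -0.5):             # uncorrelated vs. negatively correlated c and phi
    cov = np.diag(sigmas ** 2)
    cov[0, 1] = cov[1, 0] = rho * sigmas[0] * sigmas[1]
    var_fs = grad @ cov @ grad      # first-order estimate of Var(FS)
    print(f"rho = {rho:+.1f}: mean FS = {fs_at(means):.3f}, std FS = {np.sqrt(var_fs):.3f}")
```

Because FS increases with both cohesion and friction angle, the cross term in the first-order variance is negative when the correlation coefficient is negative, which is why the spread of the safety factor shrinks.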
In this study, to assess the reliability of rock slopes, a Random Set Distinct Element Method (RS-DEM) has been developed by coupling Random Set Theory with the Distinct Element Method, and applied to a rock slope in Sichuan Province, China.
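As a minimal illustration of the random set idea behind RS-DEM, the sketch below propagates interval-valued joint parameters through a simple planar-sliding factor of safety that stands in for the distinct element model, and accumulates belief and plausibility bounds on the probability of failure. All focal intervals, masses and block dimensions are hypothetical.

```python
# Minimal random set propagation sketch (not RS-DEM itself): interval-valued focal
# elements for the joint friction angle and joint cohesion are pushed through a
# simple planar-sliding factor of safety, and the belief and plausibility of
# failure (FS < 1) are accumulated.

from itertools import product
import numpy as np

def fs_plane_sliding(c_j, phi_j_deg, W=1000.0, A=20.0, alpha_deg=30.0):
    """FS of a rigid block sliding on a single plane dipping at alpha_deg."""
    alpha = np.radians(alpha_deg)
    resisting = c_j * A + W * np.cos(alpha) * np.tan(np.radians(phi_j_deg))
    return resisting / (W * np.sin(alpha))

# Random sets: (interval, probability mass) pairs, e.g. elicited from experts
phi_set = [((15.0, 22.0), 0.4), ((26.0, 35.0), 0.6)]   # joint friction angle (deg)
c_set   = [((0.0, 5.0), 0.5), ((5.0, 15.0), 0.5)]      # joint cohesion (kPa)

belief = plausibility = 0.0
for (phi_int, m_phi), (c_int, m_c) in product(phi_set, c_set):
    m = m_phi * m_c                     # mass of the joint focal element
    # FS increases monotonically with both parameters, so its extreme values over
    # the focal element occur at the interval endpoints (vertex method).
    fs_low  = fs_plane_sliding(c_int[0], phi_int[0])
    fs_high = fs_plane_sliding(c_int[1], phi_int[1])
    if fs_high < 1.0:                   # the whole focal element implies failure
        belief += m
    if fs_low < 1.0:                    # the focal element is compatible with failure
        plausibility += m

print(f"bounds on P(failure): [{belief:.3f}, {plausibility:.3f}]")
```

With interval estimates like these, the failure probability is only bounded rather than pinned down, which is exactly the kind of imprecise statement random sets are meant to deliver when only limited test data or expert judgement is available.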
3

Modelling of input data uncertainty based on random set theory for evaluation of the financial feasibility for hydropower projects / Modellierung unscharfer Eingabeparameter zur Wirtschaftlichkeitsuntersuchung von Wasserkraftprojekten basierend auf Random Set Theorie

Beisler, Matthias Werner 24 August 2011 (has links) (PDF)
The design of hydropower projects requires a comprehensive planning process in order to maximise exploitation of the existing hydropower potential as well as the future revenues of the plant. For this purpose, and to satisfy approval requirements for a complex hydropower development, it is imperative at the planning stage that the conceptual development contemplate a wide range of influencing design factors and ensure appropriate consideration of all related aspects. Since the majority of technical and economic parameters required for detailed and final design cannot be precisely determined at early planning stages, crucial design parameters such as design discharge and hydraulic head have to be examined through an extensive optimisation process. One disadvantage inherent in commonly used deterministic analysis is the lack of objectivity in the selection of input parameters; moreover, it cannot be ensured that the entire existing parameter ranges and all possible parameter combinations are covered. Probabilistic methods utilise discrete probability distributions or parameter input ranges to cover the entire range of uncertainties resulting from the information deficit during the planning phase and integrate them into the optimisation by means of an alternative calculation method. The investigated method assists with the mathematical assessment of uncertainties and their integration into the rational economic appraisal of complex infrastructure projects. The assessment includes an exemplary verification of the extent to which Random Set Theory can be utilised for the determination of input parameters relevant to the optimisation of hydropower projects, and evaluates possible improvements with respect to the accuracy and suitability of the calculated results. / The design of hydropower plants is a complex planning process whose goal is to exploit the available hydropower potential as fully as possible and to maximise the future economic returns of the plant. To achieve this while ensuring that a complex hydropower project remains permittable, it is imperative to capture the many factors that influence the conceptual design and to take them adequately into account during the project planning phase. In early planning stages, most of the technical and economic parameters that are decisive for detailed design cannot be determined exactly, so governing design parameters of the plant, such as discharge and head, must pass through an extensive optimisation process. A drawback of common deterministic calculation approaches is the usually insufficient objectivity in determining the input parameters, together with the fact that it cannot be guaranteed that the parameters are captured over their full ranges and in all relevant combinations. Probabilistic methods use input parameters in the form of statistical distributions or ranges, with the aim of mathematically capturing the uncertainties that result from the unavoidable information deficit of the planning phase and incorporating them into the calculation by means of an alternative computation method. The investigated approach helps to capture, objectively and mathematically, the imprecision that results from this information deficit in the economic appraisal of complex infrastructure projects and to incorporate it into the planning process. An assessment and exemplary verification is carried out of the extent to which the Random Set method can be applied to determine the input variables relevant to the optimisation of hydropower plants, and of the improvements in accuracy and informative value of the calculation results that can be achieved.
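To illustrate how random sets of input parameters can carry through to the financial appraisal, the minimal sketch below propagates interval-valued design discharge and net head through a simplified power-energy-NPV chain and reports bounds on the expected net present value. All focal intervals, masses, prices and costs are invented for illustration; the thesis's appraisal model is far more detailed.

```python
# Minimal sketch of random-set propagation through a simplified hydropower
# feasibility chain (power -> annual energy -> NPV). All figures are hypothetical.

from itertools import product

RHO_G = 9.81            # rho*g, giving power in kW for Q in m^3/s and H in m
ETA   = 0.88            # overall plant efficiency
HOURS = 4500.0          # equivalent full-load hours per year
PRICE = 0.06            # electricity price (EUR/kWh)
CAPEX = 90e6            # investment cost (EUR)
RATE, YEARS = 0.06, 40  # discount rate and economic lifetime

def npv(q, h):
    """Net present value of a run-of-river scheme for a given discharge and head."""
    power_kw = RHO_G * q * h * ETA
    annual_revenue = power_kw * HOURS * PRICE
    annuity = (1.0 - (1.0 + RATE) ** -YEARS) / RATE
    return annual_revenue * annuity - CAPEX

# Random sets for design discharge (m^3/s) and net head (m): (interval, mass)
q_set = [((35.0, 45.0), 0.6), ((40.0, 55.0), 0.4)]
h_set = [((60.0, 70.0), 0.7), ((55.0, 65.0), 0.3)]

npv_lower = npv_upper = 0.0
for (q_int, m_q), (h_int, m_h) in product(q_set, h_set):
    m = m_q * m_h
    # NPV increases monotonically with both Q and H, so its extreme values over each
    # focal element are attained at the interval endpoints.
    npv_lower += m * npv(q_int[0], h_int[0])
    npv_upper += m * npv(q_int[1], h_int[1])

print(f"expected NPV bounds: [{npv_lower/1e6:.1f}, {npv_upper/1e6:.1f}] million EUR")
```

A negative lower bound combined with a positive upper bound would indicate that, given the current information deficit, the scheme's financial feasibility cannot yet be decided and further investigation of the governing parameters is warranted.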
