About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.

Multilevel Design Optimization and the Effect of Epistemic Uncertainty

Nesbit, Benjamin Edward, 13 December 2014
This work presents the state of the art in hierarchically decomposed multilevel optimization, and extends it by incorporating evidence theory into the multilevel framework to quantify epistemic uncertainty. The resulting method, Evidence-Based Multilevel Design Optimization, is used to solve two analytical optimization problems and to explore the effect of the belief structure on the final solution. A methodology is presented for reducing the cost of evidence-based optimization through manipulation of the belief structure. In addition, a transport aircraft wing is solved with multilevel optimization without uncertainty; this complex, real-world optimization problem demonstrates the capability of the decomposed multilevel framework to reduce the cost of solving computationally expensive problems with black-box analyses.
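The evidence-theory bookkeeping behind a belief structure can be sketched briefly. In this hedged example (not the thesis's implementation; the interval bounds and masses are invented), a belief structure assigns basic probability assignments (BPAs) to focal intervals, and belief/plausibility bound the probability that an uncertain quantity stays below a limit:

```python
def belief_plausibility(focal_intervals, limit):
    """Bel and Pl that the uncertain quantity is <= limit.

    focal_intervals: list of ((lo, hi), bpa) pairs whose BPAs sum to 1.
    Belief counts focal intervals lying entirely below the limit;
    plausibility counts those that at least partially reach it.
    """
    bel = sum(m for (lo, hi), m in focal_intervals if hi <= limit)
    pl = sum(m for (lo, hi), m in focal_intervals if lo <= limit)
    return bel, pl


# Invented belief structure for a single uncertain design response
structure = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]
bel, pl = belief_plausibility(structure, 2.0)   # bel = 0.8, pl = 1.0
```

The gap between Bel and Pl is the epistemic component: coarsening or refining the focal intervals (the "manipulation of the belief structure" above) widens or narrows it.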

Cross-scale model validation with aleatory and epistemic uncertainty

Blumer, Joel David, 08 June 2015
Nearly every decision must be made with some degree of uncertainty about the outcome. Decision making based on modeling and simulation predictions needs to incorporate and aggregate uncertain evidence. To validate multiscale simulation models, it may be necessary to consider evidence collected at a length scale different from the one at which the model predicts. In addition, traditional methods of uncertainty analysis do not distinguish between two types of uncertainty: uncertainty due to inherently random inputs, and uncertainty due to lack of information about the inputs. This thesis examines and applies a Bayesian approach for model parameter validation that uses generalized interval probability to separate these two types of uncertainty. A generalized interval Bayes' rule (GIBR) is used to combine the evidence and update belief in the validity of parameters. The sensitivity of completeness and soundness for interval range estimation in GIBR is investigated, and several approaches to representing complete ignorance of probability values are tested. The results of the GIBR method are verified using Monte Carlo simulations. The method is first applied to validate the parameter set for a molecular dynamics simulation of radiation-induced defect formation, with evidence supplied by comparison with physical experiments. Because the simulation includes variables whose effects are not directly observable, an expanded form of GIBR is implemented to incorporate measurement uncertainty into the belief update. In a second example, the method is applied to combining evidence from two models of crystal plasticity at different length scales.
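The flavor of an interval-valued Bayesian update can be illustrated with a minimal sketch. This is not the thesis's GIBR formulation (which rests on generalized interval probability); it simply applies Bayes' rule endpoint-wise to an interval prior and interval likelihoods, so the epistemic width survives the update while the update itself handles the aleatory part. All numbers are invented:

```python
def interval_bayes(prior, like_h, like_not_h):
    """Posterior interval for hypothesis H from interval-valued inputs.

    Each argument is a (lo, hi) probability interval. The lower posterior
    bound pairs the pessimistic numerator with the optimistic alternative
    likelihood, and vice versa for the upper bound.
    """
    p_lo, p_hi = prior
    lh_lo, lh_hi = like_h
    ln_lo, ln_hi = like_not_h
    post_lo = (p_lo * lh_lo) / (p_lo * lh_lo + (1 - p_lo) * ln_hi)
    post_hi = (p_hi * lh_hi) / (p_hi * lh_hi + (1 - p_hi) * ln_lo)
    return post_lo, post_hi


# Interval prior belief in parameter validity, updated by uncertain evidence
posterior = interval_bayes((0.4, 0.6), (0.7, 0.9), (0.1, 0.3))
```

A Monte Carlo check, as in the abstract, would sample point values inside each input interval and verify that every resulting point posterior falls inside the computed interval.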

A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

Akram, Muhammad Farooq, 28 March 2012
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject-matter-expert knowledge, statistical methods, and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Given the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation are critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory, and the available information is often insufficient to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation concerns technology interactions in a portfolio: when multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear, yet current methods assume that technologies are either incompatible or linearly independent. When knowledge about the problem is lacking, epistemic uncertainty is the most suitable representation of the process: it reduces the number of assumptions made during elicitation, rather than forcing experts to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques; in the present research, interval analysis and the Dempster-Shafer theory of evidence are proposed as better suited to quantifying epistemic uncertainty in the technology valuation process.
The proposed technique seeks to offset some of the problems encountered when deterministic or traditional probabilistic approaches are used for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicited technology synergy matrices (TSMs). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A large-scale test case, a combined-cycle power generation system, was selected for the quantification of epistemic uncertainty, and a detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results show that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability-theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is given, and various combination methods are proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing higher-order technology interactions and improved the predicted system performance.
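The evidence-theory step behind pooling two experts' opinions can be sketched with Dempster's rule of combination. This is a generic illustration of the rule, not the thesis's elicitation workflow; the frame of outcome labels and the masses are invented:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: pool two BPA dicts keyed by frozensets of outcomes."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb   # mass assigned to contradictory pairs
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    # Renormalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}


low, high = frozenset({"low"}), frozenset({"high"})
either = low | high   # expert is unsure: mass on the whole frame
expert1 = {low: 0.6, either: 0.4}
expert2 = {high: 0.5, either: 0.5}
pooled = dempster_combine(expert1, expert2)
```

Mass left on `either` after combination is exactly the remaining epistemic uncertainty: neither expert's evidence resolves it, so no probability distribution is forced on them.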

Evaluation of Epistemic Uncertainties in Probabilistic Risk Assessments : Philosophical Review of Epistemic Uncertainties in Probabilistic Risk Assessment Models Applied to Nuclear Power Plants - Fukushima Daiichi Accident as a Case Study

Rawandi, Omed A., January 2020
Safety and risk assessment are key priorities for nuclear power plants. Probabilistic risk assessment (PRA) is a method for quantitative evaluation of accident risk, in particular severe nuclear core damage and the associated release of radioactive materials into the environment. The reliability and certainty of PRA have at times been questioned, especially when real-world observations have indicated that the frequency of nuclear accidents is higher than the probabilities predicted by PRA. This thesis provides a philosophical review of the epistemic uncertainties in PRA, using the Fukushima Daiichi accident of March 2011 as a case study. The thesis gives an overview of the PRA model structure, its key elements, and possible sources of uncertainty, in an attempt to understand the deviation between the real frequency of nuclear core-melt accidents and the probabilities predicted by PRA. The analyses in this thesis address several sources of epistemic uncertainty in PRA. Analysis of the PRA approach reveals the difficulty of covering all possible initiating events, all component and system failures, and their possible combinations in the risk evaluations. This difficulty is the source of a characteristic epistemic uncertainty, referred to as completeness uncertainty. The case study illustrates this difficulty: the PRA failed to identify a combined earthquake and tsunami, with the resultant flooding, power failure, and total blackout, as an initiating causal event in its logic structure. The analyses further demonstrate how insufficient experience and knowledge, as well as a lack of empirical data, lead to incorrect assumptions, which the model uses as input parameters to estimate the probabilities of accidents.
With limited availability of input data, decision-makers rely upon the subjective judgements and individual experiences of experts, which adds a further source of epistemic uncertainty to the PRA, usually referred to as input parameter uncertainty. As a typical example, the Fukushima Daiichi accident revealed that the PRA had underestimated the height of a possible tsunami; consequently, the risk mitigation systems (e.g. the barrier seawalls) built to protect the power plant were inadequate because of incorrect input data. Poor assumptions may also result in improper modeling of failure modes and sequences in the PRA logic structure, which introduces an additional source of epistemic uncertainty referred to as model uncertainty. For instance, the Fukushima Daiichi accident revealed insufficient backup of the power supply, because the possibility of simultaneous failure of several emergency diesel generators had been assumed to be negligibly small. That, however, is exactly what happened when 12 of the 13 generators failed at the same time as a result of flooding. Furthermore, the analyses highlight the difficulty of modeling human interventions and actions, particularly during unexpected accidents, taking into account the physiological and psychological effects on human cognitive performance that make operator interventions uncertain. This represents an additional source of epistemic uncertainty, usually referred to as uncertainty in modeling human interventions. As a result, the probability of human error may increase, characterized by delays in making a diagnosis, formulating a response, and taking action; this in itself confirms the complexity of modelling human errors.
In the case of the Fukushima Daiichi accident, the lack of sufficient instructions for dealing with this "unexpected" accident made coordination of the operators' interventions almost impossible. Given the existence of all these sources of epistemic uncertainty, the detected deviation between the real frequency of nuclear core-melt accidents and the probabilities predicted by PRA is to be expected. It is, however, important to note that the occurrence of the Fukushima Daiichi accident could lie within the uncertainty distribution that the PRA model predicted prior to the accident. Hence, from a probabilistic point of view, the occurrence of a single unexpected accident should be interpreted with care, especially in political and commercial debates. Despite the limitations highlighted in this thesis, the model can still provide valuable insights for the systematic examination of safety systems, risk mitigation approaches, and strategic plans aimed at protecting nuclear power plants against failures. Nevertheless, the PRA model has development potential that deserves serious attention. The validity of calculated frequencies in PRA is restricted to the parameter under study; it can be improved by adding further relevant scenarios to the PRA, improving the screening approaches, and collecting more input data through better collaboration between nuclear power plants worldwide. Lessons learned from the Fukushima Daiichi accident have initiated further studies aimed at covering additional scenarios, and subsequent IAEA safety report series have considered external hazards in multi-unit nuclear power plants. This shows that PRA is a dynamic approach that needs continuous improvement toward better reliability.
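How input parameter uncertainty propagates through a PRA logic structure can be sketched with a toy fault tree. This is an invented miniature, not the Fukushima PRA: station blackout requires loss of offsite power AND failure of both emergency diesel generators. Carrying interval-valued failure probabilities makes the epistemic spread visible in the top event, while the independence assumption flagged in the comments is precisely the kind of model uncertainty a common-cause flood violates:

```python
def and_gate(*events):
    """Intersection of independent events; (lo, hi) intervals propagate endpoint-wise."""
    lo, hi = 1.0, 1.0
    for e_lo, e_hi in events:
        lo *= e_lo
        hi *= e_hi
    return lo, hi


def or_gate(*events):
    """Union of independent events via 1 - prod(1 - p), endpoint-wise."""
    none_lo, none_hi = 1.0, 1.0
    for e_lo, e_hi in events:
        none_lo *= 1.0 - e_lo
        none_hi *= 1.0 - e_hi
    return 1.0 - none_lo, 1.0 - none_hi


# Invented interval probabilities per demand
loss_offsite = (1e-2, 5e-2)
egdg_fails = (1e-3, 1e-2)   # one emergency diesel generator, independence ASSUMED
# A flood is a common-cause event: it fails both generators at once,
# so the AND of "independent" failures below understates the true risk.
blackout = and_gate(loss_offsite, egdg_fails, egdg_fails)
core_damage = or_gate(blackout, (1e-7, 1e-6))   # OR with another toy sequence
```

The two-orders-of-magnitude spread between the bounds of `blackout` is the input parameter uncertainty; the gap between this model and a common-cause treatment is the model uncertainty.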

Analysis of Transient Overpower Scenarios in Sodium Fast Reactors

Grabaskas, David, 20 August 2010
No description available.

Thick Concepts in Practice : Normative Aspects of Risk and Safety

Möller, Niklas, January 2009
The thesis aims at analyzing the concepts of risk and safety as well as the class of concepts to which they belong, thick concepts, focusing in particular on the normative aspects involved. Essay I analyzes thick concepts, i.e. concepts such as cruelty and kindness that seem to combine descriptive and evaluative features. The traditional account, in which thick concepts are analyzed as the conjunction of a factual description and an evaluation, is criticized. Instead, it is argued that the descriptive and evaluative aspects must be understood as a whole. Furthermore, it is argued that the two main worries evoked against non-naturalism – that non-naturalism cannot account for disagreement and that it is not genuinely explanatory – can be met. Essay II investigates the utilization of the Kripke/Putnam causal theory of reference in relation to the Open Question Argument. It is argued that the Open Question Argument suitably interpreted provides prima facie evidence against the claim that moral kinds are natural kinds, and that the causal theory, as interpreted by leading naturalist defenders, actually underscores this conclusion. Essay III utilizes the interpretation of the Open Question Argument argued for in the previous essay in order to argue against naturalistic reduction of risk, i.e. reduction of risk into natural concepts such as probability and harm. Three different normative aspects of risk and safety are put forward – epistemic uncertainty, distributive normativity and border normativity – and it is argued that these normative aspects cannot be reduced to a natural measure. Essay IV provides a conceptual analysis of safety in the context of societal decision-making, and argues for a notion that explicitly includes epistemic uncertainty, the degree to which we are uncertain of our knowledge of the situation at hand. Some formal versions of a comparative safety concept are also proposed. 
Essay V puts forward a normative critique against a common argument, viz. the claim that the public should follow the experts’ advice in recommending an activity whenever the experts have the best knowledge of the risk involved. The importance of safety in risk acceptance together with considerations from epistemic uncertainty makes the claim incorrect even after including plausible limitations to exclude ‘external’ considerations. Furthermore, it is shown that the scope of the objection covers risk assessment as well as risk management. Essay VI provides a systematized account of safety engineering practices that clarifies their relation to the goal of safety engineering, namely to increase safety. A list of 24 principles referred to in the literature of safety engineering is provided, divided into four major categories. It is argued that important aspects of these methods can be better understood with the help of the distinction between risk and uncertainty, in addition to the common distinction between risk and probability.

Selection for Rapid Manufacturing under Epistemic Uncertainty

Wilson, Jamal Omari, 17 April 2006
Rapid Prototyping (RP) is the process of building three-dimensional objects, in layers, using additive manufacturing. Rapid Manufacturing (RM) is the use of RP technologies to manufacture end-use, or finished, products. At small lot sizes, such as with customized products, traditional manufacturing technologies become infeasible due to the high costs of tooling and setup. RM offers the opportunity to produce these customized products economically. Coupled with the customization opportunities afforded by RM is a certain degree of uncertainty, mainly attributable to the lack of information about the customer's specific requirements and preferences at the time of production. In this thesis, the author presents an overall method for selecting an RM technology, as an investment decision, under the geometric uncertainty inherent to mass customization. Specifically, the author characterizes the type of uncertainty inherent to RM (epistemic), proposes a method to account for this uncertainty in a selection process (interval analysis), and proposes a method to select a technology under uncertainty (decision theory under strict uncertainty). The method is illustrated with examples on the selection of an RM technology to produce custom caster wheels and custom hearing aid shells. In addition to the selection methodology, the author develops universal build time and part cost models for the RM technologies; these models are universal in the sense that they depend explicitly on the parameters that characterize each technology and the overall part characteristics.
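Decision theory under strict uncertainty, the setting named above, can be sketched with two classical rules applied to interval-valued outcomes. The technology names and utility intervals below are purely illustrative, not the thesis's data:

```python
def maximin(alternatives):
    """Pick the alternative whose worst-case (lower-bound) utility is largest."""
    return max(alternatives, key=lambda name: alternatives[name][0])


def hurwicz(alternatives, alpha):
    """Weigh best and worst cases: alpha=1 is pure optimism, alpha=0 pure pessimism."""
    def score(name):
        lo, hi = alternatives[name]
        return alpha * hi + (1 - alpha) * lo
    return max(alternatives, key=score)


# Invented utility intervals for candidate RM technologies
techs = {"SLA": (0.2, 0.9), "SLS": (0.4, 0.7), "FDM": (0.3, 0.6)}
maximin(techs)        # "SLS": best worst case
hurwicz(techs, 0.9)   # "SLA": optimism rewards the wide upside
```

Because no probabilities are assigned inside the intervals, the choice of rule (and of alpha) is itself a stance toward the epistemic uncertainty rather than a computation over it.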

A multi-fidelity analysis selection method using a constrained discrete optimization formulation

Stults, Ian Collier, 17 August 2009
The purpose of this research is to develop a method for selecting the fidelity of the contributing analyses in computer simulations. Model uncertainty is a significant component of result validity, yet it is neglected in most conceptual design studies; when it is considered, it is treated only in a limited fashion, which calls into question the validity of selections based on these results. Neglecting model uncertainty can lead to costly redesigns of concepts later in the design process, or even program cancellation. If, instead, one were not only to recognize the model uncertainty in the tools being used but also to use this information in selecting the tools for each contributing analysis, studies could be conducted more efficiently and trust in the results could be quantified. Existing methods for doing this are generally neither rigorous nor traceable, and in many cases the improvement and additional time spent on enhanced calculations are washed out by less accurate calculations performed downstream. The intent of this research is to resolve this issue by providing a method that minimizes the time spent conducting computer simulations while meeting accuracy and concept resolution requirements.
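The title's constrained discrete optimization formulation can be sketched in miniature. In this hypothetical example (analysis names, costs, and error figures are all invented), each contributing analysis offers fidelity levels with a (cost, error) pair, and we pick one level per analysis to minimize total cost while keeping the summed model error within a budget:

```python
from itertools import product


def select_fidelities(analyses, error_budget):
    """analyses: {name: [(cost, error), ...]}; returns (cost, choice) or None.

    Brute-force enumeration of every fidelity combination; choice maps
    each analysis name to the index of its selected fidelity level.
    """
    names = list(analyses)
    best = None
    for choice in product(*(range(len(analyses[n])) for n in names)):
        cost = sum(analyses[n][i][0] for n, i in zip(names, choice))
        error = sum(analyses[n][i][1] for n, i in zip(names, choice))
        if error <= error_budget and (best is None or cost < best[0]):
            best = (cost, dict(zip(names, choice)))
    return best


analyses = {
    "aero":       [(1, 0.30), (10, 0.10), (100, 0.02)],
    "structures": [(1, 0.20), (5, 0.05)],
}
select_fidelities(analyses, error_budget=0.2)  # → (15, {'aero': 1, 'structures': 1})
```

Exhaustive search is only viable for a handful of analyses; the downstream-washout effect described above shows up here too, since spending 100 units on the highest-fidelity aerodynamics buys nothing if the structural error alone exceeds the budget.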

Integration of Site Effects into Probabilistic Seismic Hazard Assessment

Aristizabal, Claudia, 19 March 2018
The overall goal of this research is to provide recommendations on how to integrate site effects into Probabilistic Seismic Hazard Assessment (PSHA), a well-known methodology used worldwide to estimate seismic hazard and risk at regional and local scales. We review the methods available in the literature for obtaining the seismic hazard curve at the surface of a soft-soil site, starting with the simplest and most generic (partially probabilistic) methods and ending with fully site-specific (partially and fully probabilistic) methods, which require a level of site characterization that is rarely available outside exceptional cases such as the Euroseistest site. It is precisely at the Euroseistest that a number of these methods, together with a new one proposed in this work, are applied and compared. The specificity and difficulty of these site-specific PSHA studies come from the non-linear nature of the response of soft sites, and from the fact that the reference rock controlling this response is often very stiff. Particular attention is therefore paid to the "rock to hard rock adjustment" and to the "convolution" of the rock seismic hazard with the amplification function or transfer function (empirical or numerical) of the site.

A general framework is presented for simultaneously taking into account site-specific characteristics, complete or reduced aleatory variability ("single station sigma"), host-to-target adjustments, and the linear/non-linear behavior of a site; all steps, corrections, benefits, and difficulties encountered in the process are explained, together with the different ways of implementing them. The comparative study is divided into two parts: the first deals with non-site-specific methods and site-specific hybrid methods (probabilistic evaluation of the rock hazard combined with a deterministic site response); the second deals with two approaches that treat the convolution of rock hazard and site response probabilistically. A major result of the first part is the increase in epistemic uncertainty on the soft-site hazard compared with the rock hazard, due to the accumulation of uncertainties associated with each step. Another major result, common to both parts, is the very large impact of soil non-linearity at soft sites and of the way it is taken into account: the variability associated with the use of different non-linear simulation codes appears greater than the method-to-method variability between the two fully probabilistic convolution approaches. We emphasize the importance of improving the way site effects are included in probabilistic seismic hazard methods, and of instrumenting active soft-sediment sites, such as the Euroseistest, to test and validate numerical models. Finally, a summary of the results, general conclusions, a discussion of key methodological issues, and perspectives for improvement and future work are presented.

Keywords: Site Effects, Epistemic Uncertainty, PSHA, single station sigma, host-to-target adjustments, linear and non-linear site effects, soil site response
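The fully probabilistic "convolution" step can be sketched numerically. In this hedged illustration (all levels, rates, and amplification parameters are invented, and a real treatment would let the amplification factor depend on the rock motion level to capture soil non-linearity), the surface hazard is obtained by integrating a discretized rock hazard curve over a lognormal amplification-factor model:

```python
import math


def norm_sf(x):
    """Standard normal survival function via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))


def surface_hazard(rock_levels, rock_rates, af_median, af_sigma_ln, a_surf):
    """Annual rate of exceeding acceleration a_surf at the surface.

    rock_levels/rock_rates: discretized rock hazard curve (exceedance rates).
    The curve is differenced into occurrence rates per acceleration bin,
    then each bin is weighted by P(AF > a_surf / sa) under a lognormal AF.
    """
    rate = 0.0
    for i, sa in enumerate(rock_levels):
        nxt = rock_rates[i + 1] if i + 1 < len(rock_rates) else 0.0
        occ = rock_rates[i] - nxt          # rate of rock motion ~ sa
        z = (math.log(a_surf / sa) - math.log(af_median)) / af_sigma_ln
        rate += occ * norm_sf(z)
    return rate


rock_levels = [0.1, 0.2, 0.4]        # acceleration bins (g), invented
rock_rates = [1e-2, 1e-3, 1e-4]      # annual exceedance rates, invented
h = surface_hazard(rock_levels, rock_rates, 2.0, 0.3, 0.4)
```

Swapping the lognormal AF model for outputs of different non-linear simulation codes is exactly where the code-to-code variability discussed above enters the surface hazard curve.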

Estimation of the probability and uncertainty of undesirable events in large-scale systems

Hou, Yunhui, 31 March 2016
Our research objective is to build frameworks representing both aleatory and epistemic uncertainties, based on probabilistic approaches and uncertainty theories, to compare these methods, and to identify appropriate applications for them in large-scale systems with rare events. In this thesis, an asymptotic normality method with Monte Carlo simulation is proposed for binary systems, as well as a semi-Markov model for dynamic multistate systems. We also apply random set theory as a basic model to evaluate system reliability and other performance indices on binary and multistate systems, using a bootstrap technique.
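The asymptotic-normality idea for a binary system can be sketched minimally. The "system" below is a toy (a standard-normal load exceeding an invented capacity), not one of the thesis's case studies: the Monte Carlo estimate of a small failure probability is accompanied by a normal confidence interval whose half-width shrinks as 1/sqrt(n):

```python
import math
import random


def mc_failure_probability(simulate_failure, n, z=1.96, seed=0):
    """Estimate p_fail and a normal-approximation confidence interval.

    simulate_failure: callable taking an RNG, returning True on failure.
    The CI relies on asymptotic normality of the binomial proportion,
    which degrades when n * p_hat is small -- the rare-event difficulty.
    """
    rng = random.Random(seed)
    failures = sum(simulate_failure(rng) for _ in range(n))
    p_hat = failures / n
    half = z * math.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat, (max(0.0, p_hat - half), min(1.0, p_hat + half))


# Toy binary system: fails when a standard-normal load exceeds capacity 2.5
p_hat, ci = mc_failure_probability(
    lambda rng: rng.gauss(0.0, 1.0) > 2.5, n=200_000)
```

For genuinely rare events the required n explodes, which motivates the thesis's move to random set and bootstrap representations rather than brute-force sampling alone.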
