11

The Reliability Assessment and Optimization of Arbitrary-State Monotone Systems under Epistemic Uncertainty

Sun, Muxia 03 July 2019 (has links)
In this work, we study the reliability assessment, modeling, and optimization of arbitrary-state systems under epistemic uncertainty. First, a universal arbitrary-state modeling approach is proposed in order to study effectively modern industrial systems with increasingly complicated structures, operation mechanisms, and reliability demands. Simple implementations of traditional binary, continuous, or multi-state reliability models have shown their deficiencies, lacking the generality needed to model such complex modern industrial structures, systems, networks, and systems-of-systems. In this work, we are also particularly interested in monotone systems, not only because monotonicity appears in most standard reliability models, but also because such a simple mathematical property allows a huge simplification of many extremely complex problems. Then, for arbitrary-state monotone reliability systems, we address the following challenges, which arise at the very foundations of the mathematical modeling: 1. reliability assessment in an epistemically uncertain environment with hierarchical structures; 2. reliability/maintenance optimization for large reliability systems under epistemic uncertainty.
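The monotonicity (coherence) property the abstract relies on can be illustrated in its binary special case; the series-parallel example and function names below are illustrative sketches, not taken from the thesis:

```python
# Illustrative sketch: a binary structure function for a series-parallel
# system, and a brute-force check of monotonicity (coherence): improving
# any component's state never degrades the system state.
from itertools import product

def structure(x):
    """System works iff component 0 works AND (component 1 OR 2 works)."""
    return x[0] * max(x[1], x[2])

def is_monotone(phi, n):
    """Check phi(x) <= phi(y) whenever x <= y componentwise."""
    states = list(product([0, 1], repeat=n))
    for x in states:
        for y in states:
            if all(a <= b for a, b in zip(x, y)) and phi(x) > phi(y):
                return False
    return True

print(is_monotone(structure, 3))  # True: a coherent (monotone) system
```

It is exactly this ordering property that lets many otherwise intractable reliability computations on such systems be bounded or simplified.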
12

A Statistical Framework for Distinguishing Between Aleatory and Epistemic Uncertainties in the Best-Estimate Plus Uncertainty (BEPU) Nuclear Safety Analyses

Pun-Quach, Dan 11 1900 (has links)
In 1988, the US Nuclear Regulatory Commission approved an amendment that allowed the use of best-estimate methods. This led to increased development and application of Best Estimate Plus Uncertainty (BEPU) safety analyses. However, a greater burden was placed on the licensee to justify all uncertainty estimates. A review of the current state of BEPU methods indicates a number of significant criticisms, which limit the BEPU methods from reaching their full potential as a comprehensive licensing basis. The most significant criticism relates to the lack of a formal framework for distinguishing between aleatory and epistemic uncertainties. This has led to a prevalent belief that such separation of uncertainties is done for convenience rather than out of necessity. In this thesis, we address the above concerns by developing a statistically rigorous framework to characterize the different uncertainty types. This framework is grounded in the philosophical concepts of knowledge. Considering the Plato problem, we explore the use of probability as a means to gain knowledge, which allows us to relate the inherent distinctness in knowledge to the different uncertainty types for any complex physical system. The framework is demonstrated using nuclear analysis problems, and we show through the use of structural models that the separation of these uncertainties leads to more accurate tolerance limits relative to existing BEPU methods. In existing BEPU methods, where such a distinction is not applied, the total uncertainty is essentially treated as aleatory uncertainty. Thus, the resulting estimated percentile is much larger than the actual (true) percentile of the system's response. Our results support the premise that the separation of these two distinct uncertainty types is necessary and leads to more accurate estimates of the reactor safety margins. / Thesis / Doctor of Philosophy (PhD)
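For background on the tolerance limits mentioned above: BEPU analyses commonly derive them from Wilks-style order statistics. The following is a standard textbook result, not the thesis's proposed framework, sketched under that assumption:

```python
# Hedged sketch (standard BEPU practice, not the thesis's framework): the
# first-order, one-sided Wilks formula gives the smallest number of random
# code runs n such that the largest observed output bounds the
# gamma-quantile of the response with confidence beta: 1 - gamma**n >= beta.
import math

def wilks_sample_size(gamma=0.95, beta=0.95):
    """Smallest n with 1 - gamma**n >= beta (one-sided, first order)."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

print(wilks_sample_size())  # the classic 95/95 answer: 59
```

Treating all uncertainty as aleatory inflates the quantile being bounded, which is precisely the conservatism the thesis's separation of uncertainty types aims to reduce.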
13

Uncertainty-aware deep learning for prediction of remaining useful life of mechanical systems

Cornelius, Samuel J 10 December 2021 (has links)
Remaining useful life (RUL) prediction is a problem that researchers in the prognostics and health management (PHM) community have been studying for decades. Both physics-based and data-driven methods have been investigated, and in recent years, deep learning has gained significant attention. When sufficiently large and diverse datasets are available, deep neural networks can achieve state-of-the-art performance in RUL prediction for a variety of systems. However, for end users to trust the results of these models, especially as they are integrated into safety-critical systems, RUL prediction uncertainty must be captured. This work explores an approach for estimating both the epistemic and heteroscedastic aleatoric uncertainties that emerge in RUL-prediction deep neural networks and demonstrates that quantifying the overall impact of these uncertainties on predictions reveals valuable insight into model performance. Additionally, a study is carried out to observe the effects of RUL truth data augmentation on perceived uncertainties in the model.
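One common way to capture heteroscedastic aleatoric uncertainty in an RUL network (a generic formulation, not necessarily the thesis's exact loss) is to have the model output both an estimate mu and a log-variance, and minimize the Gaussian negative log-likelihood instead of plain MSE:

```python
# Illustrative sketch (function names are mine): the Gaussian NLL for a
# network head predicting mean mu and log-variance log_var; a confidently
# wrong prediction is penalized more than an uncertain one.
import math

def gaussian_nll(y, mu, log_var):
    """Per-sample NLL: 0.5 * (log_var + (y - mu)^2 / exp(log_var)) + const."""
    return 0.5 * (log_var + (y - mu) ** 2 / math.exp(log_var))

# Same 20-cycle RUL error, low vs high predicted variance:
print(gaussian_nll(100.0, 80.0, 0.0) > gaussian_nll(100.0, 80.0, 5.0))  # True
```

The predicted variance then serves as the per-input aleatoric uncertainty, while the epistemic part is estimated separately (e.g., from the spread of stochastic forward passes).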
14

A Probabilistic Decision Support System for a Performance-Based Design of Infrastructures

Shahtaheri, Yasaman 20 August 2018 (has links)
Infrastructures are the most fundamental facilities and systems serving the society. Due to the existence of infrastructures in economic, social, and environmental contexts, all lifecycle phases of such fundamental facilities should maximize utility for the designers, occupants, and the society. With respect to the nature of the decision problem, two main types of uncertainties may exist: 1) the aleatory uncertainty associated with the nature of the built environment (i.e., the economic, social, and environmental impacts of infrastructures must be described as probabilistic); and 2) the epistemic uncertainty associated with the lack of knowledge of decision maker utilities. Although a number of decision analysis models exist that consider the uncertainty associated with the nature of the built environment, they do not provide a systematic framework for including aleatory and epistemic uncertainties, and decision maker utilities in the decision analysis process. In order to address the identified knowledge gap, a three-phase modular decision analysis methodology is proposed. Module one uses a formal preference assessment methodology (i.e., utility function/indifference curve) for assessing decision maker utility functions with respect to a range of alternative design configurations. Module two utilizes the First Order Reliability Method (FORM) in a systems reliability approach for assessing the reliability of alternative infrastructure design configurations with respect to the probabilistic decision criteria and decision maker defined utility functions (indifference curves), and provides a meaningful feedback loop for improving the reliability of the alternative design configurations. Module three provides a systematic framework to incorporate both aleatory and epistemic uncertainties in the decision analysis methodology (i.e., uncertain utility functions and group decision making). 
The multi-criteria, probabilistic decision analysis framework is tested on a nine-story office building in a seismic zone with the probabilistic decision criteria of: building damage and business interruption costs, casualty costs, and CO2 emission costs. Twelve alternative design configurations and four decision maker utility functions under aleatory and epistemic uncertainties are utilized. The results of the decision analysis methodology revealed that the high-performing design configurations with an initial cost of up to $3.2M (in a cost range between $1.7M and $3.2M), a building damage and business interruption cost as low as $303K (in a cost range between $303K and $6.2M), a casualty cost as low as $43K (in a cost range between $43K and $1.2M), and a CO2 emission cost as low as $146K (in a cost range between $133K and $150K) can be identified by having a higher probability (i.e., up to 80%) of meeting the decision makers' preferences. The modular, holistic decision analysis framework allows decision makers to make more informed performance-based design decisions—and allows designers to better incorporate the preferences of the decision makers—during the early design process. / PHD / Infrastructures, including buildings, roads, and bridges, are the most fundamental facilities and systems serving the society. Because infrastructures exist in economic, social, and environmental contexts, the design, construction, operations, and maintenance phases of such fundamental facilities should maximize value and usability for the designers, occupants, and the society.
Identifying infrastructure configurations that maximize value and usability is challenged by two sources of uncertainty: 1) the nature of the built environment is variable (i.e., whether or not a natural hazard will occur during the infrastructure lifetime, or how costs might change over time); and 2) there is lack of knowledge of decision maker preferences and values (e.g., design cost versus social impact tradeoffs). Although a number of decision analysis models exist that consider the uncertainty associated with the nature of the built environment (e.g., natural hazard events), they do not provide a systematic framework for including the uncertainties associated with the decision analysis process (e.g., lack of knowledge about decision maker preferences), and decision maker requirements in the decision analysis process. In order to address the identified knowledge gap, a three-phase modular decision analysis methodology is proposed. Module one uses a formal preference assessment methodology for assessing decision maker values with respect to a range of alternative design configurations. Module two utilizes an algorithm for assessing the reliability of alternative infrastructure design configurations with respect to the probabilistic decision criteria and decision maker requirements, and provides a meaningful feedback loop for understanding the decision analysis results (i.e., improving the value and usability of the alternative design configurations). Module three provides a systematic framework to incorporate both the random uncertainty associated with the built environment and the knowledge uncertainty associated with lack of knowledge of decision maker preferences, and tests the reliability of the decision analysis results under random and knowledge uncertainties (i.e., uncertain decision maker preferences and group decision making). 
The holistic decision analysis framework is tested on a nine-story office building in a seismic zone with the probabilistic decision criteria of: building damage and business interruption costs, casualty costs, and CO2 emission costs. Twelve alternative design configurations, four decision makers, and random and knowledge sources of uncertainty are considered in the decision analysis methodology. Results indicate that the modular, holistic decision analysis framework allows decision makers to make more informed design decisions—and allows designers to better incorporate the preferences of the decision makers—during the early design process.
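The First Order Reliability Method (FORM) used in module two reduces, in its simplest case, to a closed-form reliability index. The example below is my own minimal illustration of that idea, not the thesis's structural model:

```python
# Sketch of the idea behind FORM for the simplest case: a linear limit
# state g = R - S with independent normal capacity R and demand S has
# reliability index beta = (muR - muS) / sqrt(sdR^2 + sdS^2), and the
# failure probability is Phi(-beta).
import math

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def reliability_index(mu_r, sd_r, mu_s, sd_s):
    """Cornell/Hasofer-Lind beta for the linear limit state R - S."""
    return (mu_r - mu_s) / math.hypot(sd_r, sd_s)

beta = reliability_index(mu_r=10.0, sd_r=1.0, mu_s=6.0, sd_s=1.0)
print(round(beta, 3), std_normal_cdf(-beta))
```

In the framework above, the "demand" side would be a probabilistic decision criterion (e.g., damage cost) and the "capacity" side a decision-maker-defined threshold from the utility assessment.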
15

Uncertainty management in parameter identification

Sui, Liqi 23 January 2017 (has links)
In order to obtain more predictive and accurate simulations of mechanical behaviour in the practical environment, more and more complex material models have been developed. Nowadays, the characterization of material properties remains a top-priority objective. It requires dedicated identification methods and tests in conditions as close as possible to the real ones. This thesis aims at developing an effective identification methodology to find the material property parameters, taking advantage of all available information. The information used for the identification is theoretical, experimental, and empirical: the theoretical information is linked to the mechanical models, whose uncertainty is epistemic; the experimental information consists of the full-field measurements obtained during the test, whose uncertainty is aleatory; the empirical information is related to the prior information, with epistemic uncertainty as well. The main difficulty is that the available information is not always reliable and its corresponding uncertainty is heterogeneous. This difficulty is overcome by the introduction of the theory of belief functions. By offering a general framework to represent and quantify the heterogeneous uncertainties, the performance of the identification is improved. A strategy based on belief functions is proposed to identify macro- and micro-scale elastic properties of multi-structure materials. In this strategy, model and measurement uncertainties are analysed and quantified. The strategy is subsequently extended to take prior information into consideration and quantify its corresponding uncertainty.
16

Uncertainty Estimation in Radiation Dose Prediction U-Net

Skarf, Frida January 2023 (has links)
The ability to quantify uncertainties associated with neural network predictions is crucial when they are relied upon in decision-making processes, especially in safety-critical applications like radiation therapy. In this paper, a single-model estimator of both epistemic and aleatoric uncertainties in a regression 3D U-Net used for radiation dose prediction is presented. To capture epistemic uncertainty, Monte Carlo Dropout is employed, leveraging dropout during test-time inference to obtain a distribution of predictions. The variability among these predictions is used to estimate the model's epistemic uncertainty. For quantifying aleatoric uncertainty, quantile regression, which models conditional quantiles of the output distribution, is used. The method enables the estimation of prediction intervals at a user-specified significance level, where the difference between the upper and lower bounds of the interval quantifies the aleatoric uncertainty. The proposed approach is evaluated on two datasets of prostate and breast cancer patient geometries and corresponding radiation doses. Results demonstrate that the quantile regression method provides well-calibrated prediction intervals, allowing for reliable aleatoric uncertainty estimation. Furthermore, the epistemic uncertainty obtained through Monte Carlo Dropout proves effective in identifying out-of-distribution examples, highlighting its usefulness for detecting anomalous cases where the model makes uncertain predictions.
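The two ingredients named above — the pinball loss behind quantile regression and Monte Carlo Dropout for epistemic spread — can be sketched minimally. The toy stochastic "network" and the function names here are mine, not the thesis's implementation:

```python
# Minimal sketch of both techniques on toy inputs.
import random
import statistics

def pinball_loss(y, y_hat, tau):
    """Asymmetric loss whose minimizer is the tau-quantile of y given x."""
    diff = y - y_hat
    return tau * diff if diff >= 0 else (tau - 1.0) * diff

def mc_dropout_predict(forward_pass, n_samples=100):
    """Epistemic uncertainty = spread of predictions with dropout kept on."""
    preds = [forward_pass() for _ in range(n_samples)]
    return statistics.mean(preds), statistics.stdev(preds)

# Toy stochastic "network": dropout randomly knocks out one contribution.
random.seed(0)
noisy_net = lambda: 5.0 + (0.0 if random.random() < 0.5 else 2.0)
mu, epistemic = mc_dropout_predict(noisy_net)
print(round(mu, 2), round(epistemic, 2))
```

Training one head on tau = 0.05 and another on tau = 0.95 with `pinball_loss` yields a 90% prediction interval whose width is the aleatoric estimate, while `mc_dropout_predict` supplies the epistemic term.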
17

Advances in uncertainty modelling: from epistemic uncertainty estimation to generalized generative flow networks

Lahlou, Salem 08 1900 (has links)
Decision-making problems often occur under uncertainty, encompassing both aleatoric uncertainty arising from inherent randomness in processes and epistemic uncertainty due to limited knowledge. This thesis explores the concept of uncertainty, a crucial aspect of machine learning and a key factor for rational agents to determine where to allocate their resources for achieving the best possible results. Traditionally, uncertainty is encoded in a posterior distribution, obtained by approximate Bayesian inference techniques. This thesis's first set of contributions revolves around the mathematical properties of generative flow networks, which are probabilistic models over discrete sequences and amortized samplers of unnormalized probability distributions. Generative flow networks find applications in Bayesian inference and can be used for uncertainty estimation. Additionally, they are helpful for search problems in large compositional spaces. Beyond deepening the mathematical framework underlying them, a comparative study with hierarchical variational methods is provided, shedding light on the significant advantages of generative flow networks, both from a theoretical point of view and via diverse experiments. These contributions include a theory extending generative flow networks to continuous or more general spaces, which allows modelling the Bayesian posterior and uncertainty in many interesting settings. The theory is experimentally validated in various domains. This thesis's second line of work concerns alternative measures of epistemic uncertainty beyond posterior modelling.
The presented method, called Direct Epistemic Uncertainty Estimation (DEUP), overcomes a major shortcoming of approximate Bayesian inference techniques caused by model misspecification. DEUP relies on maintaining a secondary predictor of the errors of the main predictor, from which measures of epistemic uncertainty can be deduced.
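The DEUP recipe described above — a secondary predictor fit to the main predictor's observed errors — can be sketched with deliberately trivial models. The models and names below are mine, not DEUP's actual implementation, and the aleatoric correction DEUP subtracts is omitted:

```python
# Rough sketch of the DEUP idea with toy "models".
def fit_mean(pairs):
    """Toy 'model': predict the training-set mean of y everywhere."""
    mean = sum(y for _, y in pairs) / len(pairs)
    return lambda x: mean

train = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
main = fit_mean(train)

# Secondary predictor trained on the squared errors of the main predictor;
# its output on a new input is the (total) uncertainty signal.
errors = [(x, (main(x) - y) ** 2) for x, y in train]
error_model = fit_mean(errors)

print(main(1.0), error_model(3.0))
```

Because the error model is trained on actual mistakes rather than on a posterior, it remains informative even when the main model's family is misspecified, which is the shortcoming of approximate Bayesian inference the abstract points to.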
18

A communicational approach to uncertainty in the initial phase of innovative projects

Camin, Jean-Michel 03 December 2014 (has links)
While the main activities of a project manager are carried out through the communication process, many projects are subject to delays, overruns, or specification defects. Is this an excess of measures to prevent risk, or deficient management of communication that leaves too much room for uncertainty? Uncertainty Reduction Theory, developed by Berger and Calabrese (1975) in the field of communication, does not fully capture how a project manager dissipates the uncertainty existing between actors. By revisiting an operational project within the framework of action research, we seek to identify how uncertainty and communication influence and structure each other. We draw on the constructivist approach and the actor-network theory of Callon and Latour to access the meaning of this circular relationship. We advance the following hypotheses: - Uncertainty is an actant (a nonhuman actor) involved in the construction of the network (in Bruno Latour's (2007) sense, "that which is traced by translations"). - The communication process differs according to the nature of the uncertainty encountered or felt. - The communication process performs and seals relationships by making them so expensive to undo and so economical to maintain that they become irreversible. The communicational approach to uncertainty highlights several features of this actant, such as its ability to populate a "network of gaps", the way the network organizes itself into a hierarchy to make sense, and the continuous estimation process to which it is subject (the Incertus). If we conceive of uncertainty as the attribute of a phenomenon, then "communication-uncertainty" produces meaning at the same time as it determines the value of this attribute. By positioning uncertainty as a socially constructed phenomenon, we present a constructivist model of "communication-uncertainty" in which the observer is an intentional actor limited by constraints (Boudon, 2009). We propose to distinguish the nature of uncertainty according to a typology: variability uncertainty (inherent in the variability of things), epistemic uncertainty, ambiguous or not (due to the imperfection of our knowledge), and scale uncertainty (related to the imperfection of our models of representation). In this movement towards irreversibility, the communication processes participate in replacing mediators (which transform, give new meaning, and make people do unexpected things) with intermediaries (which transmit and transfer without modifying), and the most reversible actants are pushed to the periphery of the network.
