1 |
Methods for Rigorous Uncertainty Quantification with Application to a Mars Atmosphere Model / Balch, Michael Scott, 08 January 2011 (has links)
The purpose of this dissertation is to develop and demonstrate methods appropriate for the quantification and propagation of uncertainty in large, high-consequence engineering projects. The term "rigorous uncertainty quantification" refers to methods equal to the proposed task. The motivating practical example is uncertainty in a Mars atmosphere model due to the incompletely characterized presence of dust.
The contributions made in this dissertation, though primarily mathematical and philosophical, are driven by the immediate needs of engineers applying uncertainty quantification in the field. Arguments are provided to explain how the practical needs of engineering projects like Mars lander missions motivate the use of the objective probability bounds approach, as opposed to the subjectivist theories which dominate uncertainty quantification in many research communities. An expanded formalism for Dempster-Shafer structures is introduced, allowing for the representation of continuous random variables and fuzzy variables as Dempster-Shafer structures. Then, the correctness and incorrectness of probability bounds analysis and the Cartesian product propagation method for Dempster-Shafer structures under certain dependency conditions are proven. It is also conclusively demonstrated that there exist some probability bounds problems in which the best-possible bounds on probability cannot be represented using Dempster-Shafer structures. Nevertheless, Dempster-Shafer theory is shown to provide a useful mathematical framework for a wide range of probability bounds problems.
The dissertation concludes with the application of these new methods to the problem of propagating uncertainty from the dust parameters in a Mars atmosphere model to uncertainty in that model's prediction of atmospheric density. A thirty-day simulation of the weather at Holden Crater on Mars is conducted using a meso-scale atmosphere model, MRAMS. Although this analysis only addresses one component of Mars atmosphere uncertainty, it demonstrates the applicability of probability bounds methods in practical engineering work. More importantly, the Mars atmosphere uncertainty analysis provides a framework in which to conclusively establish the practical importance of epistemology in rigorous uncertainty quantification. / Ph. D.
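The Cartesian product propagation mentioned above can be illustrated with a small sketch. This is not the dissertation's code; the `(interval, mass)` representation of a finite Dempster-Shafer structure and all names here are illustrative assumptions.

```python
# Hypothetical sketch: Cartesian-product propagation for finite
# Dempster-Shafer structures, each represented as a list of
# ((lower, upper), mass) focal elements.

def cartesian_sum(ds_a, ds_b):
    """Propagate Z = X + Y under an independence-style assumption:
    pair every focal interval of X with every focal interval of Y,
    add the intervals, and multiply the masses."""
    return [((la + lb, ua + ub), ma * mb)
            for (la, ua), ma in ds_a
            for (lb, ub), mb in ds_b]

def bel_pl(ds, lo, hi):
    """Belief of [lo, hi] sums mass of focal elements contained in it;
    plausibility sums mass of those that intersect it."""
    bel = sum(m for (l, u), m in ds if lo <= l and u <= hi)
    pl = sum(m for (l, u), m in ds if u >= lo and l <= hi)
    return bel, pl

# Two toy DS structures on X and Y.
X = [((0.0, 1.0), 0.5), ((1.0, 2.0), 0.5)]
Y = [((0.0, 0.5), 0.4), ((0.5, 1.0), 0.6)]
Z = cartesian_sum(X, Y)

bel, pl = bel_pl(Z, 0.0, 2.0)  # bounds on P(0 <= Z <= 2)
```

The resulting `[bel, pl]` pair bounds the probability of the event, which is exactly the kind of output a probability bounds analysis delivers.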
|
2 |
Risk analysis in tunneling with imprecise probabilities / You, Xiaomin, 09 November 2010 (has links)
Due to the inherent uncertainties in ground and groundwater conditions, tunnel projects often face potential risks of cost overrun or schedule delay. Risk analysis has become a required tool (by insurers, the Federal Transit Administration, etc.) to identify and quantify risk, as well as to visualize causes, effects, and the course (chain) of events. Various efforts have been made toward risk assessment and analysis using conventional methodologies with precise probabilities. However, because of limited information or experience in similar tunnel projects, available evidence in risk assessment and analysis usually relies on judgments from experienced engineers and experts. As a result, imprecision is involved in probability evaluations. The intention of this study is to explore the use of the theory of imprecise probability as applied to risk analysis in tunneling. The goal of the methodologies proposed in this study is to deal with imprecise information without forcing the experts to commit to assessments that they do not feel comfortable with, or the analyst to pick a single distribution when the available data do not warrant such precision.
After a brief introduction to the theory of imprecise probability, different types of interaction between variables are studied, including unknown interaction, different types of independence, and correlated variables. Various algorithms are proposed for computing upper and lower bounds on previsions and conditional probabilities under an assumed interaction type. Methodologies are then developed for risk registers, event trees, fault trees, and decision trees, i.e., the standard tools in risk assessment for underground projects. Corresponding algorithms are developed and illustrated by examples. Finally, several case histories of risk analysis in tunneling are revisited using the methodologies developed in this study. All results obtained with imprecise probabilities are compared with those from precise probabilities. / text
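The flavor of the event-tree bounding described above can be sketched as follows. This is not the thesis's algorithm; the branch events, intervals, and function names are illustrative assumptions.

```python
# Illustrative sketch: bounding the probability of one event-tree path
# when each branch probability is only known to lie in an interval.
# Under an independence assumption the path probability is a product of
# branch probabilities; a product is monotone increasing in each factor,
# so the tightest bounds are the products of the interval endpoints.

def path_bounds(branches):
    """branches: list of (p_lower, p_upper) interval probabilities."""
    lo = hi = 1.0
    for p_lo, p_hi in branches:
        lo *= p_lo
        hi *= p_hi
    return lo, hi

# Hypothetical tunnel-collapse path:
# P(adverse ground) in [0.1, 0.2],
# P(support failure | adverse ground) in [0.05, 0.1].
lo, hi = path_bounds([(0.1, 0.2), (0.05, 0.1)])
```

Under other interaction assumptions (e.g., unknown interaction), the bounds would be wider and no longer obtainable by simple endpoint products, which is where the thesis's dedicated algorithms come in.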
|
3 |
An Interval Based Approach To Model Input Uncertainty In Discrete-event Simulation / Batarseh, Ola, 01 January 2010 (has links)
The objective of this research is to increase the robustness of discrete-event simulation (DES) when input uncertainties associated with models and parameters are present. Input uncertainties in simulation have different sources, including lack of data, conflicting information and beliefs, lack of introspection, measurement errors, and lack of information about dependency. A reliable solution is obtained from a simulation mechanism that accounts for these uncertainty components. An interval-based simulation (IBS) mechanism based on imprecise probabilities is proposed, where the statistical distribution parameters in simulation are intervals instead of precise real numbers. This approach incorporates both variability and uncertainty in systems. In this research, a standard procedure to estimate interval parameters of probability distributions is developed based on the measurement of simulation robustness. New mechanisms based on the inverse transform to generate interval random variates are proposed. A generic approach to specify the required replication length to achieve a desired level of robustness is derived. Furthermore, three simulation clock advancement approaches in the interval-based simulation are investigated. A library of Java-based IBS toolkits that simulates queueing systems is developed to demonstrate the proposed reliable simulation. New interval statistics for interval data analysis are proposed to support decision making. To assess the performance of the IBS, we developed an interval-based metamodel for automated material handling systems, which generates interval performance measures that are more reliable and computationally more efficient than traditional DES simulation results.
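The inverse-transform idea for interval random variates can be sketched for one concrete case. This is a minimal sketch assuming an exponential distribution whose rate is only known to lie in an interval `[lam_lo, lam_hi]`; the dissertation's actual generation mechanisms may differ, and all names here are illustrative.

```python
import math
import random

def interval_exponential_variate(u, lam_lo, lam_hi):
    """One inverse-transform step mapping a single uniform draw u to an
    interval variate. The inverse CDF of Exp(lam) is -ln(1 - u)/lam,
    which is decreasing in lam, so the interval variate's lower bound
    comes from the larger rate and its upper bound from the smaller."""
    x = -math.log(1.0 - u)
    return (x / lam_hi, x / lam_lo)

random.seed(0)            # reproducible illustration
u = random.random()       # one common uniform draw for both endpoints
lo, hi = interval_exponential_variate(u, lam_lo=1.0, lam_hi=2.0)
```

Using a single shared uniform draw for both endpoints keeps the interval variate coherent: the same underlying randomness, with only the parameter imprecision widening the output into an interval.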
|
4 |
[en] FUZZY PROBABILITY ESTIMATION FROM IMPRECISE DATA / [pt] ESTIMAÇÃO DE PROBABILIDADE FUZZY A PARTIR DE DADOS IMPRECISOS / ALEXANDRE ROBERTO RENTERIA, 20 April 2007 (has links)
[en] There are three kinds of uncertainty: one due to randomness, another due to incomplete knowledge, and a third due to vague or imprecise knowledge. Sometimes two kinds of uncertainty occur at the same time, especially in random experiments based on imprecise data. To model randomness when the probability distribution related to an experiment is unknown, a non-parametric estimation method must be used, such as the Parzen window. Uncertainty in measurement, present in any measurement of a physical quantity, gives rise to imprecise data, traditionally modelled through probability concepts. However, as probability applies to random events but does not capture their imprecision, this sort of uncertainty is better represented by a fuzzy number, through the superior probability-possibility transformation. This thesis proposes a non-parametric estimation method based on the Parzen window to estimate fuzzy probability from imprecise data.
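The Parzen-window step alone can be sketched briefly. This shows only the classical crisp-data case of estimating an event probability with a Gaussian kernel, not the thesis's fuzzy extension to imprecise data; the samples and bandwidth are illustrative assumptions.

```python
import math

def gaussian_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def parzen_probability(samples, a, b, h=0.5):
    """Parzen-window estimate of P(a <= X <= b): place a Gaussian kernel
    of bandwidth h at each sample and average the kernel mass that
    falls inside [a, b]."""
    return sum(gaussian_cdf((b - s) / h) - gaussian_cdf((a - s) / h)
               for s in samples) / len(samples)

# Three toy observations; estimate the probability of landing in [-0.5, 0.5].
p = parzen_probability([-1.0, 0.0, 1.0], -0.5, 0.5, h=0.5)
```

In the fuzzy setting the thesis develops, each observation would itself be a fuzzy number rather than a crisp point, so the estimate above would become a fuzzy probability rather than a single value.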
|
5 |
Imprecise probability analysis for integrated assessment of climate change / Kriegler, Elmar, January 2005 (has links)
We present an application of imprecise probability theory to the quantification of uncertainty in the integrated assessment of climate change. Our work is motivated by the fact that uncertainty about climate change is pervasive and therefore requires a thorough treatment in the integrated assessment process. Classical probability theory faces some severe difficulties in this respect, since it cannot capture very poor states of information in a satisfactory manner. A more general framework is provided by imprecise probability theory, which offers a similarly firm evidential and behavioural foundation, while at the same time allowing more diverse states of information to be captured. An imprecise probability describes the information in terms of lower and upper bounds on probability.

For the purpose of our imprecise probability analysis, we construct a diffusion ocean energy balance climate model that parameterises the global mean temperature response to secular trends in the radiative forcing in terms of climate sensitivity and effective vertical ocean heat diffusivity. We compare the model behaviour to the 20th century temperature record in order to derive a likelihood function for these two parameters and the forcing strength of anthropogenic sulphate aerosols. Results show a strong positive correlation between climate sensitivity and ocean heat diffusivity, and between climate sensitivity and the absolute strength of the sulphate forcing.

We identify two suitable imprecise probability classes for an efficient representation of the uncertainty about the climate model parameters and provide an algorithm to construct a belief function for the prior parameter uncertainty from a set of probability constraints that can be deduced from the literature or observational data. For the purpose of updating the prior with the likelihood function, we establish a methodological framework that allows us to perform the updating procedure efficiently for two different updating rules: Dempster's rule of conditioning and the Generalised Bayes' rule. Dempster's rule yields a posterior belief function in good qualitative agreement with previous studies that tried to constrain climate sensitivity and sulphate aerosol cooling. In contrast, we are not able to produce meaningful imprecise posterior probability bounds from the application of the Generalised Bayes' rule. We attribute this result mainly to our choice of representing the prior uncertainty by a belief function.

We project the Dempster-updated belief function for the climate model parameters onto estimates of future global mean temperature change under several emissions scenarios for the 21st century, and several long-term stabilisation policies. Within the limitations of our analysis, we find that a stringent stabilisation level of around 450 ppm carbon dioxide equivalent concentration is required to obtain a non-negligible lower probability of limiting the warming to 2 degrees Celsius. We discuss several frameworks of decision-making under ambiguity and show that they can lead to a variety of, possibly imprecise, climate policy recommendations. We find, however, that poor states of information do not necessarily impede useful policy advice.

We conclude that imprecise probabilities indeed constitute a promising candidate for the adequate treatment of uncertainty in the integrated assessment of climate change. We have constructed prior belief functions that allow much weaker assumptions on the prior state of information than a prior probability would require and, nevertheless, can be propagated through the entire assessment process. As a caveat, the updating issue needs further investigation: belief functions constitute a sensible choice for the prior uncertainty representation only if more restrictive updating rules than the Generalised Bayes' rule are available.
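The mass-function machinery behind belief-function updating can be sketched on a finite frame. This shows Dempster's rule of combination (a close relative of the rule of conditioning used in the thesis), and the frame, masses, and names below are illustrative assumptions, not the thesis's actual model.

```python
# Illustrative sketch: Dempster's rule of combination for mass functions
# on a finite frame of discernment, with focal sets as frozensets.

def dempster_combine(m1, m2):
    """Intersect every pair of focal sets, multiply masses, and
    renormalise by the total non-conflicting mass."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to the empty set
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Toy frame of climate-sensitivity ranges {low, mid, high}:
# vague prior evidence combined with likelihood-like evidence ruling out "low".
m_prior = {frozenset({"low", "mid", "high"}): 0.6, frozenset({"mid"}): 0.4}
m_like = {frozenset({"mid", "high"}): 1.0}
post = dempster_combine(m_prior, m_like)
```

Belief and plausibility of any hypothesis then follow by summing posterior masses over contained and intersecting focal sets, respectively, giving the lower and upper probability bounds discussed in the abstract.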
|
6 |
Bayesian Methods Under Unknown Prior Distributions with Applications to The Analysis of Gene Expression Data / Rahal, Abbas, 14 July 2021 (has links)
The local false discovery rate (LFDR) is one of many existing statistical methods for analyzing multiple hypothesis testing. As a Bayesian quantity, the LFDR is based on the prior probability of the null hypothesis and a mixture distribution over the null and non-null hypotheses. In practice, the LFDR is unknown and needs to be estimated. The empirical Bayes approach can be used to estimate that mixture distribution.
Empirical Bayes does not require complete information about the prior and hyperprior distributions, as hierarchical Bayes does. When there is not enough information at the prior level, empirical Bayes estimates the prior parameters from the data, often via the marginal distribution, instead of placing a distribution at the hyperprior level as in the hierarchical Bayes model.
In this research, we developed new Bayesian methods under unknown prior distributions. A set of adequate prior distributions may be defined using Bayesian model checking by setting a threshold on the posterior predictive p-value, prior predictive p-value, calibrated p-value, Bayes factor, or integrated likelihood. We derive a set of adequate posterior distributions from that set. In order to obtain a single posterior distribution instead of a set of adequate posterior distributions, we used a blended distribution, which minimizes the relative entropy of a set of adequate prior (or posterior) distributions to a "benchmark" prior (or posterior) distribution. We present two approaches to generate a blended posterior distribution, namely, updating-before-blending and blending-before-updating. The blended posterior distribution can be used to estimate the LFDR by considering the nonlocal false discovery rate as a benchmark and the different LFDR estimators as an adequate set.
The likelihood ratio can often be misleading in multiple testing, unless it is supplemented by adjusted p-values or posterior probabilities based on sufficiently strong prior distributions. In case of unknown prior distributions, they can be estimated by empirical Bayes methods or blended distributions. We propose a general framework for applying the laws of likelihood to problems involving multiple hypotheses by bringing together multiple statistical models.
We applied the proposed framework to data sets from genomics, COVID-19, and other domains.
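The LFDR as a Bayesian quantity can be made concrete with a small sketch. For illustration only, this assumes a known standard normal null, a known N(2, 1) alternative, and a known null proportion `pi0`; in practice all three are estimated, e.g., by the empirical Bayes methods the thesis discusses.

```python
import math

def normal_pdf(z, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) at z."""
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def lfdr(z, pi0=0.9, mu_alt=2.0):
    """Local false discovery rate: posterior probability that the null
    holds given test statistic z, under the two-component mixture
    f(z) = pi0 * f0(z) + (1 - pi0) * f1(z)."""
    f0 = normal_pdf(z)             # null component N(0, 1)
    f1 = normal_pdf(z, mu=mu_alt)  # alternative component N(mu_alt, 1)
    return pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)

# Large test statistics give a small LFDR (likely non-null),
# statistics near zero give an LFDR close to pi0's pull toward the null.
small = lfdr(4.0)
large = lfdr(0.0)
```

Replacing the assumed `pi0`, `f0`, and `f1` with estimates from the data, or with the blended distributions described above, is what turns this toy into the estimation problem the thesis addresses.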
|