1.
Using Poisson processes for rare event simulation / De l'utilisation des processus de Poisson pour la simulation d'événements rares. Walter, Clément, 21 October 2016.
This thesis addresses the problem of rare event simulation. Starting from an original reading of Splitting methods, a new theoretical framework is proposed, independent of any particular algorithm. The framework is based on a point process associated with any real-valued random variable, and it allows probability, quantile and moment estimators to be defined without any hypothesis on that random variable. The artificial threshold selection of Splitting disappears, and the estimator of the probability of exceeding a threshold is in fact an estimator of the whole cumulative distribution function up to the given threshold. Because these estimators are built from independent and identically distributed replicas of the point process, they lend themselves to massively parallel computation, and suitable practical algorithms are proposed. Finally, the use of metamodels is sometimes necessary because these statistics still require too many evaluations of an expensive computer code. In this context the code is modelled as a random process with known distribution. The point process framework handles this additional source of uncertainty, yields simple estimates of the conditional expectation and variance of the resulting random variable, and defines new SUR enrichment criteria designed for rare event probability estimation.
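As a rough illustration of this point-process view, the sketch below implements the classical "last particle" estimator that the framework generalises, for a toy case where conditional sampling is exact. Everything here (the Exp(1) target, the particle count, the function name) is an assumption for illustration, not code from the thesis: the number of moves $M$ is Poisson-distributed with parameter $-N \log p$, and $(1 - 1/N)^M$ estimates $p = P(X > q)$, while the intermediate levels trace out the CDF along the way.

```python
# Minimal sketch of the "last particle" estimator (an assumption-laden toy,
# not the thesis code). Target: p = P(X > q) for X ~ Exp(1), where sampling
# X | X > y is exact by memorylessness (y + Exp(1)), so no MCMC kernel is needed.
import numpy as np

rng = np.random.default_rng(0)

def last_particle_estimate(q, n_particles=100):
    """Estimate P(X > q) as (1 - 1/N)^M, with M the number of moves of the
    minimal particle before every particle exceeds q."""
    y = rng.exponential(size=n_particles)    # initial i.i.d. sample
    moves = 0
    while y.min() < q:
        i = y.argmin()
        # Resample the lowest particle conditionally on exceeding its level;
        # each such move is one event of the associated Poisson process.
        y[i] = y[i] + rng.exponential()
        moves += 1
    return (1.0 - 1.0 / n_particles) ** moves

q = 8.0                                      # rare event: true p = exp(-8) ≈ 3.4e-4
print(last_particle_estimate(q), np.exp(-q))
```

Since the replicas of this process are independent, many such runs can be farmed out across a cluster and their move counts pooled, which is the parallelism the abstract refers to.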
2.
Bayesian inference in aggregated hidden Markov models. Marklund, Emil, January 2015.
Single-molecule experiments probe the kinetics of molecular biological systems. Many such studies generate data that can be described by aggregated hidden Markov models, so there is a need to perform inference on such data and models. In this study, model selection in aggregated hidden Markov models was performed with a criterion of maximum Bayesian evidence. Variational Bayes inference was seen to underestimate the evidence for aggregated model fits. Estimating the evidence integral by brute-force Monte Carlo integration converges to the correct value in theory, but far too slowly to be tractable in practice. Nested sampling is a promising method for solving this problem through faster Monte Carlo integration, but here it was seen to have difficulty generating uncorrelated samples.
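For concreteness, here is a minimal sketch of the brute-force evidence estimate mentioned above, on an assumed toy model (uniform prior, narrow Gaussian likelihood), not the models from the study: averaging the likelihood over prior draws converges to $Z = \int L(\theta)\,\pi(\theta)\,d\theta$, but almost all draws miss the likelihood peak, which is exactly why convergence is intractably slow.

```python
# Minimal sketch (assumed toy model, not the thesis code): brute-force Monte
# Carlo evidence estimation, Z ≈ mean of L(θ) over draws from the prior.
import numpy as np

rng = np.random.default_rng(1)

def log_likelihood(theta, sigma=0.001):
    # Narrow Gaussian likelihood peaked at theta = 0.5 inside a Uniform(0, 1) prior.
    return -0.5 * ((theta - 0.5) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

n_draws = 100_000
theta = rng.uniform(0.0, 1.0, size=n_draws)          # draws from the prior
log_Z = np.logaddexp.reduce(log_likelihood(theta)) - np.log(n_draws)
print(log_Z)   # true log Z ≈ 0 here, but only a ~sigma fraction of draws
               # carry essentially all the weight, hence the slow convergence
```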
3.
Bayesian methods and machine learning in astrophysics. Higson, Edward John, January 2019.
This thesis is concerned with methods for Bayesian inference and their applications in astrophysics. We principally discuss two related themes: advances in nested sampling (Chapters 3 to 5), and Bayesian sparse reconstruction of signals from noisy data (Chapters 6 and 7). Nested sampling is a popular method for Bayesian computation which is widely used in astrophysics. Following the introduction and background material in Chapters 1 and 2, Chapter 3 analyses the sampling errors in nested sampling parameter estimation and presents a method for estimating them numerically for a single nested sampling calculation. Chapter 4 introduces diagnostic tests for detecting when software has not performed the nested sampling algorithm accurately, for example due to missing a mode in a multimodal posterior. The uncertainty estimates and diagnostics in Chapters 3 and 4 are implemented in the $\texttt{nestcheck}$ software package, and both chapters describe an astronomical application of the techniques introduced. Chapter 5 describes dynamic nested sampling: a generalisation of the nested sampling algorithm which can produce large improvements in computational efficiency compared to standard nested sampling. We have implemented dynamic nested sampling in the $\texttt{dyPolyChord}$ and $\texttt{perfectns}$ software packages. Chapter 6 presents a principled Bayesian framework for signal reconstruction, in which the signal is modelled by basis functions whose number (and form, if required) is determined by the data themselves. This approach is based on a Bayesian interpretation of conventional sparse reconstruction and regularisation techniques, in which sparsity is imposed through priors via Bayesian model selection. We demonstrate our method for noisy 1- and 2-dimensional signals, including examples of processing astronomical images. The numerical implementation uses dynamic nested sampling, and uncertainties are calculated using the methods introduced in Chapters 3 and 4. Chapter 7 applies our Bayesian sparse reconstruction framework to artificial neural networks, where it allows the optimum network architecture to be determined by treating the number of nodes and hidden layers as parameters. We conclude by suggesting possible areas of future research in Chapter 8.
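As background for Chapters 3 to 5, the following is a minimal, assumption-laden sketch of the core nested sampling loop (not the thesis software, and not $\texttt{dyPolyChord}$): a toy 2-D Gaussian likelihood with a uniform prior, where the likelihood-constrained prior draw is done by plain rejection sampling. Real samplers replace that step with far more efficient schemes such as slice or region sampling.

```python
# Sketch of standard nested sampling on a toy problem (illustrative assumptions
# throughout). Estimates log Z for an unnormalised 2-D Gaussian likelihood with
# a uniform prior on [-5, 5]^2.
import numpy as np

rng = np.random.default_rng(2)

def log_L(theta):
    return -0.5 * theta @ theta                  # unnormalised Gaussian peak

n_live = 100
live = rng.uniform(-5, 5, size=(n_live, 2))      # live points drawn from the prior
live_logL = np.array([log_L(t) for t in live])
log_Z, log_X = -np.inf, 0.0                      # evidence; remaining prior volume

# Stop once the largest possible remaining contribution is ~1% of Z so far.
while live_logL.max() + log_X > log_Z + np.log(0.01):
    worst = live_logL.argmin()
    # Weight of the dead point: its likelihood times its shell of prior volume.
    log_w = live_logL[worst] + log_X + np.log(1.0 - np.exp(-1.0 / n_live))
    log_Z = np.logaddexp(log_Z, log_w)
    log_X -= 1.0 / n_live                        # expected geometric shrinkage
    # Replace the worst point with a prior draw at higher likelihood
    # (plain rejection sampling: fine for a toy, hopeless in general).
    while True:
        cand = rng.uniform(-5, 5, size=2)
        if log_L(cand) > live_logL[worst]:
            live[worst], live_logL[worst] = cand, log_L(cand)
            break

print(log_Z, np.log(2 * np.pi / 100))            # estimate vs analytic answer
```

Dynamic nested sampling, as in Chapter 5, varies the number of live points during the run so that samples are concentrated where they most reduce the error of interest.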
4.
Ursachen des Invasionserfolges von Rhododendron ponticum L. auf den Britischen Inseln: Einfluss von Habitat und Genotyp / The invasion success of Rhododendron ponticum L. in the British Isles: effects of habitat and genotypes. Erfmeier, Alexandra, 27 April 2004.
No description available.
5.
Zentrale und periphere Populationen von Hornungia petraea: Biodiversität und Demographie auf unterschiedlichen raum-zeitlichen Skalenebenen / Central and peripheral populations of Hornungia petraea: biodiversity and demography on different spatio-temporal scales. Kluth, Christian, 27 April 2004.
No description available.
6.
Computational Bayesian techniques applied to cosmology. Hee, Sonke, January 2018.
This thesis presents work around three themes: dark energy, gravitational waves and Bayesian inference. Neither dark energy nor gravitational-wave physics is yet well constrained, and both present interesting challenges for Bayesian inference, which attempts to quantify our knowledge of the universe given our astrophysical data. A dark energy equation-of-state reconstruction analysis finds that the data favour the vacuum dark energy equation of state $w = -1$. Deviations from vacuum dark energy are shown to favour the super-negative ‘phantom’ regime $w < -1$, but at low statistical significance. The constraining power of various datasets is quantified, finding that data constraints peak around redshift $z = 0.2$ due to baryon acoustic oscillation and supernova data, whilst cosmic microwave background and Lyman-$\alpha$ forest constraints are less significant. Specific models with a conformal time symmetry in the Friedmann equation, and models with an additional dark energy component, are tested and shown by Bayesian model selection to be competitive with the vacuum dark energy model; that they are not ruled out is believed to be largely due to data quality insufficient to decide between existing models. Recent detections of gravitational waves by the LIGO collaboration enable the first gravitational-wave tests of general relativity. An existing test from the literature is sped up significantly by a novel method developed in this thesis. The test computes posterior odds ratios, and the new method is shown to compute them accurately and efficiently, reducing the number of likelihood calculations required to compute evidences at a given accuracy by a factor of roughly 100. Further testing may identify a significant advance in Bayesian model selection using nested sampling, as the method is completely general and straightforward to implement. We note that efficiency gains are not guaranteed and may be problem specific; further research is needed.
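As a hedged illustration of the model-selection machinery described above (with made-up numbers, not results from the thesis): once log-evidences are in hand, the posterior odds ratio between two competing models follows in one line.

```python
# Illustrative sketch with hypothetical values (not the thesis analysis):
# posterior odds = (Z1 / Z2) * prior odds, computed in log space for stability.
import numpy as np

log_Z_vacuum  = -245.3          # hypothetical log-evidence, w = -1 model
log_Z_phantom = -246.1          # hypothetical log-evidence, w < -1 model
log_prior_odds = 0.0            # equal prior odds between the two models

log_posterior_odds = (log_Z_vacuum - log_Z_phantom) + log_prior_odds
print(np.exp(log_posterior_odds))   # ≈ 2.2 : 1 in favour of w = -1
```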