41 |
Particle filters and Markov chains for learning of dynamical systems. Lindsten, Fredrik, January 2013.
Sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) methods provide computational tools for systematic inference and learning in complex dynamical systems, such as nonlinear and non-Gaussian state-space models. This thesis builds upon several methodological advances within these classes of Monte Carlo methods.

Particular emphasis is placed on the combination of SMC and MCMC in so-called particle MCMC algorithms. These algorithms rely on SMC for generating samples from the often highly autocorrelated state trajectory. A specific particle MCMC algorithm, referred to as particle Gibbs with ancestor sampling (PGAS), is suggested. By making use of backward-sampling ideas, albeit implemented in a forward-only fashion, PGAS enjoys good mixing even when using seemingly few particles in the underlying SMC sampler. This results in a computationally competitive particle MCMC algorithm. As illustrated in this thesis, PGAS is a useful tool for both Bayesian and frequentist parameter inference as well as for state smoothing. The PGAS sampler is successfully applied to the classical problem of Wiener system identification, and it is also used for inference in the challenging class of non-Markovian latent variable models.

Many nonlinear models encountered in practice contain some tractable substructure. As a second problem considered in this thesis, we develop Monte Carlo methods capable of exploiting such substructures to obtain more accurate estimators than would otherwise be possible. For the filtering problem, this can be done by using the well-known Rao-Blackwellized particle filter (RBPF). The RBPF is analysed in terms of asymptotic variance, resulting in an expression for the performance gain offered by Rao-Blackwellization. Furthermore, a Rao-Blackwellized particle smoother is derived, capable of addressing the smoothing problem in so-called mixed linear/nonlinear state-space models. The idea of Rao-Blackwellization is also used to develop an online algorithm for Bayesian parameter inference in nonlinear state-space models with affine parameter dependencies. / CNDM / CADICS
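To make the ancestor-sampling step concrete, here is a minimal sketch of a single PGAS sweep for a toy scalar state-space model (x_t = 0.7 x_{t-1} + v_t, y_t = x_t + e_t). The model, parameter values and function names are illustrative assumptions for this listing, not code from the thesis.

```python
import numpy as np

def pgas_sweep(y, x_ref, N=20, phi=0.7, q=0.1, r=1.0, rng=None):
    """One particle Gibbs with ancestor sampling (PGAS) sweep for the toy model
    x_t = phi * x_{t-1} + v_t,  y_t = x_t + e_t,  v ~ N(0, q), e ~ N(0, r).
    Returns a new state trajectory given the reference trajectory x_ref."""
    rng = rng or np.random.default_rng()
    T = len(y)
    x = np.zeros((T, N))
    a = np.zeros((T, N), dtype=int)                  # ancestor indices
    x[0, :N - 1] = rng.normal(0.0, 1.0, N - 1)       # draws from the prior
    x[0, N - 1] = x_ref[0]                           # conditioned (reference) particle
    logw = -0.5 * (y[0] - x[0]) ** 2 / r             # bootstrap log-weights
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        a[t, :N - 1] = rng.choice(N, size=N - 1, p=w)             # resample ancestors
        mean = phi * x[t - 1]
        x[t, :N - 1] = mean[a[t, :N - 1]] + rng.normal(0.0, np.sqrt(q), N - 1)
        x[t, N - 1] = x_ref[t]                                    # keep the reference state
        # Ancestor sampling: draw the reference particle's ancestor with weights
        # w_{t-1}^i * f(x_ref[t] | x_{t-1}^i), the forward-only backward-sampling step.
        logw_as = logw - 0.5 * (x_ref[t] - mean) ** 2 / q
        w_as = np.exp(logw_as - logw_as.max()); w_as /= w_as.sum()
        a[t, N - 1] = rng.choice(N, p=w_as)
        logw = -0.5 * (y[t] - x[t]) ** 2 / r
    w = np.exp(logw - logw.max()); w /= w.sum()
    k = rng.choice(N, p=w)                           # pick one particle at the final time
    traj = np.empty(T)
    for t in range(T - 1, -1, -1):                   # trace its ancestral lineage
        traj[t] = x[t, k]
        k = a[t, k]
    return traj
```

Iterating pgas_sweep, feeding the returned trajectory back in as the next reference, gives a Markov chain on state trajectories targeting the smoothing distribution; parameter updates can be interleaved for Gibbs-style learning.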
|
42 |
Monte Carlo methods for sampling high-dimensional binary vectors. Schäfer, Christian, 14 November 2012 (PDF).
This thesis is concerned with Monte Carlo methods for sampling high-dimensional binary vectors from complex distributions of interest. If the state space is too large for exhaustive enumeration, these methods provide a means of estimating the expected value of some function of interest with respect to the distribution. Standard approaches are mostly based on random walk type Markov chain Monte Carlo, where the equilibrium distribution of the chain is the distribution of interest and its ergodic mean converges to the expected value. We propose a novel sampling algorithm based on sequential Monte Carlo methodology which copes well with multi-modal problems by virtue of an annealing schedule. The performance of the proposed sequential Monte Carlo sampler depends on the ability to sample proposals from auxiliary distributions which are, in a certain sense, close to the current distribution of interest. The core work of this thesis discusses strategies to construct parametric families for sampling binary vectors with dependencies. The usefulness of this approach is demonstrated in the context of Bayesian variable selection and combinatorial optimization of pseudo-Boolean objective functions.
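As a rough illustration of the annealing idea, the sketch below tempers an unnormalized target pi(x) proportional to exp(score(x)) on {0,1}^d and uses an independent product-Bernoulli fit of the current particles as the proposal family. The score function, schedule and move counts are placeholders; the thesis develops much richer parametric families that capture dependencies between components.

```python
import numpy as np

def smc_binary(score, d, n_particles=500, n_steps=30, rng=None):
    """Tempered SMC for an unnormalized target pi(x) ~ exp(score(x)) on {0,1}^d.
    Proposals come from a product-Bernoulli fit of the current particles, kept
    valid by an independence Metropolis-Hastings correction at each temperature."""
    rng = rng or np.random.default_rng()
    X = rng.integers(0, 2, size=(n_particles, d))             # uniform initial particles
    s = np.array([score(x) for x in X])
    betas = np.linspace(0.0, 1.0, n_steps + 1)                # annealing schedule
    for b0, b1 in zip(betas[:-1], betas[1:]):
        logw = (b1 - b0) * s                                  # incremental weights
        w = np.exp(logw - logw.max()); w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)       # resample
        X, s = X[idx], s[idx]
        p = np.clip(X.mean(axis=0), 0.01, 0.99)               # fit product-Bernoulli proposal
        log_p1, log_p0 = np.log(p), np.log1p(-p)
        for _ in range(2):                                    # independence MH moves
            Y = (rng.random((n_particles, d)) < p).astype(int)
            sY = np.array([score(yv) for yv in Y])
            logq_Y = (Y * log_p1 + (1 - Y) * log_p0).sum(axis=1)
            logq_X = (X * log_p1 + (1 - X) * log_p0).sum(axis=1)
            accept = np.log(rng.random(n_particles)) < b1 * (sY - s) + logq_X - logq_Y
            X[accept], s[accept] = Y[accept], sY[accept]
    return X
```

The product-Bernoulli family used here is exactly the kind of crude proposal the thesis improves upon: when components of the target are strongly dependent, independence proposals are rarely accepted and richer families pay off.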
|
43 |
Calibrating high frequency trading data to agent-based models using approximate Bayesian computation. Goosen, Kelly, 4 August 2021.
We consider Sequential Monte Carlo Approximate Bayesian Computation (SMC ABC) as a method of calibration for the use of agent-based models in market micro-structure. To date, there are no successful calibrations of agent-based models to high frequency trading data. Here we test whether a more sophisticated calibration technique, SMC ABC, can achieve this feat on one of the leading agent-based models in the high frequency trading literature, the Preis-Golke-Paul-Schneider agent-based model (Preis et al., 2006). We find that, although SMC ABC's naive approach of updating distributions can successfully calibrate simple toy models, such as autoregressive moving average models, it fails to calibrate this agent-based model for high frequency trading. This may be for two key reasons: either the parameters of the model are not uniquely identifiable given the model output, or the SMC ABC rejection mechanism results in information loss, rendering parameters unidentifiable given insufficient summary statistics.
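For readers unfamiliar with the calibration machinery, here is a bare-bones SMC ABC skeleton with a population of parameter particles, a shrinking tolerance schedule and a Gaussian perturbation kernel. The simulator, prior and summary statistics are generic placeholders, not the Preis et al. model or the dissertation's implementation.

```python
import numpy as np

def smc_abc(simulate, prior_sample, prior_logpdf, summarize, y_obs,
            n_particles=200, n_rounds=6, quantile=0.5, rng=None):
    """SMC ABC with a shrinking tolerance schedule and Gaussian perturbation kernel.
    simulate(theta, rng) -> synthetic data; summarize(data) -> summary vector;
    prior_sample(rng) -> theta (1-D array); prior_logpdf(theta) -> float (-inf outside support)."""
    rng = rng or np.random.default_rng()
    s_obs = summarize(y_obs)
    theta = np.stack([prior_sample(rng) for _ in range(n_particles)])     # (N, p)
    dist = np.array([np.linalg.norm(summarize(simulate(t, rng)) - s_obs) for t in theta])
    w = np.full(n_particles, 1.0 / n_particles)
    for _ in range(n_rounds):
        eps = np.quantile(dist, quantile)                  # shrink the tolerance
        std = 2.0 * theta.std(axis=0) + 1e-12              # per-dimension kernel width
        new_theta, new_dist, new_w = [], [], []
        while len(new_theta) < n_particles:
            i = rng.choice(n_particles, p=w)               # pick a parent particle
            prop = theta[i] + std * rng.standard_normal(theta.shape[1])
            if not np.isfinite(prior_logpdf(prop)):
                continue                                   # outside the prior support
            d = np.linalg.norm(summarize(simulate(prop, rng)) - s_obs)
            if d > eps:
                continue                                   # ABC rejection step
            # weight = prior(prop) / sum_j w_j * K(prop | theta_j)
            kern = np.exp(-0.5 * (((prop - theta) / std) ** 2).sum(axis=1))
            new_w.append(np.exp(prior_logpdf(prop)) / np.dot(w, kern))
            new_theta.append(prop)
            new_dist.append(d)
        theta, dist = np.stack(new_theta), np.array(new_dist)
        w = np.array(new_w)
        w /= w.sum()
    return theta, w
```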
|
44 |
High dimensional Bayesian computation / Computation bayésienne en grande dimension. Buchholz, Alexander, 22 November 2018.
La statistique bayésienne computationnelle construit des approximations de la distribution a posteriori soit par échantillonnage, soit en construisant des approximations tractables. La contribution de cette thèse au domaine des statistiques bayésiennes est le développement d'une nouvelle méthodologie en combinant des méthodes existantes. Nos approches sont mieux adaptées à la dimension ou entraînent une réduction du coût de calcul par rapport aux méthodes existantes. Notre première contribution améliore le calcul bayésien approximatif (ABC) en utilisant le quasi-Monte Carlo (QMC). ABC permet l'inférence bayésienne dans les modèles avec une vraisemblance intractable. QMC est une technique de réduction de variance qui fournit des estimateurs plus précis d'intégrales. Notre deuxième contribution utilise le QMC pour l'inférence variationnelle (VI). VI est une méthode pour construire des approximations tractables de la distribution a posteriori. La troisième contribution développe une approche pour adapter les échantillonneurs de Monte Carlo séquentiel (SMC) lorsqu'on utilise des noyaux de mutation Hamiltonian Monte Carlo (HMC). Les échantillonneurs SMC permettent une estimation non biaisée de l'évidence du modèle, mais ils ont tendance à perdre en performance lorsque la dimension croît. HMC est une technique de Monte Carlo par chaîne de Markov qui présente des propriétés intéressantes lorsque la dimension de l'espace cible augmente, mais elle est difficile à adapter. En combinant les deux, nous construisons un échantillonneur qui tire avantage des deux. / Computational Bayesian statistics builds approximations to the posterior distribution either by sampling or by constructing tractable approximations. The contribution of this thesis to the field of Bayesian statistics is the development of new methodology by combining existing methods. Our approaches either scale better with the dimension or result in reduced computational cost compared to existing methods. Our first contribution improves approximate Bayesian computation (ABC) by using quasi-Monte Carlo (QMC). ABC allows Bayesian inference in models with intractable likelihoods. QMC is a variance reduction technique that yields precise estimates of integrals. Our second contribution takes advantage of QMC for Variational Inference (VI). VI is a method for constructing tractable approximations to the posterior distribution. The third contribution develops an approach for tuning Sequential Monte Carlo (SMC) samplers when using Hamiltonian Monte Carlo (HMC) mutation kernels. SMC samplers allow the unbiased estimation of the model evidence but tend to struggle with increasing dimension. HMC is a Markov chain Monte Carlo technique that has appealing properties when the dimension of the target space increases but is difficult to tune. By combining the two we construct a sampler that takes advantage of both.
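To give a sense of why quasi-Monte Carlo helps, the following small comparison contrasts plain Monte Carlo with randomized QMC (scrambled Sobol' points from scipy.stats.qmc) on a smooth toy integrand over the unit cube; the integrand, dimension and sample sizes are arbitrary illustrative choices, not taken from the thesis.

```python
import numpy as np
from scipy.stats import qmc

def estimate(f, d=5, n=1 << 10, n_rep=50, rng=None):
    """Compare the spread of repeated MC and randomized QMC estimates of the
    integral of f over [0,1]^d; smaller spread means a more precise estimator."""
    rng = rng or np.random.default_rng(0)
    mc, rqmc = [], []
    for _ in range(n_rep):
        u = rng.random((n, d))                          # plain Monte Carlo points
        mc.append(f(u).mean())
        sob = qmc.Sobol(d, scramble=True, seed=rng)     # scrambled Sobol' sequence
        rqmc.append(f(sob.random(n)).mean())
    return np.std(mc), np.std(rqmc)

# A smooth test integrand; randomized QMC typically shows a much smaller spread here.
f = lambda u: np.prod(1.0 + 0.5 * (u - 0.5), axis=1)
print(estimate(f))
```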
|
45 |
Monte Carlo methods for sampling high-dimensional binary vectors / Monte Carlo séquentiel pour le choix de modèle bayésien : théorie et méthodes. Schäfer, Christian, 14 November 2012.
Cette thèse est consacrée à l'étude des méthodes de Monte Carlo pour l'échantillonnage de vecteurs binaires de grande dimension à partir de lois cibles complexes. Si l'espace d'états est trop grand pour une énumération exhaustive, ces méthodes permettent d'estimer l'espérance d'une fonction d'intérêt par rapport à une loi donnée. Les approches standards sont principalement basées sur les méthodes de Monte Carlo à chaîne de Markov de type marche aléatoire, où la loi stationnaire de la chaîne est la distribution d'intérêt et la moyenne de la trajectoire converge vers l'espérance par le théorème ergodique. Nous proposons un nouvel algorithme d'échantillonnage basé sur les méthodes de Monte Carlo séquentielles, qui sont plus robustes au problème de multimodalité grâce à une étape de recuit simulé. La performance de l'échantillonneur de Monte Carlo séquentiel dépend de la capacité d'échantillonner selon des lois auxiliaires qui sont, en un certain sens, proches de la loi d'intérêt. Le travail principal de cette thèse présente des stratégies visant à construire des familles paramétriques pour l'échantillonnage de vecteurs binaires avec dépendances. L'utilité de cette approche est démontrée dans le cadre de la sélection bayésienne de variables et de l'optimisation combinatoire des fonctions pseudo-booléennes. / This thesis is concerned with Monte Carlo methods for sampling high-dimensional binary vectors from complex distributions of interest. If the state space is too large for exhaustive enumeration, these methods provide a means of estimating the expected value of some function of interest with respect to the distribution. Standard approaches are mostly based on random walk type Markov chain Monte Carlo, where the equilibrium distribution of the chain is the distribution of interest and its ergodic mean converges to the expected value. We propose a novel sampling algorithm based on sequential Monte Carlo methodology which copes well with multi-modal problems by virtue of an annealing schedule. The performance of the proposed sequential Monte Carlo sampler depends on the ability to sample proposals from auxiliary distributions which are, in a certain sense, close to the current distribution of interest. The core work of this thesis discusses strategies to construct parametric families for sampling binary vectors with dependencies. The usefulness of this approach is demonstrated in the context of Bayesian variable selection and combinatorial optimization of pseudo-Boolean objective functions.
|
46 |
Bayesian Analysis and Computational Methods for Dynamic Modeling. Niemi, Jarad, January 2009.
Dynamic models, also termed state space models, comprise an extremely rich model class for time series analysis. This dissertation focuses on building state space models for a variety of contexts and on computationally efficient methods for Bayesian inference, allowing simultaneous estimation of latent states and unknown fixed parameters.

Chapter 1 introduces state space models and methods of inference in these models. Chapter 2 describes a novel method for jointly sampling the entire latent state vector in a nonlinear Gaussian state space model using a computationally efficient adaptive mixture modeling procedure. This method is embedded in an overall Markov chain Monte Carlo algorithm for estimating fixed parameters as well as states. In Chapter 3 the method of the previous chapter is implemented in a few illustrative nonlinear models and compared to standard existing methods. This chapter also looks at the effect of the number of mixture components as well as the length of the time series on the efficiency of the method. I then turn to a biological application in Chapter 4. I discuss modeling choices as well as the derivation of the state space model to be used in this application. Parameter and state estimation are analyzed in these models for both simulated and real data. Chapter 5 extends the methodology introduced in Chapter 2 from nonlinear Gaussian models to general state space models. The method is then applied to a financial stochastic volatility model on US dollar / British pound exchange rates. Bayesian inference in the previous chapter is accomplished through Markov chain Monte Carlo, which is suitable for batch analyses but computationally limiting in sequential analysis. Chapter 6 introduces sequential Monte Carlo. It discusses two methods currently available for simultaneous sequential estimation of latent states and fixed parameters and then introduces a novel algorithm that reduces the key, limiting degeneracy issue while being usable in a wide model class. Chapter 7 implements the novel algorithm in a disease surveillance context modeling influenza epidemics. Finally, Chapter 8 suggests areas for future work in both modeling and Bayesian inference. Several appendices provide detailed technical support material as well as relevant related work. / Dissertation
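For orientation, the linear-Gaussian analogue of "jointly sampling the entire latent state vector" is the forward-filtering backward-sampling (simulation smoother) step sketched below for a scalar model x_t = a x_{t-1} + w_t, y_t = x_t + v_t; the adaptive mixture method described above can be thought of as generalizing such an exact joint draw to nonlinear models. All parameter names and values here are illustrative assumptions, not the dissertation's code.

```python
import numpy as np

def ffbs_gaussian(y, a=0.9, q=0.5, r=1.0, m0=0.0, p0=1.0, rng=None):
    """Draw one joint sample of the latent states x_{0:T-1} given y_{0:T-1}
    in the scalar linear-Gaussian model x_t = a x_{t-1} + w_t, y_t = x_t + v_t."""
    rng = rng or np.random.default_rng()
    T = len(y)
    m, p = np.empty(T), np.empty(T)
    mp, pp = m0, p0                                  # predictive mean / variance
    for t in range(T):                               # Kalman forward filter
        k = pp / (pp + r)                            # Kalman gain
        m[t] = mp + k * (y[t] - mp)
        p[t] = (1.0 - k) * pp
        mp, pp = a * m[t], a * a * p[t] + q          # predict the next state
    x = np.empty(T)
    x[T - 1] = rng.normal(m[T - 1], np.sqrt(p[T - 1]))
    for t in range(T - 2, -1, -1):                   # backward simulation pass
        g = a * p[t] / (a * a * p[t] + q)
        mean = m[t] + g * (x[t + 1] - a * m[t])
        var = p[t] - g * a * p[t]
        x[t] = rng.normal(mean, np.sqrt(var))
    return x
```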
|
47 |
Sequential Monte Carlo Methods With Applications To Communication Channels. Boddikurapati, Sirish, December 2009.
Estimating the state of a system from noisy measurements is a problem which arises in a variety of scientific and industrial areas, including signal processing, communications, statistics and econometrics. Recursive filtering is one way to achieve this, incorporating noisy observations as they become available together with prior knowledge of the system model.

Bayesian methods provide a general framework for dynamic state estimation problems. The central idea behind recursive Bayesian estimation is computing the probability density function of the state vector of the system conditioned on the measurements. However, the optimal solution to this problem is often intractable because it requires high-dimensional integration. Although we can use the Kalman filter in the case of a linear state-space model with Gaussian noise, this method is not optimal for a nonlinear and non-Gaussian system model. There are many new methods of filtering for the general case. The main emphasis of this thesis is on one such recently developed filter, the particle filter [2, 3, 6].

In this thesis, a detailed introduction to particle filters is provided, as well as some guidelines for their efficient implementation. The application of particle filters to various communication problems, such as symbol detection over channels and computation of channel capacity, is discussed.
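As a concrete reference point, below is a minimal bootstrap particle filter with resampling triggered by the effective sample size, for a generic scalar state-space model. The transition, likelihood and initialization functions are placeholders; the thesis covers proposal design, resampling schemes and other refinements beyond this sketch.

```python
import numpy as np

def bootstrap_pf(y, propagate, loglik, init, n_particles=1000, ess_frac=0.5, rng=None):
    """Bootstrap particle filter returning filtering means for each time step.
    propagate(x, rng) -> new particles; loglik(y_t, x) -> log p(y_t | x);
    init(n, rng) -> initial particles. Resamples when ESS drops below ess_frac * N."""
    rng = rng or np.random.default_rng()
    x = init(n_particles, rng)
    logw = np.zeros(n_particles)
    means = []
    for y_t in y:
        x = propagate(x, rng)                        # sample from the transition prior
        logw += loglik(y_t, x)                       # weight by the observation likelihood
        w = np.exp(logw - logw.max()); w /= w.sum()
        means.append(np.sum(w * x))
        ess = 1.0 / np.sum(w ** 2)                   # effective sample size
        if ess < ess_frac * n_particles:             # resample to fight weight degeneracy
            idx = rng.choice(n_particles, n_particles, p=w)
            x, logw = x[idx], np.zeros(n_particles)
    return np.array(means)

# Illustrative usage on the toy model x_t = 0.8 x_{t-1} + w_t, y_t = x_t + v_t:
# bootstrap_pf(y, lambda x, rng: 0.8 * x + rng.normal(0, 0.5, x.size),
#              lambda y_t, x: -0.5 * (y_t - x) ** 2,
#              lambda n, rng: rng.normal(0, 1, n))
```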
|
48 |
On the use of transport and optimal control methods for Monte Carlo simulation. Heng, Jeremy, January 2016.
This thesis explores ideas from transport theory and optimal control to develop novel Monte Carlo methods for efficient statistical computation. The first project considers the problem of constructing a transport map between two given probability measures. In the Bayesian formalism, this approach is natural when one introduces a curve of probability measures connecting the prior to the posterior by tempering the likelihood function. The main idea is to move samples from the prior using an ordinary differential equation (ODE), constructed by solving the Liouville partial differential equation (PDE) which governs the time evolution of measures along the curve. In this work, we first study the regularity that solutions of the Liouville equation should satisfy to guarantee the validity of this construction. We place an emphasis on understanding these issues, as they explain the difficulties associated with solutions that have been previously reported. After ensuring that the flow transport problem is well-defined, we give a constructive solution. However, this result is only formal, as the representation is given in terms of integrals which are intractable. For computational tractability, we propose a novel approximation of the PDE which yields an ODE whose drift depends on the full conditional distributions of the intermediate distributions. Even when the ODE is time-discretized and the full conditional distributions are approximated numerically, the resulting distribution of mapped samples can be evaluated and used as a proposal within Markov chain Monte Carlo and sequential Monte Carlo (SMC) schemes. We then illustrate experimentally that the resulting algorithm can outperform state-of-the-art SMC methods at a fixed computational complexity.

The second project aims to exploit ideas from optimal control to design more efficient SMC methods. The key idea is to control the proposal distribution induced by a time-discretized Langevin dynamics so as to minimize the Kullback-Leibler divergence of the extended target distribution from the proposal. The optimal value functions of the resulting optimal control problem can then be approximated using algorithms developed in the approximate dynamic programming (ADP) literature. We introduce a novel iterative scheme to perform ADP, provide a theoretical analysis of the proposed algorithm, and demonstrate that it can provide significant gains over state-of-the-art methods at a fixed computational complexity.
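The common starting point of both projects can be illustrated with a plain SMC sampler over a tempered path from prior to posterior, using an (uncontrolled) MALA move built on time-discretized Langevin dynamics. The sketch below shows only this baseline under placeholder log-densities; the thesis's contributions, the Liouville-flow proposals and the optimally controlled dynamics, replace or reshape the move step.

```python
import numpy as np

def tempered_smc(log_prior, log_lik, sample_prior, grad_log_tempered,
                 n_particles=500, n_steps=20, step=0.05, rng=None):
    """SMC sampler along the tempered path pi_b(x) ~ prior(x) * lik(x)**b for b in [0, 1],
    with one MALA move per particle at each temperature. Inputs are vectorized over
    particles: log_prior(x), log_lik(x) -> (N,); grad_log_tempered(x, b) -> (N, d);
    sample_prior(N, rng) -> (N, d). Returns final particles and a log-evidence estimate."""
    rng = rng or np.random.default_rng()
    x = sample_prior(n_particles, rng)
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    log_Z = 0.0
    for b0, b1 in zip(betas[:-1], betas[1:]):
        logw = (b1 - b0) * log_lik(x)                          # incremental weights
        log_Z += logw.max() + np.log(np.mean(np.exp(logw - logw.max())))
        w = np.exp(logw - logw.max()); w /= w.sum()
        x = x[rng.choice(n_particles, n_particles, p=w)]       # multinomial resampling
        # One MALA move targeting pi_{b1}: Langevin drift plus Metropolis correction.
        drift = grad_log_tempered(x, b1)
        prop = x + step * drift + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        logp_x = log_prior(x) + b1 * log_lik(x)
        logp_prop = log_prior(prop) + b1 * log_lik(prop)
        log_q_fwd = -np.sum((prop - x - step * drift) ** 2, axis=1) / (4.0 * step)
        log_q_bwd = -np.sum((x - prop - step * grad_log_tempered(prop, b1)) ** 2,
                            axis=1) / (4.0 * step)
        accept = np.log(rng.random(n_particles)) < logp_prop - logp_x + log_q_bwd - log_q_fwd
        x[accept] = prop[accept]
    return x, log_Z
```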
|
49 |
Sequential Monte Carlo methods for probabilistic forecasts and uncertainty assessment in hydrologic modeling / 逐次モンテカルロ法を用いた確率的水文予測と水文モデリングにおける不確実性評価. Noh, Seong Jin, 23 January 2013.
Kyoto University (京都大学) / 0048 / New-system course doctorate (新制・課程博士) / Doctor of Engineering (博士(工学)) / 甲第17261号 / 工博第3663号 / 新制||工||1557 (University Library) / 30018 / Department of Urban and Environmental Engineering, Graduate School of Engineering, Kyoto University / Examining committee: Prof. 椎葉 充晴, Prof. 寶 馨, Assoc. Prof. 立川 康人 / Qualifies under Article 4, Paragraph 1 of the Degree Regulations
|
50 |
Méthodes de lissage et d'estimation dans des modèles à variables latentes par des méthodes de Monte-Carlo séquentielles / Smoothing and estimation methods in hidden variable models through sequential Monte-Carlo methods. Dubarry, Cyrille, 9 October 2012.
Les modèles de chaînes de Markov cachées ou plus généralement ceux de Feynman-Kac sont aujourd'hui très largement utilisés. Ils permettent de modéliser une grande diversité de séries temporelles (en finance, biologie, traitement du signal, ...). La complexité croissante de ces modèles a conduit au développement d'approximations via différentes méthodes de Monte-Carlo, dont le Markov Chain Monte-Carlo (MCMC) et le Sequential Monte-Carlo (SMC). Les méthodes de SMC appliquées au filtrage et au lissage particulaires font l'objet de cette thèse. Elles consistent à approcher la loi d'intérêt à l'aide d'une population de particules définies séquentiellement. Différents algorithmes ont déjà été développés et étudiés dans la littérature. Nous raffinons certains de ces résultats dans le cas du Forward Filtering Backward Smoothing et du Forward Filtering Backward Simulation grâce à des inégalités de déviation exponentielle et à des contrôles non asymptotiques de l'erreur moyenne. Nous proposons également un nouvel algorithme de lissage consistant à améliorer une population de particules par des itérations MCMC, et permettant d'estimer la variance de l'estimateur sans aucune autre simulation. Une partie du travail présenté dans cette thèse concerne également les possibilités de mise en parallèle du calcul des estimateurs particulaires. Nous proposons ainsi différentes interactions entre plusieurs populations de particules. Enfin nous illustrons l'utilisation des chaînes de Markov cachées dans la modélisation de données financières en développant un algorithme utilisant l'Expectation-Maximization pour calibrer les paramètres du modèle exponentiel d'Ornstein-Uhlenbeck multi-échelles. / Hidden Markov chain models, or more generally Feynman-Kac models, are now widely used. They allow the modelling of a variety of time series (in finance, biology, signal processing, ...). Their increasing complexity gave birth to approximations using Monte-Carlo methods, among which Markov Chain Monte-Carlo (MCMC) and Sequential Monte-Carlo (SMC). SMC methods applied to particle filtering and smoothing are dealt with in this thesis. These methods consist in approximating the law of interest through a particle population defined sequentially. Different algorithms have already been developed and studied in the literature. We make some of these results more precise in the particular cases of the Forward Filtering Backward Smoothing and the Forward Filtering Backward Simulation algorithms by showing exponential deviation inequalities and by giving non-asymptotic upper bounds on the mean error. We also introduce a new smoothing algorithm improving a particle population through MCMC iterations and allowing the estimator variance to be estimated without further simulation. Part of the work presented in this thesis is devoted to the parallel computation of particle estimators. We study different interaction schemes between several particle populations. Finally, we illustrate the use of hidden Markov chains in the modelling of financial data through an algorithm using Expectation-Maximization to calibrate the exponential Ornstein-Uhlenbeck multiscale stochastic volatility model.
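For readers new to the algorithms named above, the following is a bare-bones Forward Filtering Backward Simulation (FFBSi) sketch built on a bootstrap particle filter for a generic scalar state-space model. The transition density, likelihood and initial sampler are placeholders, and none of the thesis's deviation bounds, variance estimation or parallelization schemes are reflected here.

```python
import numpy as np

def ffbsi(y, init, propagate, loglik, log_trans, n_particles=500, n_draws=100, rng=None):
    """Forward Filtering Backward Simulation: run a bootstrap particle filter,
    then draw smoothing trajectories backwards using the transition density.
    log_trans(x_next, x_prev) -> log f(x_next | x_prev), vectorized over x_prev."""
    rng = rng or np.random.default_rng()
    T = len(y)
    X = np.empty((T, n_particles))
    logW = np.empty((T, n_particles))
    x = init(n_particles, rng)
    logw = loglik(y[0], x)
    X[0], logW[0] = x, logw
    for t in range(1, T):                            # forward bootstrap filter
        w = np.exp(logw - logw.max()); w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)
        x = propagate(X[t - 1, idx], rng)
        logw = loglik(y[t], x)
        X[t], logW[t] = x, logw
    # Backward simulation of n_draws smoothing trajectories.
    trajs = np.empty((n_draws, T))
    wT = np.exp(logW[-1] - logW[-1].max()); wT /= wT.sum()
    idx = rng.choice(n_particles, n_draws, p=wT)
    trajs[:, -1] = X[-1, idx]
    for t in range(T - 2, -1, -1):
        for j in range(n_draws):                     # backward weights w_t^i * f(x_{t+1} | x_t^i)
            logb = logW[t] + log_trans(trajs[j, t + 1], X[t])
            b = np.exp(logb - logb.max()); b /= b.sum()
            trajs[j, t] = X[t, rng.choice(n_particles, p=b)]
    return trajs
```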
|