  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Small-variance asymptotics for Bayesian neural networks

Sankarapandian, Sivaramakrishnan 03 July 2018 (has links)
Bayesian neural networks (BNNs) are a rich and flexible class of models that have several advantages over standard feedforward networks, but are typically expensive to train on large-scale data. In this thesis, we explore the use of small-variance asymptotics (an approach to deriving fast algorithms from probabilistic models) on various Bayesian neural network models. We first demonstrate how small-variance asymptotics shows precise connections between standard neural networks and BNNs; for example, particular sampling algorithms for BNNs reduce to standard backpropagation in the small-variance limit. We then explore a more complex BNN where the number of hidden units is additionally treated as a random variable in the model. While standard sampling schemes would be too slow to be practical, our asymptotic approach yields a simple method for extending standard backpropagation to the case where the number of hidden units is not fixed. We show on several data sets that the resulting algorithm has benefits over backpropagation on networks with a fixed architecture.
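The thesis's derivations are not reproduced here, but the flavour of a small-variance limit can be sketched in a few lines. In this toy example (not the sampler studied in the thesis), a Langevin-type update collapses to plain gradient descent as the injected noise variance goes to zero, mirroring how BNN sampling algorithms reduce to backpropagation:

```python
import math, random

def sgld_step(w, grad_w, lr, temperature, rng):
    # Stochastic-gradient-Langevin-style step: a gradient update plus Gaussian
    # noise whose variance shrinks with the temperature. In the small-variance
    # (temperature -> 0) limit, the sampler collapses to gradient descent.
    noise = rng.gauss(0.0, math.sqrt(2.0 * lr * temperature))
    return w - lr * grad_w + noise

# Toy loss L(w) = 0.5 * (w - 3)^2 with gradient w - 3 (illustrative only).
grad = lambda w: w - 3.0
rng = random.Random(0)

w = 0.0
for _ in range(200):
    w = sgld_step(w, grad(w), lr=0.1, temperature=0.0, rng=rng)
print(round(w, 4))  # -> 3.0, the loss minimum (zero-temperature limit)
```

With temperature set to zero the trajectory is exactly deterministic gradient descent; raising the temperature recovers a sampler whose stationary distribution concentrates around the same minimum.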
2

Métodos de Monte Carlo Hamiltoniano aplicados em modelos GARCH / Hamiltonian Monte Carlo methods in GARCH models

Xavier, Cleber Martins 26 April 2019 (has links)
One of the most important pieces of information in financial markets is the variability of an asset. Several models have been proposed in the literature to evaluate this phenomenon, among them the GARCH models. This work proposes the use of the Hamiltonian Monte Carlo (HMC) method to estimate the parameters of univariate and multivariate GARCH models. Simulation studies are performed and the estimates compared with the Metropolis-Hastings estimation method available in the BayesDccGarch package. The results of the HMC method are also compared with the methodology adopted in the rstan package. Finally, an application to real data is presented using the bivariate DCC-GARCH model and the HMC and Metropolis-Hastings estimation methods.
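The quantity an HMC sampler needs from a GARCH model is its log-likelihood (and, in practice, its gradient). A minimal sketch of the univariate GARCH(1,1) Gaussian log-likelihood follows; the parameter values and return series are made up for illustration:

```python
import math

def garch11_loglik(returns, omega, alpha, beta):
    """Gaussian log-likelihood of a GARCH(1,1) model with
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    sigma2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    ll = 0.0
    for t, r in enumerate(returns):
        if t > 0:
            sigma2 = omega + alpha * returns[t - 1] ** 2 + beta * sigma2
        ll += -0.5 * (math.log(2.0 * math.pi * sigma2) + r * r / sigma2)
    return ll

# Example evaluation on a short, made-up return series.
ll = garch11_loglik([0.1, -0.2, 0.15], omega=0.1, alpha=0.1, beta=0.8)
```

In an HMC setting this function would serve as the target log-density over (omega, alpha, beta), with the gradient supplied analytically or by automatic differentiation as in rstan.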
3

Auxiliary variable Markov chain Monte Carlo methods

Graham, Matthew McKenzie January 2018 (has links)
Markov chain Monte Carlo (MCMC) methods are a widely applicable class of algorithms for estimating integrals in statistical inference problems. A common approach in MCMC methods is to introduce additional auxiliary variables into the Markov chain state and perform transitions in the joint space of target and auxiliary variables. In this thesis we consider novel methods for using auxiliary variables within MCMC methods to allow approximate inference in otherwise intractable models and to improve sampling performance in models exhibiting challenging properties such as multimodality. We first consider the pseudo-marginal framework. This extends the Metropolis–Hastings algorithm to cases where we only have access to an unbiased estimator of the density of the target distribution. The resulting chains can sometimes show ‘sticking’ behaviour where long series of proposed updates are rejected. Further, the algorithms can be difficult to tune and it is not immediately clear how to generalise the approach to alternative transition operators. We show that if the auxiliary variables used in the density estimator are included in the chain state it is possible to use new transition operators such as those based on slice-sampling algorithms within a pseudo-marginal setting. This auxiliary pseudo-marginal approach leads to easier-to-tune methods and is often able to improve sampling efficiency over existing approaches. As a second contribution we consider inference in probabilistic models defined via a generative process with the probability density of the outputs of this process only implicitly defined. The approximate Bayesian computation (ABC) framework allows inference in such models when conditioning on the values of observed model variables by making the approximation that generated observed variables are ‘close’ rather than exactly equal to observed data.
Although this makes the inference problem more tractable, the approximation error introduced in ABC methods can be difficult to quantify, and standard algorithms tend to perform poorly when conditioning on high-dimensional observations. This often requires further approximation by reducing the observations to lower-dimensional summary statistics. We show how including all of the random variables used in generating model outputs as auxiliary variables in a Markov chain state can allow the use of more efficient and robust MCMC methods such as slice sampling and Hamiltonian Monte Carlo (HMC) within an ABC framework. In some cases this can allow inference when conditioning on the full set of observed values where standard ABC methods require reduction to lower-dimensional summaries for tractability. Further, we introduce a novel constrained HMC method for performing inference in a restricted class of differentiable generative models which allows conditioning the generated observed variables to be arbitrarily close to observed data while maintaining computational tractability. As a final topic, we consider the use of an auxiliary temperature variable in MCMC methods to improve exploration of multimodal target densities and allow estimation of normalising constants. Existing approaches such as simulated tempering and annealed importance sampling use temperature variables which take on only a discrete set of values. The performance of these methods can be sensitive to the number and spacing of the temperature values used, and the discrete nature of the temperature variable prevents the use of gradient-based methods such as HMC to update the temperature alongside the target variables. We introduce new MCMC methods which instead use a continuous temperature variable. This both removes the need to tune the choice of discrete temperature values and allows the temperature variable to be updated jointly with the target variables within an HMC method.
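The pseudo-marginal idea described above is small enough to sketch. In this made-up example (a standard-normal target whose density is only available through an unbiased noisy estimator), the key detail is that the density estimate for the current state is stored and recycled rather than recomputed, which is what makes the chain target the exact distribution marginally:

```python
import math, random

rng = random.Random(1)

def density_estimate(x):
    # Unbiased, noisy estimate of an unnormalised N(0,1) density: the exact
    # value times a positive lognormal weight with mean 1 (illustrative stand-in
    # for, e.g., an importance-sampling or particle-filter estimator).
    weight = rng.lognormvariate(-0.125, 0.5)  # E[weight] = exp(-0.125 + 0.5^2/2) = 1
    return math.exp(-0.5 * x * x) * weight

def pseudo_marginal_mh(n_steps, step=1.0):
    x, p_hat = 0.0, density_estimate(0.0)
    samples = []
    for _ in range(n_steps):
        x_prop = x + rng.gauss(0.0, step)
        p_prop = density_estimate(x_prop)
        # The stored estimate p_hat for the current state is reused here;
        # re-estimating it each step would break exactness.
        if rng.random() < min(1.0, p_prop / p_hat):
            x, p_hat = x_prop, p_prop
        samples.append(x)
    return samples

samples = pseudo_marginal_mh(5000)
```

The 'sticking' behaviour mentioned in the abstract appears when p_hat happens to be a large overestimate: subsequent proposals are then rejected until a comparably lucky estimate arrives.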
4

Inverse analysis in geomechanical problems using Hamiltonian Monte Carlo / Hamiltonian Monte Carloを用いた地盤力学問題における逆解析

Koch, Michael Conrad 23 March 2020 (has links)
Kyoto University / Doctor of Agricultural Science (new-system doctoral program) / Dissertation No. 22514 (Agriculture No. 2418) / Graduate School of Agriculture, Division of Regional Environment Science / Examining committee: Prof. Murakami Akira (chief examiner), Prof. Fujiwara Masayuki, Prof. Iso Yusuke / DGAM
5

Modern Monte Carlo Methods and Their Application in Semiparametric Regression

Thomas, Samuel Joseph 05 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / The essence of Bayesian data analysis is to ascertain posterior distributions. Posteriors generally do not have closed-form expressions for direct computation in practical applications. Analysts, therefore, resort to Markov Chain Monte Carlo (MCMC) methods for the generation of sample observations that approximate the desired posterior distribution. Standard MCMC methods simulate sample values from the desired posterior distribution via random proposals. As a result, the mechanism used to generate the proposals inevitably determines the efficiency of the algorithm. One of the modern MCMC techniques designed to explore the high-dimensional space more efficiently is Hamiltonian Monte Carlo (HMC), based on the Hamiltonian differential equations. Inspired by classical mechanics, these equations incorporate a latent variable to generate MCMC proposals that are likely to be accepted. This dissertation discusses how such a powerful computational approach can be used for implementing statistical models. Along this line, I created a unified computational procedure for using HMC to fit various types of statistical models. The procedure that I proposed can be applied to a broad class of models, including linear models, generalized linear models, mixed-effects models, and various types of semiparametric regression models. To facilitate the fitting of a diverse set of models, I incorporated new parameterization and decomposition schemes to ensure the numerical performance of Bayesian model fitting without sacrificing the procedure’s general applicability. As a concrete application, I demonstrate how to use the proposed procedure to fit a multivariate generalized additive model (GAM), a nonstandard statistical model with a complex covariance structure and numerous parameters. 
Byproducts of the research include two software packages that allow practical data analysts to use the proposed computational method to fit their own models. The research's main methodological contribution is the unified computational approach it presents for Bayesian model fitting, applicable to both standard and nonstandard statistical models. Availability of such a procedure has greatly enhanced statistical modelers' toolbox for implementing new and nonstandard statistical models.
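The HMC proposal mechanism this dissertation builds on can be sketched generically. The following is a minimal one-dimensional HMC transition with a leapfrog integrator and Metropolis correction, run on a standard-normal stand-in for a model's posterior; it is an illustration of the algorithm family, not the dissertation's procedure:

```python
import math, random

def hmc_step(x, logp, grad_logp, eps, n_steps, rng):
    # One HMC transition: draw an auxiliary momentum, simulate Hamiltonian
    # dynamics with the leapfrog integrator, then accept/reject to correct
    # the discretisation error.
    p = rng.gauss(0.0, 1.0)
    x_new, p_new = x, p
    p_new += 0.5 * eps * grad_logp(x_new)      # initial half step for momentum
    for i in range(n_steps):
        x_new += eps * p_new                   # full step for position
        if i < n_steps - 1:
            p_new += eps * grad_logp(x_new)    # full step for momentum
    p_new += 0.5 * eps * grad_logp(x_new)      # final half step for momentum
    h_old = -logp(x) + 0.5 * p * p             # Hamiltonian before ...
    h_new = -logp(x_new) + 0.5 * p_new * p_new  # ... and after the trajectory
    return x_new if rng.random() < min(1.0, math.exp(h_old - h_new)) else x

# Standard-normal target as a stand-in for a posterior density.
logp = lambda x: -0.5 * x * x
grad_logp = lambda x: -x
rng = random.Random(42)

x, draws = 0.0, []
for _ in range(2000):
    x = hmc_step(x, logp, grad_logp, eps=0.25, n_steps=8, rng=rng)
    draws.append(x)
```

The gradient-informed trajectory is what lets HMC make distant, high-acceptance proposals in the high-dimensional parameter spaces of semiparametric models.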
6

Multilevel Bayesian Joint Model in Hierarchically Structured Data

Zhou, Chen (Grace) 23 August 2022 (has links)
No description available.
7

Advanced Sampling Methods for Solving Large-Scale Inverse Problems

Attia, Ahmed Mohamed Mohamed 19 September 2016 (has links)
Ensemble and variational techniques have gained wide popularity as the two main approaches for solving data assimilation and inverse problems. The majority of the methods in these two approaches are derived (at least implicitly) under the assumption that the underlying probability distributions are Gaussian. It is well accepted, however, that the Gaussianity assumption is too restrictive when applied to large nonlinear models, nonlinear observation operators, and large levels of uncertainty. This work develops a family of fully non-Gaussian data assimilation algorithms that work by directly sampling the posterior distribution. The sampling strategy is based on a Hybrid/Hamiltonian Monte Carlo (HMC) approach that can handle non-normal probability distributions. The first algorithm proposed in this work is the "HMC sampling filter", an ensemble-based data assimilation algorithm for solving the sequential filtering problem. Unlike traditional ensemble-based filters, such as the ensemble Kalman filter and the maximum likelihood ensemble filter, the proposed sampling filter naturally accommodates non-Gaussian errors and nonlinear model dynamics, as well as nonlinear observations. To test the capabilities of the HMC sampling filter, numerical experiments are carried out using the Lorenz-96 model and observation operators with different levels of nonlinearity and differentiability. The filter is also tested with a shallow water model on the sphere with a linear observation operator. Numerical results show that the sampling filter performs well even in highly nonlinear situations where the traditional filters diverge. Next, the HMC sampling approach is extended to the four-dimensional case, where several observations are assimilated simultaneously, resulting in the second member of the proposed family of algorithms.
The new algorithm, named "HMC sampling smoother", is an ensemble-based smoother for four-dimensional data assimilation that works by sampling from the posterior probability density of the solution at the initial time. The sampling smoother naturally accommodates non-Gaussian errors and nonlinear model dynamics and observation operators, and provides a full description of the posterior distribution. Numerical experiments for this algorithm are carried out using a shallow water model on the sphere with observation operators of different levels of nonlinearity. The numerical results demonstrate the advantages of the proposed method compared to the traditional variational and ensemble-based smoothing methods. The HMC sampling smoother, in its original formulation, is computationally expensive due to the innate requirement of running the forward and adjoint models repeatedly. The work therefore develops computationally efficient versions of the HMC sampling smoother based on reduced-order approximations of the underlying model dynamics. The reduced-order HMC sampling smoothers, developed as extensions to the original HMC smoother, are tested numerically using the shallow-water equations model in Cartesian coordinates. The results reveal that the reduced-order versions of the smoother are capable of accurately capturing the posterior probability density, while being significantly faster than the original full-order formulation. In the presence of nonlinear model dynamics, a nonlinear observation operator, or non-Gaussian errors, the prior distribution in the sequential data assimilation framework is not analytically tractable. In the original formulation of the HMC sampling filter, the prior distribution is approximated by a Gaussian distribution whose parameters are inferred from the ensemble of forecasts. The Gaussian prior assumption in the original HMC filter is relaxed.
Specifically, a clustering step is introduced after the forecast phase of the filter, and the prior density function is estimated by fitting a Gaussian Mixture Model (GMM) to the prior ensemble. The base filter developed following this strategy is named the cluster HMC sampling filter (ClHMC). A multi-chain version of the ClHMC filter, namely MC-ClHMC, is also proposed to guarantee that samples are taken from the vicinities of all probability modes of the formulated posterior. These methodologies are tested using a quasi-geostrophic (QG) model with double-gyre wind forcing and bi-harmonic friction. Numerical results demonstrate the usefulness of using GMMs to relax the Gaussian prior assumption in the HMC filtering paradigm. To provide a unified platform for data assimilation research, a flexible and highly extensible testing suite, named DATeS, is developed and described in this work. The core of DATeS is implemented in Python to enable object-oriented capabilities. The main components, such as the models, the data assimilation algorithms, the linear algebra solvers, and the time discretization routines, are independent of each other, so as to offer maximum flexibility to configure data assimilation studies. / Ph. D.
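The Lorenz-96 model used as a test bed above is compact enough to sketch. The grid size, forcing F = 8, and time step below are the commonly used defaults for this model, not necessarily those of the thesis:

```python
import math

def lorenz96_rhs(x, forcing=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices.
    n = len(x)
    return [(x[(i + 1) % n] - x[(i - 2) % n]) * x[(i - 1) % n] - x[i] + forcing
            for i in range(n)]

def rk4_step(x, dt, forcing=8.0):
    # Classical fourth-order Runge-Kutta step for the Lorenz-96 ODE.
    k1 = lorenz96_rhs(x, forcing)
    k2 = lorenz96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k1)], forcing)
    k3 = lorenz96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k2)], forcing)
    k4 = lorenz96_rhs([xi + dt * ki for xi, ki in zip(x, k3)], forcing)
    return [xi + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

# 40 variables at the forced equilibrium, with a small perturbation on one;
# the chaotic dynamics amplify the perturbation, which is what makes the
# model a standard stress test for filters.
state = [8.0] * 40
state[0] += 0.01
for _ in range(100):
    state = rk4_step(state, dt=0.05)
```

A filter experiment would run an ensemble of such trajectories forward between assimilation times and condition on noisy observations of a subset of the variables.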
8

Hamiltonian Monte Carlo for Reconstructing Historical Earthquake-Induced Tsunamis

Callahan, Jacob Paul 07 June 2023 (has links) (PDF)
In many areas of the world, seismic hazards pose a great risk to both human and natural populations. In particular, earthquake-induced tsunamis are especially dangerous to many areas in the Pacific. The study and quantification of these seismic events can both help scientists better understand how these natural hazards occur and help at-risk populations make better preparations for these events. However, many events of interest occurred too long ago to be recorded by modern instruments, so data on these earthquakes are sparse and unreliable. To remedy this, a Bayesian method for reconstructing the source earthquakes for these historical tsunamis based on anecdotal data, called TsunamiBayes, has been developed and used to study historical events that occurred in 1852 and 1820. One drawback of this method is the computational cost to reconstruct posterior distributions on tsunami source parameters. In this work, we improve on the TsunamiBayes method by introducing higher-order MCMC methods, specifically the Hamiltonian Monte Carlo (HMC) method, to increase the sample acceptance rate and therefore reduce computation time. Unfortunately the exact gradient for this problem is not available, and so we make use of a surrogate gradient via a neural network fitted to the forward model. We examine the effects of this surrogate-gradient HMC sampling method on the posterior distribution for an 1852 event in the Banda Sea, compare the results to those previously obtained with random-walk sampling, and note the benefits of the surrogate gradient in this context.
9

Analysing Regime-Switching and Cointegration with Hamiltonian Monte Carlo

Brandt, Jakob January 2023 (has links)
The statistical analysis of cointegration is crucial for inferring shared stochastic trends between variables and is an important area of Econometrics for analyzing long-term equilibriums in the economy. Bayesian inference of cointegration involves the identification of cointegrating vectors that are determined up to arbitrary linear combinations, for which the Gibbs sampler is often used to simulate draws from the posterior distribution. However, economic theory may not suggest linear relations, and regime-switching models can be used to account for non-linearity. Modeling cointegration and regime-switching, as well as the combination of them, is associated with highly parameterized models that can prove to be difficult for Markov Chain Monte Carlo techniques such as the Gibbs sampler. Hamiltonian Monte Carlo, which aims at efficiently exploring the posterior distribution, may thus alleviate these difficulties. Furthermore, posterior distributions with highly varying curvature in their geometries can be adequately explored by Hamiltonian Monte Carlo. The aim of the thesis is to analyze how Hamiltonian Monte Carlo performs in simulating draws from the posterior distributions of models accounting for cointegration and regime-switching. The results suggest that while it is not necessarily the case that regime-switching will be identified, Hamiltonian Monte Carlo performs well in exploring the posterior distribution. However, high rates of divergences from the true Hamiltonian trajectory reduce the algorithm to a random walk to some extent, limiting the efficiency of the sampling.
10

Efficient Sampling of Gaussian Processes under Linear Inequality Constraints

Brahmantio, Bayu Beta January 2021 (has links)
In this thesis, newer Markov Chain Monte Carlo (MCMC) algorithms are implemented and compared in terms of their efficiency in the context of sampling from Gaussian processes under linear inequality constraints. Extending a Gaussian-process framework based on the Gibbs sampler, two MCMC algorithms, Exact Hamiltonian Monte Carlo (HMC) and Analytic Elliptical Slice Sampling (ESS), are used to sample values of truncated multivariate Gaussian distributions that are used for Gaussian process regression models with linear inequality constraints. In terms of generating samples from Gaussian processes under linear inequality constraints, the proposed methods generally produce samples that are less correlated than samples from the Gibbs sampler. Time-wise, Analytic ESS is shown to be the faster choice, while Exact HMC produces the least correlated samples.
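The truncation problem this thesis addresses can be illustrated with the Gibbs-sampler baseline it compares against (not the Exact HMC or Analytic ESS implementations), on a made-up bivariate example: each full conditional of a multivariate Gaussian restricted to a box is a univariate truncated normal, sampled here by inverse-CDF:

```python
import random
from statistics import NormalDist

std_normal = NormalDist()

def sample_truncnorm(mu, sigma, lo, rng):
    # Inverse-CDF draw from N(mu, sigma^2) truncated to [lo, infinity).
    a = std_normal.cdf((lo - mu) / sigma)
    u = a + rng.random() * (1.0 - a)
    u = min(u, 1.0 - 1e-12)  # keep inv_cdf strictly inside (0, 1)
    return mu + sigma * std_normal.inv_cdf(u)

def gibbs_truncated_bvn(rho, n_samples, rng):
    # Gibbs sampler for a bivariate N(0, [[1, rho], [rho, 1]]) restricted to
    # the positive quadrant: alternate draws from the two full conditionals,
    # each of which is a univariate truncated normal.
    cond_sd = (1.0 - rho * rho) ** 0.5
    x, y, out = 1.0, 1.0, []
    for _ in range(n_samples):
        x = sample_truncnorm(rho * y, cond_sd, 0.0, rng)
        y = sample_truncnorm(rho * x, cond_sd, 0.0, rng)
        out.append((x, y))
    return out

draws = gibbs_truncated_bvn(rho=0.5, n_samples=1000, rng=random.Random(7))
```

With strong correlation the conditionals become narrow and successive draws highly autocorrelated, which is the inefficiency that motivates the Exact HMC and Analytic ESS alternatives studied in the thesis.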
