About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Une approche Bayésienne pour l'optimisation multi-objectif sous contraintes / A Bayesian approach to constrained multi-objective optimization

Feliot, Paul 12 July 2017
In this thesis, we address the problem of the derivative-free multi-objective optimization of real-valued functions subject to multiple inequality constraints. In particular, we consider a setting where the objectives and constraints of the problem are evaluated simultaneously using a potentially time-consuming computer program. To solve this problem, we propose a Bayesian optimization algorithm called BMOO. This algorithm implements a new expected improvement sampling criterion crafted to apply to potentially heavily constrained problems and to many-objective problems. This criterion stems from the use of the hypervolume of the dominated region as a loss function, where the dominated region is defined using an extended domination rule that applies jointly to the objectives and constraints. Several criteria from the Bayesian optimization literature are recovered as special cases. The criterion takes the form of an integral over the space of objectives and constraints for which no closed-form expression exists in the general case. Moreover, it must be optimized at every iteration of the algorithm. To address these difficulties, dedicated sequential Monte Carlo algorithms are also proposed. The effectiveness of BMOO is demonstrated on academic test problems and on four real-life design optimization problems.
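A minimal sketch of such an extended domination rule, assuming a simple encoding in which feasible points keep their objective values and infeasible points are compared on their constraint violations (the exact formulation in the thesis may differ):

```python
import math

def extended_dominates(y1, c1, y2, c2):
    """Sketch of an extended domination rule: candidates are compared
    jointly on objective values y (to be minimized, constraints c <= 0
    feasible) and on constraint violations max(c, 0)."""
    def lift(y, c):
        v = [max(ci, 0.0) for ci in c]            # constraint violations
        feasible = all(vi == 0.0 for vi in v)
        # Infeasible points get +inf objectives, so any feasible point
        # dominates any infeasible one; infeasible points are then
        # ranked by their violations.
        obj = list(y) if feasible else [math.inf] * len(y)
        return obj + v
    f1, f2 = lift(y1, c1), lift(y2, c2)
    return all(a <= b for a, b in zip(f1, f2)) and \
           any(a < b for a, b in zip(f1, f2))
```

Under this encoding, the usual Pareto domination on objectives is recovered when both candidates are feasible.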
32

Sequential Monte Carlo Parameter Estimation for Differential Equations

Arnold, Andrea 11 June 2014
No description available.
33

Maximum Likelihood Estimation for Stochastic Differential Equations Using Sequential Kriging-Based Optimization

Schneider, Grant W. January 2014
No description available.
34

Bayesian estimation by sequential Monte Carlo sampling for nonlinear dynamic systems

Chen, Wen-shiang 17 June 2004
No description available.
35

Advancing Sequential Monte Carlo For Model Checking, Prior Smoothing And Applications In Engineering And Science

Lang, Lixin 19 March 2008
No description available.
36

Bayesian stochastic differential equation modelling with application to finance

Al-Saadony, Muhannad January 2013
In this thesis, we consider some popular stochastic differential equation models used in finance, such as the Vasicek Interest Rate model, the Heston model and a new fractional Heston model. We discuss how to perform inference about unknown quantities associated with these models in the Bayesian framework. We describe sequential importance sampling, the particle filter and the auxiliary particle filter. We apply these inference methods to the Vasicek Interest Rate model and the standard stochastic volatility model, both to sample from the posterior distribution of the underlying processes and to update the posterior distribution of the parameters sequentially, as data arrive over time. We discuss the sensitivity of our results to prior assumptions. We then consider the use of Markov chain Monte Carlo (MCMC) methodology to sample from the posterior distribution of the underlying volatility process and of the unknown model parameters in the Heston model. The particle filter and the auxiliary particle filter are also employed to perform sequential inference. Next we extend the Heston model to the fractional Heston model, by replacing the Brownian motions that drive the underlying stochastic differential equations by fractional Brownian motions, so allowing a richer dependence structure across time. Again, we use a variety of methods to perform inference. We apply our methodology to simulated and real financial data with success. We then discuss how to make forecasts using both the Heston and the fractional Heston model. We make comparisons between the models and show that using our new fractional Heston model can lead to improved forecasts for real financial data.
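To make the sequential inference concrete, here is a minimal bootstrap particle filter for the standard stochastic volatility model mentioned above; the parameter values and the multinomial resampling scheme are illustrative assumptions, not the configurations fitted in the thesis.

```python
import math
import random

def particle_filter_sv(ys, n=500, phi=0.95, sigma=0.3, seed=0):
    """Bootstrap particle filter for the standard stochastic volatility model
        x_t = phi * x_{t-1} + sigma * eta_t,   y_t = exp(x_t / 2) * eps_t,
    with eta_t, eps_t i.i.d. standard normal. Returns the filtered means
    of the log-volatility x_t (a sketch with illustrative parameters)."""
    rng = random.Random(seed)
    # Initialize particles from the stationary distribution of x_t.
    xs = [rng.gauss(0.0, sigma / math.sqrt(1.0 - phi ** 2)) for _ in range(n)]
    means = []
    for y in ys:
        # Propagate each particle through the state equation.
        xs = [phi * x + sigma * rng.gauss(0.0, 1.0) for x in xs]
        # Weight by the observation density N(y; 0, exp(x)), up to a constant.
        logw = [-0.5 * (x + y * y * math.exp(-x)) for x in xs]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        s = sum(w)
        w = [wi / s for wi in w]
        means.append(sum(wi * xi for wi, xi in zip(w, xs)))
        # Multinomial resampling to counter weight degeneracy.
        xs = rng.choices(xs, weights=w, k=n)
    return means
```

The auxiliary particle filter would additionally look ahead to the next observation when resampling; the bootstrap version above is the simplest member of the family.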
37

On auxiliary variables and many-core architectures in computational statistics

Lee, Anthony January 2011
Emerging many-core computer architectures provide an incentive for computational methods to exhibit specific types of parallelism. Our ability to perform inference in Bayesian statistics is often dependent upon our ability to approximate expectations of functions of random variables, for which Monte Carlo methodology provides a general purpose solution using a computer. This thesis is primarily concerned with exploring the gains that can be obtained by using many-core architectures to accelerate existing population-based Monte Carlo algorithms, as well as providing a novel general framework that can be used to devise new population-based methods. Monte Carlo algorithms are often concerned with sampling random variables taking values in X whose density is known up to a normalizing constant. Population-based methods typically make use of collections of interacting auxiliary random variables, each of which is in X, in specifying an algorithm. Such methods are good candidates for parallel implementation when the collection of samples can be generated in parallel and their interaction steps are either parallelizable or negligible in cost. The first contribution of this thesis is in demonstrating the potential speedups that can be obtained for two common population-based methods, population-based Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC). The second contribution of this thesis is in the derivation of a hierarchical family of sparsity-inducing priors in regression and classification settings. Here, auxiliary variables make possible the implementation of a fast algorithm for finding local modes of the posterior density. SMC, accelerated on a many-core architecture, is then used to perform inference for a range of prior specifications to gain an understanding of sparse association signal in the context of genome-wide association studies. 
The third contribution is in the use of a new perspective on reversible MCMC kernels that allows for the construction of novel population-based methods. These methods differ from most existing methods in that one can make the resulting kernels define a Markov chain on X. A further development is that one can define kernels in which the number of auxiliary variables is given a distribution conditional on the values of the auxiliary variables obtained so far. This is perhaps the most important methodological contribution of the thesis, and the adaptation of the number of particles used within a particle MCMC algorithm provides a general purpose algorithm for sampling from a variety of complex distributions.
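The parallel structure described above (independent per-particle propagation and weighting, with only a cheap interaction step) can be sketched with a toy importance sampler; the thread pool here is a stand-in for a many-core device, and the target, proposal and sample size are illustrative assumptions.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def smc_normalizing_constant(n=1000, seed=0):
    """Toy illustration of the parallelism in population-based methods:
    per-particle work is an independent map operation, so it parallelizes
    trivially. Here importance sampling with a N(0, 2) proposal estimates
    the normalizing constant sqrt(2*pi) of the unnormalized density
    exp(-x^2 / 2)."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 2.0) for _ in range(n)]

    def weight(x):
        # Independent per particle, hence trivially parallel.
        log_target = -0.5 * x * x                          # unnormalized N(0, 1)
        log_prop = (-0.5 * (x / 2.0) ** 2
                    - math.log(2.0) - 0.5 * math.log(2.0 * math.pi))
        return math.exp(log_target - log_prop)

    with ThreadPoolExecutor() as pool:                     # stand-in for a GPU map
        ws = list(pool.map(weight, xs))
    return sum(ws) / n                                     # approx. sqrt(2*pi)
```

Resampling, the only interaction step in an SMC algorithm, is either parallelizable or negligible in cost relative to the per-particle work, which is what makes these methods good candidates for many-core acceleration.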
38

Sekvenční metody Monte Carlo / Sequential Monte Carlo Methods

Coufal, David January 2013
Title: Sequential Monte Carlo Methods Author: David Coufal Department: Department of Probability and Mathematical Statistics Supervisor: prof. RNDr. Viktor Beneš, DrSc. Abstract: The thesis summarizes the theoretical foundations of sequential Monte Carlo methods, with a focus on their application in the area of particle filters, and basic results from the theory of nonparametric kernel density estimation. This summary creates the basis for an investigation of the application of kernel methods to the approximation of densities of distributions generated by particle filters. The main results of the work are the proof of convergence of kernel estimates to the related theoretical densities and the characterization of how the approximation error develops with the time evolution of a filter. The work is completed by an experimental part demonstrating the presented algorithms by simulations in the MATLAB® computational environment. Keywords: sequential Monte Carlo methods, particle filters, nonparametric kernel estimates
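A generic sketch of such a kernel approximation, assuming a weighted particle set and a given bandwidth (bandwidth selection, a substantive issue in its own right, is not addressed here):

```python
import math

def weighted_kde(x, particles, weights, h):
    """Evaluate a Gaussian kernel density estimate built from a weighted
    particle set {(x_i, w_i)}, as used to approximate a filtering density.
    The bandwidth h is assumed given; the weights are assumed normalized."""
    const = 1.0 / (h * math.sqrt(2.0 * math.pi))
    return sum(w * const * math.exp(-0.5 * ((x - p) / h) ** 2)
               for p, w in zip(particles, weights))
```

Applied to the output of a particle filter at each time step, this turns the discrete weighted sample into a smooth density estimate.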
39

Approximate Bayesian Computation for Complex Dynamic Systems

Bonassi, Fernando Vieira January 2013
This thesis focuses on the development of ABC methods for statistical modeling in complex dynamic systems. Motivated by real applications in biology, I propose computational strategies for Bayesian inference in contexts where standard Monte Carlo methods cannot be directly applied due to the high complexity of the dynamic model and/or data limitations.

Chapter 2 focuses on stochastic bionetwork models applied to data generated from the marginal distribution of a few network nodes at snapshots in time. I present a Bayesian computational strategy, coupled with an approach to summarizing and numerically characterizing biological phenotypes that are represented in terms of the resulting sample distributions of cellular markers. ABC and mixture modeling are used to define the approach to linking mechanistic mathematical models of network dynamics to snapshot data, using a toggle switch example integrating simulated and real data as context.

Chapter 3 focuses on the application of the methodology presented in Chapter 2 to the Myc/Rb/E2F network. This network involves a relatively high number of parameters and stochastic equations in the model specification and is thus substantially more complex than the toggle switch example. The analysis of the Myc/Rb/E2F network is performed with simulated and real data. I demonstrate that the proposed method can indicate which parameters can be learned about using the marginal data.

In Chapter 4, I present an ABC SMC method that uses data-based adaptive weights. This easily implemented and computationally trivial extension of ABC SMC can substantially improve acceptance rates. This is demonstrated through a series of examples with simulated and real data, including the toggle switch example. Theoretical justification is also provided to explain why this method is expected to improve the effectiveness of ABC SMC.

In Chapter 5, I present an integrated Bayesian computational strategy for fitting complex dynamic models to sparse time-series data. This is applied to experimental data from an immunization response study with Indian rhesus macaques. The computational strategy consists of two stages: first, MCMC is implemented based on simplified sampling steps, and then the resulting approximate output is used to generate a proposal distribution for the parameters, yielding an efficient ABC procedure. The incorporation of ABC as a correction tool improves the model fit, as demonstrated through predictive posterior analysis on the data sets of the study.

Chapter 6 presents additional discussion and comments on potential future research directions. / Dissertation
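One way to sketch the data-based adaptive weighting idea of Chapter 4 (the function names, Gaussian kernel choice and perturbation scheme below are assumptions for illustration, not the thesis's exact algorithm):

```python
import math
import random

def abc_smc_step(particles, sims, y_obs, simulate, prior_pdf,
                 eps, n, scale, seed=0):
    """One ABC SMC iteration with data-based adaptive proposal weights
    (a sketch): particles whose previous simulated data sit closer to the
    observation y_obs are preferentially resampled before perturbation,
    which tends to improve acceptance rates."""
    rng = random.Random(seed)
    # Data-based weights: a kernel on the distance between each particle's
    # past simulated data and the observation.
    w = [math.exp(-0.5 * ((s - y_obs) / eps) ** 2) for s in sims]
    tot = sum(w)
    w = [wi / tot for wi in w]
    new_parts, new_sims = [], []
    while len(new_parts) < n:
        theta = rng.choices(particles, weights=w, k=1)[0]
        theta_new = theta + rng.gauss(0.0, scale)     # perturbation kernel
        if prior_pdf(theta_new) == 0.0:
            continue                                  # outside prior support
        s = simulate(theta_new, rng)
        if abs(s - y_obs) <= eps:                     # ABC acceptance
            new_parts.append(theta_new)
            new_sims.append(s)
    return new_parts, new_sims
```

In the standard ABC SMC algorithm the resampling weights depend only on the prior and the previous proposal; the change here is that the stored simulated data also enter the weights.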
40

Advances in Bayesian Modelling and Computation: Spatio-Temporal Processes, Model Assessment and Adaptive MCMC

Ji, Chunlin January 2009
The modelling and analysis of complex stochastic systems with increasingly large data sets, state-spaces and parameters provides major stimulus to research in Bayesian nonparametric methods and Bayesian computation. This dissertation presents advances in both nonparametric modelling and statistical computation, stimulated by challenging problems of analysis in complex spatio-temporal systems and by core computational issues in model fitting and model assessment.

The first part of the thesis, chapters 2 to 4, concerns novel nonparametric Bayesian mixture models for spatial point processes, with advances in modelling, computation and applications in biological contexts. Chapter 2 describes and develops models for spatial point processes in which the point outcomes are latent, where indirect observations related to the point outcomes are available, and in which the underlying spatial intensity functions are typically highly heterogeneous. Spatial intensities of inhomogeneous Poisson processes are represented via flexible nonparametric Bayesian mixture models. Computational approaches are presented for this new class of spatial point process mixtures and extended to the context of unobserved point process outcomes. Two examples drawn from a central, motivating context, that of immunofluorescence histology analysis in biological studies generating high-resolution imaging data, demonstrate the modelling approach and computational methodology. Chapters 3 and 4 extend this framework to define a class of flexible Bayesian nonparametric models for inhomogeneous spatio-temporal point processes, adding dynamic models for underlying intensity patterns. Dependent Dirichlet process mixture models are introduced as core components of this new time-varying spatial model. Utilizing such nonparametric mixture models for the spatial process intensity functions allows the introduction of time variation via dynamic, state-space models for parameters characterizing the intensities. Bayesian inference and model fitting are addressed via novel particle filtering ideas and methods. Illustrative simulation examples include studies in problems of extended target tracking and substantive data analysis in cell fluorescent microscopic imaging tracking problems.

The second part of the thesis, chapters 5 and 6, concerns advances in computational methods for some core and generic Bayesian inferential problems. Chapter 5 develops a novel approach to estimating upper and lower bounds for marginal likelihoods in Bayesian modelling using refinements of existing variational methods. Traditional variational approaches only provide lower-bound estimation; this new lower/upper bound analysis provides accurate and tight bounds in many problems, facilitating more reliable computation for Bayesian model comparison while also providing a way to assess the adequacy of variational densities as approximations to exact, intractable posteriors. The advances also include demonstration of the significant improvements that may be achieved in marginal likelihood estimation by marginalizing some parameters in the model. A distinct contribution to Bayesian computation is covered in Chapter 6. This concerns a generic framework for designing adaptive MCMC algorithms, emphasizing the adaptive Metropolized independence sampler and an effective adaptation strategy using a family of mixture-distribution proposals. This work is coupled with the development of a novel adaptive approach to computation in nonparametric modelling with large data sets; here a sequential learning approach is defined that iteratively utilizes smaller data subsets. Under the general framework of importance-sampling-based marginal likelihood computation, the proposed adaptive Monte Carlo method and sequential learning approach can facilitate improved accuracy in marginal likelihood computation. The approaches are exemplified in studies of both synthetic data examples and a real data analysis arising in astro-statistics.

Finally, chapter 7 summarizes the dissertation and discusses possible extensions of the specific modelling and computational innovations, as well as potential future work. / Dissertation
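A simplified sketch of an adaptive Metropolized independence sampler, assuming a single Gaussian proposal that is periodically refit to the chain in place of the mixture proposals used in the thesis (the parameter choices throughout are hypothetical):

```python
import math
import random

def adaptive_independence_mh(logpost, n_iter=2000, seed=0):
    """Adaptive Metropolized independence sampler (sketch): proposals are
    drawn independently of the current state from N(mu, sd^2), and (mu, sd)
    are periodically refit to the samples collected so far. Adaptation
    during sampling is only valid under diminishing-adaptation conditions;
    this toy version ignores that subtlety."""
    rng = random.Random(seed)
    mu, sd = 0.0, 5.0          # deliberately broad initial proposal
    x = 0.0
    chain = []
    for i in range(1, n_iter + 1):
        z = rng.gauss(mu, sd)
        def logq(v):
            # Gaussian proposal log-density, up to an additive constant.
            return -0.5 * ((v - mu) / sd) ** 2 - math.log(sd)
        # Independence MH ratio: the proposal density enters on both sides.
        log_alpha = (logpost(z) - logpost(x)) + (logq(x) - logq(z))
        if math.log(rng.random() + 1e-300) < log_alpha:
            x = z
        chain.append(x)
        if i % 500 == 0:       # periodic adaptation from the chain history
            m = sum(chain) / len(chain)
            var = sum((c - m) ** 2 for c in chain) / len(chain)
            mu, sd = m, max(math.sqrt(var), 0.1)
    return chain
```

The mixture-proposal strategy emphasized in Chapter 6 replaces the single Gaussian with a fitted mixture, which can track multimodal posteriors that a single-component proposal would cover poorly.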
