31

Markov chain Monte Carlo for continuous-time discrete-state systems

Rao, V. A. P. January 2012 (has links)
A variety of phenomena are best described using dynamical models which operate on a discrete state space and in continuous time. Examples include Markov (and semi-Markov) jump processes, continuous-time Bayesian networks, renewal processes and other point processes. These continuous-time, discrete-state models are ideal building blocks for Bayesian models in fields such as systems biology, genetics, chemistry, computing networks and human-computer interaction. However, a challenge to their more widespread use is the computational burden of posterior inference, which typically involves approximations like time discretization and can be computationally intensive. In this thesis, we describe a new class of Markov chain Monte Carlo methods that allow efficient computation while still being exact. The core idea is an auxiliary variable Gibbs sampler that alternately resamples a random discretization of time given the state-trajectory of the system, and then samples a new trajectory given this discretization. We introduce this idea by relating it to a classical idea called uniformization, and use it to develop algorithms that outperform the state of the art for models based on the Markov jump process. We then extend the scope of these samplers to a wider class of models, such as nonstationary renewal processes and semi-Markov jump processes. By developing a more general framework beyond uniformization, we remedy various limitations of the original algorithms, allowing us to develop MCMC samplers for systems with infinite state spaces and unbounded rates, as well as for systems indexed by more general continuous spaces than time.
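The following is a minimal Python sketch of the kind of auxiliary-variable Gibbs step described above, for a Markov jump process prior via uniformization: thinned candidate times are resampled given the current trajectory, and a new trajectory is then drawn on that grid by forward-filtering backward-sampling with the uniformized transition matrix B = I + A/Ω. The rate matrix, the choice of Ω and the omission of likelihood terms are illustrative simplifications, not the thesis's implementation.

```python
# Sketch of one uniformization-based Gibbs sweep for a Markov jump process
# prior. A is the rate matrix, Omega > max_i |A[i, i]| the uniformization rate,
# and B = I + A / Omega the embedded discrete-time transition matrix.
# Likelihood terms are omitted; a full sampler would weight the forward pass
# by per-interval likelihoods.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_step(A, Omega, T, jump_times, states, pi0):
    """Resample the random grid given the trajectory, then the trajectory given the grid."""
    n = A.shape[0]
    B = np.eye(n) + A / Omega

    # 1) Resample thinned candidate times: while the chain sits in state s,
    #    add candidates from a Poisson process with rate Omega + A[s, s]
    #    (A[s, s] is negative, so this is Omega minus the exit rate).
    grid = [0.0]
    endpoints = list(jump_times) + [T]
    t_prev = 0.0
    for s, t_next in zip(states, endpoints):
        m = rng.poisson((Omega + A[s, s]) * (t_next - t_prev))
        grid.extend(np.sort(rng.uniform(t_prev, t_next, m)))
        if t_next < T:                  # keep the existing jump times in the grid
            grid.append(t_next)
        t_prev = t_next
    grid = np.array(grid)

    # 2) Forward-filter / backward-sample the discrete-time chain on the grid.
    K = len(grid)
    alpha = np.zeros((K, n))
    alpha[0] = pi0
    for k in range(1, K):
        alpha[k] = alpha[k - 1] @ B
        alpha[k] /= alpha[k].sum()
    z = np.zeros(K, dtype=int)
    z[-1] = rng.choice(n, p=alpha[-1])
    for k in range(K - 2, -1, -1):
        w = alpha[k] * B[:, z[k + 1]]
        z[k] = rng.choice(n, p=w / w.sum())

    # 3) Drop self-transitions to recover an MJP trajectory.
    keep = np.concatenate(([True], z[1:] != z[:-1]))
    return grid[keep][1:], z[keep]

# Toy usage: a 3-state MJP on [0, 10], starting from a trajectory with no jumps.
A = np.array([[-1.0, 0.6, 0.4], [0.5, -1.2, 0.7], [0.3, 0.9, -1.2]])
jumps, states = gibbs_step(A, Omega=2.0, T=10.0,
                           jump_times=[], states=[0], pi0=np.ones(3) / 3)
print(jumps, states)
```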
32

Spatio-temporal stochastic hybrid models of biological excitable membranes

Riedler, Martin Georg January 2011 (has links)
A large number of biological systems are intrinsically random; this is particularly true of biological excitable membranes, such as neuronal membranes, cardiac tissue or models for calcium dynamics. The present thesis is concerned with hybrid stochastic models of the spatio-temporal dynamics of biological excitable membranes using Piecewise Deterministic Markov Processes (PDMPs). This class of processes allows a precise mathematical description of the internal noise structure of excitable membranes. Overall, the aim of the thesis is twofold: on the one hand, we establish a general hybrid modelling framework for biological excitable membranes and, on the other hand, we pursue the general advances in PDMP theory that the former necessitates. Regarding the first aim, we exemplify the modelling framework on the classical Hodgkin-Huxley model of the squid giant axon. Regarding the second, we present a general PDMP theory incorporating spatial dynamics and provide tools for the analysis of such processes. Here we focus on two aspects. Firstly, we consider the approximation of PDMPs by deterministic models or continuous stochastic processes. To this end we derive, as general theoretical tools, a law of large numbers for PDMPs and martingale central limit theorems. The former establishes a connection between stochastic hybrid models and deterministic models given, e.g., by systems of partial differential equations, whereas the latter connects the stochastic fluctuations in the hybrid models to diffusion processes. Furthermore, these limit theorems provide the basis for a general Langevin approximation to PDMPs, i.e., certain stochastic partial differential equations that are expected to be similar in their dynamics to PDMPs. Secondly, we address the numerical simulation of PDMPs: we present a class of simulation methods for PDMPs in Euclidean space and analyse their convergence in the pathwise sense.
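As a loose illustration of the hybrid dynamics described above, the Python sketch below simulates a toy PDMP: a continuous voltage-like variable follows an ODE between jumps of a discrete channel state, and the state-dependent switching intensity is handled by integrating the hazard against an exponential clock. All rate functions and parameter values are invented for illustration and are not the membrane models studied in the thesis.

```python
# Toy piecewise deterministic Markov process: deterministic flow of v between
# random switches of a two-state channel, with a voltage-dependent switching
# rate. Euler time-stepping and the rate forms are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def simulate_pdmp(T=50.0, dt=0.01):
    v, channel = -65.0, 0            # continuous state, discrete state (0 = closed, 1 = open)
    t, path = 0.0, []
    threshold = rng.exponential()    # Exp(1) clock for the next jump
    accumulated = 0.0                # integrated hazard since the last jump
    while t < T:
        # Deterministic flow between jumps; the drift depends on the channel state.
        drift = -0.1 * (v + 65.0) + (1.5 if channel == 1 else 0.0)
        v += dt * drift
        # Voltage-dependent switching intensity (illustrative functional form).
        lam = np.exp(0.05 * (v + 65.0)) if channel == 0 else 1.0
        accumulated += lam * dt
        if accumulated >= threshold:  # jump of the discrete component
            channel = 1 - channel
            threshold = rng.exponential()
            accumulated = 0.0
        path.append((t, v, channel))
        t += dt
    return path

print(simulate_pdmp()[-1])
```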
33

Optimization problems in discrete and continuous time

Janiszewski, Szymon Pawel January 2001 (has links)
No description available.
34

Some contributions to the theory and methodology of Markov chain Monte Carlo

Livingstone, S. J. January 2016 (has links)
The general theme of this thesis is developing a better understanding of some Markov chain Monte Carlo methods. We review the literature in Chapters 1-4, including a short discussion of geometry in Markov chain Monte Carlo. In Chapter 5 we consider Langevin diffusions. First, a new class of these is derived in which the volatility is made position-dependent, using tools from stochastic analysis. Second, a complementary derivation is given, here using tools from Riemannian geometry. We hope that this work will help develop understanding of the geometric perspective among statisticians. Such derivations have been attempted previously, but the solutions were not correct in general; we highlight these issues in detail. In the final part we discuss the use of these objects in Markov chain Monte Carlo. In Chapter 6 we consider a Metropolis-Hastings method with proposal kernel N(x,hV(x)), where x is the current state. After reviewing instances in the literature, we analyse the ergodicity properties of the resulting Markov chains. In one dimension we find that a suitable choice of V(x) can change these properties compared to the Random Walk Metropolis case N(x,hS), for better or worse. In higher dimensions we show that a judicious choice of V(x) can produce a geometrically converging chain when probability concentrates on an ever narrower ridge as |x| grows, something which is not true for the Random Walk Metropolis. In Chapter 7 we discuss the stability of Hamiltonian Monte Carlo. For a fixed integration time we establish conditions for irreducibility and geometric ergodicity. Some results are confined to one dimension, and some further to a reference class of distributions. We find that target distributions with tails that are in between Exponential and Gaussian are needed for geometric ergodicity. Next we consider changing integration times, and show that here a geometrically ergodic chain can be constructed when the tails are heavier than Exponential.
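A minimal one-dimensional Python sketch of the position-dependent proposal N(x, hV(x)) discussed above follows, including the Hastings correction that the state-dependent variance makes necessary. The target density and the particular V(x) are illustrative choices, not ones taken from the thesis.

```python
# Metropolis-Hastings with proposal N(x, h V(x)); because V depends on the
# current state, the proposal is asymmetric and needs the full Hastings ratio.
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):                  # illustrative target: standard Gaussian
    return -0.5 * x**2

def V(x):                           # illustrative position-dependent variance
    return 1.0 + x**2

def log_q(x_to, x_from, h):         # log density of N(x_from, h V(x_from)) at x_to
    var = h * V(x_from)
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (x_to - x_from)**2 / var

def mh_chain(n_iter=5000, h=1.0, x0=0.0):
    x, chain = x0, []
    for _ in range(n_iter):
        y = x + np.sqrt(h * V(x)) * rng.standard_normal()
        log_alpha = (log_target(y) - log_target(x)
                     + log_q(x, y, h) - log_q(y, x, h))   # Hastings correction
        if np.log(rng.uniform()) < log_alpha:
            x = y
        chain.append(x)
    return np.array(chain)

print(mh_chain().mean())
```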
35

Model fit diagnostics for hidden Markov models

Kadhem, Safaa K. January 2017 (has links)
Hidden Markov models (HMMs) are an efficient tool to describe and model the underlying behaviour of many phenomena. HMMs assume that the observed data are generated independently from a parametric distribution, conditional on an unobserved process that satisfies the Markov property. Model selection, that is determining the number of hidden states for these models, is an important issue and represents the main interest of this thesis. Applying likelihood-based criteria to HMMs is a challenging task, as the likelihood function of these models is not available in closed form. Using the data augmentation approach, we derive two forms of the likelihood function of an HMM in closed form, namely the observed and the conditional likelihoods. Subsequently, we develop several modified versions of the Akaike information criterion (AIC) and Bayesian information criterion (BIC) approximated under the Bayesian principle. We also develop several versions of the deviance information criterion (DIC). These proposed versions are based on the type of likelihood, i.e. conditional or observed likelihood, and also on whether the hidden states are dealt with as missing data or as additional parameters in the model. This latter point is referred to as the concept of focus. Finally, we consider model selection from a predictive viewpoint. To this end, we develop the so-called widely applicable information criterion (WAIC). We assess the performance of these various proposed criteria via simulation studies and real-data applications. In this thesis, we apply Poisson HMMs to model spatial dependence in count data, with an application to traffic safety crashes on three highways in the UK. The ultimate interest is in identifying highway segments which have distinctly higher crash rates, and selecting an optimal number of states is an important part of the interpretation. For this purpose, we employ model selection criteria to determine the optimal number of states. We also use several goodness-of-fit checks to assess the model fitted to the data. We implement an MCMC algorithm and check its convergence, and we examine the sensitivity of the results to the prior specification, a potential problem given the small sample sizes. The Poisson HMMs adopted provide an alternative model for analysing spatial dependence on networks. It is possible to identify segments with a higher posterior probability of classification in a high-risk state, a task that could help prioritise management action.
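As a small illustration of the observed-likelihood computations underlying such criteria, the Python sketch below evaluates the marginal likelihood of a Poisson HMM by the scaled forward recursion and forms a plug-in BIC score. The toy counts, parameter values and the simple parameter count are placeholders; the thesis's criteria are approximated under the Bayesian principle rather than from a maximum-likelihood fit.

```python
# Observed (marginal) log-likelihood of a Poisson HMM via the scaled forward
# algorithm, plus a plug-in BIC score for comparing numbers of hidden states.
import numpy as np
from scipy.stats import poisson

def log_likelihood(y, pi0, P, lambdas):
    """Forward recursion with scaling for a Poisson HMM with K hidden states."""
    alpha = pi0 * poisson.pmf(y[0], lambdas)
    c = alpha.sum(); alpha /= c
    log_lik = np.log(c)
    for t in range(1, len(y)):
        alpha = (alpha @ P) * poisson.pmf(y[t], lambdas)
        c = alpha.sum(); alpha /= c
        log_lik += np.log(c)
    return log_lik

def bic(y, pi0, P, lambdas):
    K = len(lambdas)
    n_params = K * (K - 1) + K          # free transition probabilities + Poisson rates
    return -2 * log_likelihood(y, pi0, P, lambdas) + n_params * np.log(len(y))

# Toy crash-count series and an illustrative two-state parameterisation.
y = np.array([0, 1, 0, 2, 5, 7, 6, 1, 0, 0, 4, 6])
pi0 = np.array([0.5, 0.5])
P = np.array([[0.9, 0.1], [0.2, 0.8]])
lambdas = np.array([0.7, 5.5])
print(bic(y, pi0, P, lambdas))
```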
36

Representation and estimation of stochastic populations

Houssineau, Jérémie January 2015 (has links)
This work is concerned with the representation and estimation of populations composed of an uncertain and varying number of individuals which can randomly evolve in time. The existing solutions that address this type of problem make the assumption that either all or none of the individuals are distinguishable; in other words, the focus is either on specific individuals or on the population as a whole. These approaches have complementary advantages and drawbacks, and the main objective of this work is to introduce a suitable representation for partially-indistinguishable populations. In order to fulfil this objective, a sufficiently versatile way of quantifying different types of uncertainty has to be studied. It is demonstrated that this can be achieved within a measure-theoretic Bayesian paradigm. The proposed representation of stochastic populations is then used for the introduction of various filtering algorithms, from the most general to the most specific. The modelling possibilities and the accuracy of one of these filters are then demonstrated in different situations.
37

Some problems in the mathematical theory of probability

Foster, Frederic Gordon January 1952 (has links)
No description available.
38

Prediction and simulation of stochastic time series using a non-linear learning filter

Elson, Donald George January 1963 (has links)
The thesis is concerned with a specialised analogue computer which computes optimum, non-linear transfer functions using a learning technique. The machine can be used to solve filtering, prediction or simulation problems. The completion of its construction is described, and the results of several experiments on the computer, including a study of ship-motion prediction, are given. Some theoretical analysis of the method is presented, together with suggestions for future research and development.
39

Superprocesses and Large-Scale Networks

Eckhoff, Maren January 2014 (has links)
The main theme of this thesis is the use of the branching property in the analysis of random structures. The thesis consists of two self-contained parts. In the first part, we study the long-term behaviour of supercritical superdiffusions and prove the strong law of large numbers. The key tools are spine and skeleton decompositions, and the analysis of the corresponding diffusions and branching particle diffusions. In the second part, we consider preferential attachment networks and quantify their vulnerability to targeted attacks. Despite the very involved global topology, locally the network can be approximated by a multitype branching random walk with two killing boundaries. Our arguments exploit this connection.
40

Fragmentation-coalescence processes : theory and applications

Pagett, Steven January 2017 (has links)
The main objects of study in this thesis are fragmentation-coalescence processes, in which particles are grouped into clusters that evolve by either joining together, to form larger clusters, or splitting apart, to form smaller clusters. The focus is on the number of these clusters and the distribution of their sizes. In particular, for a certain class of processes defined on a finite system, we show convergence in the thermodynamic limit to an infinite system. For a second class of processes we show that there is a phase transition between a regime in which the number of clusters admits an entrance law from ∞ and one in which it does not.
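The Python sketch below simulates a toy finite-system fragmentation-coalescence dynamic of the general flavour described above: pairs of clusters coalesce at one rate, clusters shatter back into singletons at another, and the number of clusters is tracked over time. The specific rates and mechanism are illustrative and are not claimed to be the exact class of processes studied in the thesis.

```python
# Toy fragmentation-coalescence simulator: n particles, pairwise coalescence at
# rate c per pair, shattering of a cluster into singletons at rate lam per
# cluster, simulated with a standard Gillespie-style event loop.
import numpy as np

rng = np.random.default_rng(3)

def simulate(n=200, c=1.0, lam=0.5, T=20.0):
    clusters = [[i] for i in range(n)]        # start from n singletons
    t, history = 0.0, []
    while t < T:
        k = len(clusters)
        rate_coal = c * k * (k - 1) / 2       # total pairwise coalescence rate
        rate_frag = lam * k                   # total fragmentation rate
        total = rate_coal + rate_frag
        t += rng.exponential(1.0 / total)
        if rng.uniform() < rate_coal / total:
            i, j = rng.choice(k, size=2, replace=False)
            merged = clusters[i] + clusters[j]
            clusters = [cl for m, cl in enumerate(clusters) if m not in (i, j)]
            clusters.append(merged)
        else:
            i = rng.integers(k)
            shattered = [[p] for p in clusters[i]]
            clusters = clusters[:i] + clusters[i + 1:] + shattered
        history.append((t, len(clusters)))    # track the number of clusters
    return history

print(simulate()[-1])
```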
