453 |
Dynamic Bayesian models for modelling environmental space-time fields. Dou, Yiping, 05 1900 (has links)
This thesis addresses spatial interpolation and temporal prediction for air pollution data using several space-time modelling approaches. Firstly, we implement the dynamic linear modelling (DLM) approach for spatial interpolation, identify various potential problems with that approach, and develop software to implement it. Secondly, we implement a Bayesian spatial prediction (BSP) approach to model spatio-temporal ground-level ozone fields and compare its accuracy with that of the DLM. Thirdly, we develop a Bayesian version of the empirical orthogonal function (EOF) method to incorporate the uncertainties due to a temporally varying spatial process and the spatial variation at broad and fine scales. Finally, we extend the BSP into the DLM framework to develop a unified Bayesian spatio-temporal model for univariate and multivariate responses. The result generalizes a number of current approaches in this field. / Science, Faculty of / Statistics, Department of / Graduate
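For orientation, a minimal sketch of the general dynamic linear model that the abstract builds on, in the standard state-space form; the notation is generic and not taken from the thesis. The observation vector y_t at the monitoring sites is driven by a latent state theta_t that evolves in time, and spatial interpolation amounts to predicting the field at unmonitored locations from the filtered or smoothed states.

```latex
% Generic DLM (state-space) form assumed here; F_t, G_t, V_t, W_t are the
% usual design, evolution, and covariance matrices.
y_t      = F_t \, \theta_t     + v_t, \qquad v_t \sim \mathcal{N}(0, V_t)   % observation equation
\theta_t = G_t \, \theta_{t-1} + w_t, \qquad w_t \sim \mathcal{N}(0, W_t)   % state evolution equation
```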
|
454 |
Applying Bayesian belief networks in Sun Tzu's Art of War. Ang, Kwang Chien, 12 1900 (has links)
Approved for public release; distribution is unlimited. / The principles of Sun Tzu's Art of War have been widely used by business executives and military officers with much success in the realm of competition and conflict. However, when conflict situations arise in a highly stressful environment coupled with time pressure, decision makers may not be able to consider all the key concepts when forming their decisions or strategies. A structured reasoning approach may therefore be used to apply Sun Tzu's principles correctly and fully. It is believed that Sun Tzu's principles can be modeled mathematically; hence, a Bayesian network model (a mathematical tool based on probability theory) is used to capture Sun Tzu's principles and provide the structured reasoning approach. Scholars have identified an incompleteness in Sun Tzu's appreciation of information in war and his application of secret agents. This incompleteness leads to circular reasoning when both sides of a conflict apply his principles, a problem that can be resolved through the use of advanced probability theory. A Bayesian network model, however, not only provides a structured reasoning approach but, more importantly, can also resolve the circular reasoning problem that has been identified. / Captain, Singapore Army
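As a hedged illustration of how a Bayesian belief network encodes such reasoning, the toy sketch below is a hypothetical two-node example, not the model built in the thesis: it updates the belief that an opponent is using deception after observing a deliberately weakened front line.

```python
# A minimal, hypothetical sketch (not the thesis model): two binary nodes of a
# Bayesian belief network, with inference done by direct enumeration.
# P(Deception) is the prior that the enemy is using deception;
# P(WeakFront | Deception) is the chance of observing a weak front line.

p_deception = 0.3                          # prior P(D = true), assumed value
p_weak_given_d = {True: 0.8, False: 0.2}   # CPT P(WeakFront = true | D)

# Observe a weak front line; update the belief in deception with Bayes' rule.
joint_true = p_deception * p_weak_given_d[True]
joint_false = (1 - p_deception) * p_weak_given_d[False]
posterior_deception = joint_true / (joint_true + joint_false)

print(f"P(Deception | weak front observed) = {posterior_deception:.3f}")  # about 0.632
```

Larger networks chain many such conditional probability tables together, which is what makes the structured reasoning scalable to a full set of principles.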
|
455 |
Random finite sets in multi-object filtering. Vo, Ba Tuong, January 2008 (has links)
[Truncated abstract] The multi-object filtering problem is a logical and fundamental generalization of the ubiquitous single-object vector filtering problem. Multi-object filtering essentially concerns the joint detection and estimation of the unknown and time-varying number of objects present, and the dynamic state of each of these objects, given a sequence of observation sets. This problem is intrinsically challenging because, given an observation set, there is no knowledge of which object generated which measurement, if any, and the detected measurements are indistinguishable from false alarms. Multi-object filtering poses significant technical challenges, and is indeed an established area of research, with many applications in both military and commercial realms. The new and emerging approach to multi-object filtering is based on the formal theory of random finite sets, and is a natural, elegant and rigorous framework for the theory of multi-object filtering, originally proposed by Mahler. In contrast to traditional approaches, the random finite set framework is completely free of explicit data associations. The random finite set framework is adopted in this dissertation as the basis for a principled and comprehensive study of multi-object filtering. The premise of this framework is that the collection of object states and the collection of measurements at any time are each treated as random finite sets. A random finite set is simply a finite-set-valued random variable, i.e. a random variable which is random in both the number of elements and the values of the elements themselves. Consequently, formulating the multi-object filtering problem using random finite set models precisely encapsulates the essence of the multi-object filtering problem, and enables the development of principled solutions therein. '...' The performance of the proposed algorithm is demonstrated in simulated scenarios, and shown, at least in simulation, to dramatically outperform traditional single-object filtering-in-clutter approaches. The second key contribution is a mathematically principled derivation and practical implementation of a novel algorithm for multi-object Bayesian filtering, based on moment approximations to the posterior density of the random finite set state. The performance of the proposed algorithm is also demonstrated in practical scenarios, and shown to considerably outperform traditional multi-object filtering approaches. The third key contribution is a mathematically principled derivation and practical implementation of a novel algorithm for multi-object Bayesian filtering, based on functional approximations to the posterior density of the random finite set state. The performance of the proposed algorithm is compared with that of the previous one, and shown to appreciably outperform it in certain classes of situations. The final key contribution is the definition of a consistent and efficiently computable metric for multi-object performance evaluation. It is shown that the finite-set-theoretic state space formulation permits a mathematically rigorous and physically intuitive construct for measuring the estimation error of a multi-object filter, in the form of a metric. This metric is used to evaluate and compare the multi-object filtering algorithms developed in this dissertation.
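A minimal sketch of what one realisation of a random finite set looks like; this is a generic Poisson example with an assumed intensity, not taken from the dissertation. Both the number of object states and their values are random, which is the meaning of "finite-set-valued random variable" above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw one realisation of a Poisson random finite set on the unit square:
# the cardinality is Poisson, and the elements are i.i.d. uniform positions.
mean_num_objects = 4.0                       # assumed intensity (expected cardinality)
n = rng.poisson(mean_num_objects)            # random number of objects
states = rng.uniform(0.0, 1.0, size=(n, 2))  # random positions of those objects

print(f"cardinality = {n}")
print(states)                                # the realised finite set of states
```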
|
456 |
Bayesian Model Selection for High-dimensional High-throughput Data. Joshi, Adarsh, 2010 May 1900 (has links)
Bayesian methods are often criticized on the grounds of subjectivity. Furthermore, misspecified priors can have a deleterious effect on Bayesian inference. Noting that model selection is effectively a test of many hypotheses, Dr. Valen E. Johnson sought to eliminate the need for prior specification by computing Bayes' factors from frequentist test statistics. In his pioneering work published in 2005, Dr. Johnson proposed using so-called local priors for computing Bayes' factors from test statistics. Dr. Johnson and Dr. Jianhua Hu used Bayes' factors for model selection in a linear model setting. In an independent work, Dr. Johnson and another colleague, David Rossell, investigated two families of non-local priors for testing the regression parameter in a linear model setting. These non-local priors enable greater separation between the theories of the null and alternative hypotheses.

In this dissertation, I extend model selection based on Bayes' factors and use non-local priors to define Bayes' factors based on test statistics. With these priors, I have been able to reduce the problem of prior specification to the setting of just one scaling parameter. That scaling parameter can be easily set, for example, on the basis of the frequentist operating characteristics of the corresponding Bayes' factors. Furthermore, the loss of information incurred by basing a Bayes' factor on a test statistic is minimal.

Along with Dr. Johnson and Dr. Hu, I used the Bayes' factors based on the likelihood ratio statistic to develop a method for clustering gene expression data. This method has performed well in both simulated examples and real datasets. An outline of that work is also included in this dissertation. Further, I extend the clustering model to a subclass of the decomposable graphical model class, which is more appropriate for genotype data sets such as single-nucleotide polymorphism (SNP) data. Efficient FORTRAN programming has enabled me to apply the methodology to hundreds of nodes.

For problems that produce computationally harder probability landscapes, I propose a modification of the Markov chain Monte Carlo algorithm to extract information regarding the important network structures in the data. This modified algorithm performs well in inferring complex network structures. I use this method to develop a prediction model for disease based on SNP data. My method performs well in cross-validation studies.
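As a hedged illustration of the idea of computing a Bayes factor directly from a frequentist test statistic, the sketch below uses an observed z-statistic, a standard normal null, and a normal moment (non-local) prior on the noncentrality parameter. The prior form follows the published Johnson-Rossell family, but the values of z_obs and the scale tau are assumed for illustration, and none of this is the dissertation's own code.

```python
import numpy as np

def normal_pdf(x, mean=0.0, var=1.0):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

# Hypothetical illustration: Bayes factor BF10 computed from a z test statistic.
# Under H0 the statistic is N(0, 1); under H1 its noncentrality lambda is given
# a normal moment ("non-local") prior lambda^2 / tau * N(lambda; 0, tau), which
# vanishes at lambda = 0 and so separates the two hypotheses.
z_obs = 2.5          # observed test statistic (assumed value)
tau = 2.0            # prior scale, the single tuning parameter mentioned above

lam = np.linspace(-15.0, 15.0, 20001)               # integration grid
dlam = lam[1] - lam[0]
prior = lam ** 2 / tau * normal_pdf(lam, 0.0, tau)  # non-local moment prior (integrates to 1)
marginal_h1 = np.sum(normal_pdf(z_obs, lam, 1.0) * prior) * dlam
bf_10 = marginal_h1 / normal_pdf(z_obs, 0.0, 1.0)

print(f"Bayes factor BF10 at z = {z_obs}: {bf_10:.2f}")
```

The single scale tau plays the role of the one scaling parameter referred to in the abstract; its value can be tuned against the frequentist operating characteristics of the resulting Bayes factor.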
|
457 |
Inference on Markov random fields : methods and applications. Lienart, Thibaut, January 2017 (has links)
This thesis considers the problem of performing inference on undirected graphical models with continuous state spaces. These models represent conditional independence structures that can appear in the context of Bayesian machine learning. In the thesis, we focus on computational methods and applications. The aim of the thesis is to demonstrate that the factorisation structure corresponding to the conditional independence structure present in high-dimensional models can be exploited to decrease the computational complexity of inference algorithms. First, we consider the smoothing problem on Hidden Markov Models (HMMs) and discuss novel algorithms that have sub-quadratic computational complexity in the number of particles used. We show they perform on par with existing state-of-the-art algorithms with quadratic complexity. Further, a novel class of rejection-free samplers for graphical models known as the Local Bouncy Particle Sampler (LBPS) is explored and applied to a very large instance of the Probabilistic Matrix Factorisation (PMF) problem. We show the method performs slightly better than Hamiltonian Monte Carlo (HMC) methods. It is also the first such practical application of the method to a statistical model with hundreds of thousands of dimensions. In the second part of the thesis, we consider approximate Bayesian inference methods and, in particular, the Expectation Propagation (EP) algorithm. We show it can be applied as the backbone of a novel distributed Bayesian inference mechanism. Further, we discuss novel variants of the EP algorithm and show that a specific type of update mechanism, analogous to the mirror descent algorithm, outperforms all existing variants and is robust to Monte Carlo noise. Lastly, we show that EP can be used to help the Particle Belief Propagation (PBP) algorithm form cheap and adaptive proposals and significantly outperform classical PBP.
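As a small, hedged illustration of the moment-matching operation at the core of EP, the sketch below uses the standard textbook formulas for a single probit factor (not the algorithms developed in the thesis): given a Gaussian cavity N(x; mu, var) and a factor Phi(x), it computes the mean and variance of the tilted distribution proportional to their product, which EP then uses to refresh the corresponding site approximation.

```python
import math

def std_normal_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Moments of the tilted distribution proportional to N(x; mu, var) * Phi(x);
# these closed forms are standard for the probit site and are assumed here.
def tilted_moments(mu, var):
    z = mu / math.sqrt(1.0 + var)
    ratio = std_normal_pdf(z) / std_normal_cdf(z)
    new_mean = mu + var * ratio / math.sqrt(1.0 + var)
    new_var = var - (var ** 2 / (1.0 + var)) * ratio * (z + ratio)
    return new_mean, new_var

print(tilted_moments(mu=0.0, var=1.0))  # about (0.564, 0.682) for a N(0, 1) cavity
```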
|
458 |
Bayesian Framework for Sparse Vector Recovery and Parameter Bounds with Application to Compressive Sensing. January 2019 (has links)
abstract: A signal compressed using classical compression methods can be acquired by brute force (i.e., searching for non-zero entries component-wise). However, such sparse solutions require combinatorial searches with high computational cost. In this thesis, two Bayesian approaches are instead considered to recover a sparse vector from underdetermined noisy measurements. The first is constructed using a Bernoulli-Gaussian (BG) prior distribution and is assumed to be the true generative model. The second is constructed using a Gamma-Normal (GN) prior distribution and is, therefore, a different (i.e., misspecified) model. To estimate the posterior distribution in the correctly specified scenario, an algorithm based on generalized approximate message passing (GAMP) is constructed, while an algorithm based on sparse Bayesian learning (SBL) is used for the misspecified scenario. Recovering a sparse signal within a Bayesian framework is one class of algorithms for solving the sparse problem; all classes of algorithms aim to get around the high computational cost associated with combinatorial searches. Compressive sensing (CS) is the widely used term for this sparse optimization problem and its applications, such as magnetic resonance imaging (MRI), image acquisition in radar imaging, and facial recognition. In the CS literature, the target vector can be recovered either by optimizing an objective function using point estimation, or by recovering a distribution over the sparse vector using Bayesian estimation. Although the Bayesian framework provides an extra degree of freedom to assume a distribution that is directly applicable to the problem of interest, it is hard to find a theoretical guarantee of convergence. This limitation has shifted some research toward non-Bayesian frameworks. This thesis tries to close this gap by proposing a Bayesian framework with a suggested theoretical bound for the assumed, not necessarily correct, distribution. In the simulation study, a general lower bound, the Bayesian Cramér-Rao bound (BCRB), is derived along with the misspecified Bayesian Cramér-Rao bound (MBCRB) for the GN model. Both bounds are validated using the mean square error (MSE) performance of the aforementioned algorithms. A quantification of the performance in terms of gains versus losses is also introduced as one main finding of this report. / Dissertation/Thesis / Masters Thesis Computer Engineering 2019
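A minimal sketch of the problem setting described above, with illustrative (assumed) values for the dimensions, sparsity level, and noise: a sparse vector is drawn from a Bernoulli-Gaussian prior and observed through an underdetermined noisy linear measurement. The GAMP and SBL recovery algorithms themselves are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Generate a Bernoulli-Gaussian sparse vector x and measurements y = A x + n.
n, m = 200, 50            # signal length and number of measurements (m < n)
sparsity = 0.05           # Bernoulli "on" probability (assumed value)
signal_var = 1.0          # variance of the Gaussian component
noise_std = 0.01

support = rng.random(n) < sparsity                      # Bernoulli part: which entries are active
x = support * rng.normal(0.0, np.sqrt(signal_var), n)   # Gaussian part on the active entries
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))      # random measurement matrix
y = A @ x + noise_std * rng.normal(size=m)              # underdetermined noisy measurements

print(f"non-zero entries: {support.sum()} of {n}; measurements: {m}")
```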
|
459 |
Taktikröstning i kommunala val : En studie om strategiskt väljarbeteende utifrån rational choice-teorin [Tactical voting in municipal elections: a study of strategic voter behaviour based on rational choice theory]. Oskarsson, Christian, January 2016 (has links)
This bachelor's thesis addresses the question of voter behaviour in general elections: whether strategic voting (tactical voting) occurs in elections to Swedish municipal councils and, if so, which factors lie behind it. An implicit thesis about voter behaviour is that eligible voters vote in line with their party preference on the basis of a range of underlying reasons, such as policy positions, party colour, ideology and organisational structure. However, some research has emerged suggesting that certain voters act consistently and vote according to the best possible payoff, rather than for direct reasons. These indirect reasons may concern the parties' electoral alliances with other parties, something that is not always appreciated by voters. During the latter half of the twentieth century, studies of voter behaviour have attracted the attention of political scientists and behavioural scientists. One of the most cited publications in the field is by the American economist Anthony Downs, who in his book An Economic Theory of Democracy (1957) examined the relationship between political candidates and voters. The thesis presents to the reader previous studies within this specific field of research, as well as a contemporary-historical overview of so-called unholy alliances. The results demonstrate the clear importance of inter-party cooperation for how municipal residents vote in general elections.
|
460 |
LOF of logistic GEE models and cost efficient Bayesian optimal designs for nonlinear combinations of parameters in nonlinear regression models. Tang, Zhongwen, January 1900 (has links)
Doctor of Philosophy / Department of Statistics / Shie-Shien Yang / When the primary research interest is in the marginal dependence between the response and the covariates, logistic GEE (Generalized Estimating Equation) models are often used to analyze clustered binary data. Relative to ordinary logistic regression, very little work has been done to assess the lack of fit of a logistic GEE model. A new method addressing the LOF of a logistic GEE model was proposed. Simulation results indicate the proposed method performs better than or as well as other currently available LOF methods for logistic GEE models. A SAS macro was developed to implement the proposed method.
Nonlinear regression models are widely used in medical science. Before the models can be fit and their parameters interpreted, researchers need to decide which design points in a prespecified design space should be included in the experiment. Careful choices at this stage will lead to efficient use of limited resources. We proposed a cost-efficient Bayesian optimal design method for nonlinear combinations of parameters in a nonlinear model with quantitative predictors. An R package was developed to implement the proposed method.
|