151

Essays on econometric modeling of subjective perceptions of risks in environment and human health

Nguyen, To Ngoc 15 May 2009 (has links)
A large body of literature studies option price and other ex-ante welfare measures under microeconomic theory in order to value reductions of the risks inherent in environment and human health. However, it does not offer a careful discussion of how to estimate risk reduction values from data, especially how to model and estimate the individual perceptions of risk that enter the econometric models. The central theme of my dissertation is the approaches taken for the empirical estimation of probabilistic risks under alternative assumptions about the individual perceptions of risk involved: the objective probability, the Savage subjective probability, and the subjective distribution of probability. Each of these three types of risk specification is covered in one of the three essays. The first essay addresses the problem of empirically estimating individual willingness to pay for recreation access to public land under uncertainty. In this essay I developed an econometric model and applied it to the case of lottery-rationed hunting permits. The empirical results show that the model correctly predicts the responses of 84% of the respondents in the Maine moose hunting survey. The second essay addresses the estimation of a logit model for individual binary choices that involve heterogeneity in subjective probabilities. For this problem, I introduce a hierarchical Bayes approach to estimate, among other quantities, the parameters of the distribution of subjective probabilities. A Monte Carlo study finds the estimator to be asymptotically unbiased and efficient. The third essay addresses the problem of modeling perceived mortality risks from arsenic concentrations in drinking water. I estimated a formal model that allows for ambiguity about risk. The empirical findings revealed that perceived risk was positively associated with exposure levels and also related to individuating factors, in particular smoking habits and current health status. Further evidence was found that the variance of the perceived risk distribution is non-zero. In all, the three essays contribute methodological approaches and provide empirical examples for developing empirical models and estimating the value of risk reductions in environment and human health, given the assumptions about individuals' perceptions of risk and, accordingly, reasonable specifications of the risks involved in the models.
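The essay's hierarchical Bayes idea — treating each respondent's subjective probability as a latent draw from a population distribution whose parameters are the estimation target — can be illustrated with a small Metropolis-within-Gibbs sketch. This is not the dissertation's estimator: the logit choice structure and covariates are omitted, and the Beta population distribution, the repeated-response design, and all parameter values are assumptions made purely for illustration.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Toy data: each respondent holds an unobserved subjective probability
# p_i ~ Beta(a, b) and gives T binary responses driven by it.
n, T = 300, 5
a_true, b_true = 2.0, 5.0
p_true = rng.beta(a_true, b_true, n)
k = rng.binomial(T, p_true)                    # number of "yes" answers per person

def beta_logpdf_sum(p, a, b):
    """Sum of log Beta(a, b) densities over the latent probabilities."""
    return np.sum((a - 1) * np.log(p) + (b - 1) * np.log(1 - p)
                  + gammaln(a + b) - gammaln(a) - gammaln(b))

# Hierarchical Bayes by Metropolis-within-Gibbs:
#   p_i | a, b, data ~ Beta(a + k_i, b + T - k_i)        (conjugate Gibbs step)
#   (log a, log b) | p  by random-walk Metropolis under a flat prior on (a, b)
a, b = 1.0, 1.0
draws = []
for it in range(10000):
    p = rng.beta(a + k, b + T - k)             # refresh the latent probabilities
    la, lb = np.log(a), np.log(b)
    la_p, lb_p = la + 0.1 * rng.standard_normal(), lb + 0.1 * rng.standard_normal()
    a_p, b_p = np.exp(la_p), np.exp(lb_p)
    log_r = (beta_logpdf_sum(p, a_p, b_p) - beta_logpdf_sum(p, a, b)
             + (la_p + lb_p) - (la + lb))      # Jacobian of the log transform
    if np.log(rng.uniform()) < log_r:
        a, b = a_p, b_p
    draws.append((a, b))

draws = np.array(draws[2000:])
print("posterior means of (a, b):", np.round(draws.mean(axis=0), 2))
```

The conjugate Gibbs step keeps the latent probabilities cheap to refresh, while the Metropolis step targets the parameters of the subjective-probability distribution, which is the quantity of interest in the essay.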
152

History matching and uncertainty quantification using sampling methods

Ma, Xianlin 15 May 2009 (has links)
Uncertainty quantification involves sampling the reservoir parameters correctly from a posterior probability distribution that is conditioned on both static and dynamic data. Rigorous sampling methods such as Markov chain Monte Carlo (MCMC) are known to sample from the correct distribution but can be computationally prohibitive for high-resolution reservoir models. Approximate sampling methods are more efficient but less rigorous for nonlinear inverse problems. There is therefore a need for an approach to uncertainty quantification that is both efficient and rigorous for nonlinear inverse problems. First, we propose a two-stage MCMC approach using sensitivities for quantifying uncertainty in history matching geological models. In the first stage, we compute the acceptance probability for a proposed change in reservoir parameters based on a linearized approximation to the flow simulation in a small neighborhood of the previously computed dynamic data. In the second stage, proposals that pass the first-stage criterion are assessed by running full flow simulations, which ensures rigor. Second, we propose a two-stage MCMC approach using response surface models for quantifying uncertainty. The formulation allows us to history match three-phase flow data simultaneously. The fitted response surface is independent of the expensive flow simulation and provides efficient proposals for the reservoir simulation and MCMC in the second stage. Third, we propose a two-stage MCMC approach using upscaling and non-parametric regression for quantifying uncertainty. A coarse-grid model acts as a surrogate for the fine-grid model through flow-based upscaling. The response of the coarse-scale model is corrected by error modeling via non-parametric regression to approximate the response of the computationally expensive fine-scale model. Our proposed two-stage sampling approaches are computationally efficient and rigorous, with a significantly higher acceptance rate than traditional MCMC algorithms. Finally, we developed a coarsening algorithm to determine an optimal reservoir simulation grid by grouping fine-scale layers in such a way that the heterogeneity measure of a defined static property is minimized within the layers. The optimal number of layers is then selected based on a statistical analysis. The power and utility of our approaches have been demonstrated using both synthetic and field examples.
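All three contributions share the same two-stage (delayed-acceptance) structure: a cheap surrogate screens proposals, only survivors are evaluated with the expensive simulator, and a second acceptance step keeps the chain targeting the exact posterior. The sketch below illustrates that generic mechanism with toy stand-ins (a 2-D Gaussian "fine" posterior and a deliberately crude "coarse" approximation); the function names, targets, and step sizes are assumptions, not the reservoir models used in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins: the "fine" posterior is treated as expensive (here a correlated
# 2-D Gaussian), the "coarse" posterior is a cheap approximation of it.
cov_fine = np.array([[1.0, 0.9], [0.9, 1.0]])
prec_fine = np.linalg.inv(cov_fine)

def log_post_fine(m):      # expensive model (e.g. a full flow simulation)
    return -0.5 * m @ prec_fine @ m

def log_post_coarse(m):    # cheap surrogate (e.g. linearized or upscaled model)
    return -0.5 * (m @ m)  # deliberately ignores the correlation

def two_stage_mcmc(n_iter=20000, step=0.8):
    m = np.zeros(2)
    lp_f, lp_c = log_post_fine(m), log_post_coarse(m)
    chain, fine_calls = [], 0
    for _ in range(n_iter):
        prop = m + step * rng.standard_normal(2)
        lp_c_prop = log_post_coarse(prop)
        # Stage 1: screen the proposal using only the coarse model
        if np.log(rng.uniform()) >= lp_c_prop - lp_c:
            chain.append(m.copy())
            continue
        # Stage 2: correct with the fine model so the chain targets it exactly
        lp_f_prop = log_post_fine(prop)
        fine_calls += 1
        log_alpha2 = (lp_f_prop - lp_f) - (lp_c_prop - lp_c)
        if np.log(rng.uniform()) < log_alpha2:
            m, lp_f, lp_c = prop, lp_f_prop, lp_c_prop
        chain.append(m.copy())
    return np.array(chain), fine_calls

chain, fine_calls = two_stage_mcmc()
print("fine-model evaluations:", fine_calls, "out of", len(chain), "iterations")
```

The point of the construction is visible in the printed counter: most proposals are rejected using only the cheap surrogate, so the expensive model is evaluated far less than once per iteration, while the second-stage correction keeps the chain invariant for the fine-model posterior.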
153

Population SAMC, ChIP-chip Data Analysis and Beyond

Wu, Mingqi December 2010 (has links)
This dissertation research consists of two topics: population stochastic approximation Monte Carlo (Pop-SAMC) for Bayesian model selection problems, and ChIP-chip data analysis. The following two paragraphs give a brief introduction to each topic. Although reversible jump MCMC (RJMCMC) has the ability to traverse the space of possible models in Bayesian model selection problems, it is prone to becoming trapped in local modes when the model space is complex. SAMC, proposed by Liang, Liu and Carroll, essentially overcomes the difficulty of dimension-jumping moves by introducing a self-adjusting mechanism. However, this learning mechanism has not yet reached its maximum efficiency. In this dissertation, we propose a Pop-SAMC algorithm; it works on a population of SAMC chains, which provides a more efficient self-adjusting mechanism and makes use of the crossover operator from genetic algorithms to further increase efficiency. Under mild conditions, the convergence of this algorithm is proved. The effectiveness of Pop-SAMC for Bayesian model selection is examined through a change-point identification example and a large-p linear regression variable selection example. The numerical results indicate that Pop-SAMC significantly outperforms both single-chain SAMC and RJMCMC. In the ChIP-chip data analysis study, we developed two methodologies to identify transcription factor binding sites: a Bayesian latent model and a population-based test. The former models the neighboring dependence of probes by introducing a latent indicator vector; the latter provides a nonparametric method for evaluating test scores in a multiple hypothesis test by making use of population information across samples. Both methods are applied to real and simulated datasets. The numerical results indicate that the Bayesian latent model can outperform existing methods, especially when the data contain outliers, and that the use of population information can significantly improve the power of multiple hypothesis tests.
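The self-adjusting mechanism at the heart of SAMC can be sketched compactly: the sample space is partitioned into energy-based subregions, and each region's log-weight is penalized whenever the chain visits it, which flattens the energy landscape and helps the sampler escape local modes. The code below is a single-chain SAMC on a toy bimodal density (the population version adds parallel chains sharing the weights plus a crossover move, which is omitted here); the partition, gain schedule, and target are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_f(x):
    # two well-separated modes; plain random-walk Metropolis rarely crosses
    return np.logaddexp(-0.5 * ((x + 6) / 0.5) ** 2, -0.5 * ((x - 6) / 0.5) ** 2)

# partition the sample space into subregions by energy level -log_f(x)
edges = np.linspace(0.0, 80.0, 41)
n_reg = len(edges) - 1

def region(x):
    return int(np.clip(np.searchsorted(edges, -log_f(x)) - 1, 0, n_reg - 1))

pi = np.full(n_reg, 1.0 / n_reg)     # desired visiting frequencies
theta = np.zeros(n_reg)              # self-adjusting log-weights
x, t0 = -6.0, 5000.0
samples = []
for t in range(1, 100001):
    # MH step targeting the weighted density f(x) * exp(-theta[region(x)])
    prop = x + rng.normal(0.0, 1.0)
    log_r = (log_f(prop) - theta[region(prop)]) - (log_f(x) - theta[region(x)])
    if np.log(rng.uniform()) < log_r:
        x = prop
    # self-adjusting update: theta <- theta + gamma * (indicator - pi)
    gamma = t0 / max(t0, float(t))
    theta[region(x)] += gamma
    theta -= gamma * pi
    samples.append(x)

# the draws target the weighted density; expectations under f would be
# recovered by importance reweighting with exp(theta[region(x)])
print("range of sampled x:", round(min(samples), 1), round(max(samples), 1))
```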
154

Thermo-Hydrological-Mechanical Analysis of a Clay Barrier for Radioactive Waste Isolation: Probabilistic Calibration and Advanced Modeling

Dontha, Lakshman May 2012 (has links)
The engineered barrier system is a basic element in the design of repositories to isolate high-level radioactive waste (HLW). In this system, the clay barrier plays a prominent role in dispersing the heat generated by the waste, reducing the flow of pore water from the host rock, and maintaining the structural stability of the waste canister. The compacted expansive clay (generally bentonite blocks) is initially in an unsaturated state. During the lifetime of the repository, the barrier will undergo different coupled thermal, hydrological and mechanical (THM) phenomena due to heating (from the heat-emitting nuclear waste) and hydration (from the saturated host rock). The design of nuclear waste disposal requires the prediction of long-term barrier behavior (i.e., over hundreds or thousands of years), so numerical modeling is a basic component of the repository design. The numerical analyses are performed using a mathematical THM formulation and the associated numerical code. Constitutive models are an essential part of the numerical simulations; they represent the intrinsic behavior of the material for each individual physical phenomenon (i.e., thermal, hydraulic and mechanical). Deterministic analyses have shown the potential of such mathematical formulations to describe the physical behavior of the engineered barrier system. However, the effect of the inherent uncertainties associated with the different constitutive models on the global behavior of the isolation system has not yet been explored. The first part of this thesis concerns the application of recent probabilistic methods to understand and assess the impact of uncertainties on the global THM model response. Experimental data from the FEBEX project are adopted for the case study presented in this thesis. CODE_BRIGHT, a fully coupled THM finite element program, is used to perform the numerical THM analyses. The second part of this thesis focuses on the complex mechanical behavior observed in a barrier material subjected, over five years, to heating and hydration under actual repository conditions. The studied experiment is the (ongoing) full-scale in-situ FEBEX test at the Grimsel test site, Switzerland. A partial dismantling of this experiment has allowed inspection of the barrier material subjected to varying stresses due to hydration and heating. The clay underwent both elastic and plastic volumetric deformations at different suction and temperature levels, with changes in the preconsolidation pressure and void ratio that are difficult to explain with conventional models. In this thesis a double-structure elastoplastic model is proposed to study the mechanical behavior of this barrier material. The numerical modeling was performed with CODE_BRIGHT. The study shows that the double-structure model satisfactorily explains the observed changes in the mechanical behavior of the clay material.
155

Bayesian Inference in Large Data Problems

Quiroz, Matias January 2015 (has links)
In the last decade or so, there has been a dramatic increase in storage capacity and in the possibility of processing huge amounts of data. This has made large, high-quality data sets widely accessible to practitioners. This technological innovation seriously challenges traditional modeling and inference methodology. This thesis is devoted to developing inference and modeling tools for handling large data sets. The four included papers treat various important aspects of this topic, with special emphasis on Bayesian inference by scalable Markov chain Monte Carlo (MCMC) methods. In the first paper, we propose a novel mixture-of-experts model for longitudinal data. The model and inference methodology allow for manageable computations with a large number of subjects. The model dramatically improves out-of-sample predictive density forecasts compared to existing models. The second paper aims at developing a scalable MCMC algorithm. Ideas from the survey sampling literature are used to estimate the likelihood on a random subset of the data. The likelihood estimate is used within the pseudo-marginal MCMC framework, and we develop a theoretical framework for such algorithms based on subsets of the data. The third paper further develops the ideas introduced in the second paper. We introduce the difference estimator into this framework and modify the methods for estimating the likelihood on a random subset of data. This results in scalable inference for a wider class of models. Finally, the fourth paper brings the survey sampling tools for estimating the likelihood developed in the thesis into the delayed acceptance MCMC framework. We compare with an existing approach in the literature and document promising results for our algorithm. / At the time of the doctoral defense, the following papers were unpublished and had a status as follows: Paper 1: Submitted. Paper 2: Submitted. Paper 3: Manuscript. Paper 4: Manuscript.
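The central mechanism of the second and third papers — estimating the full-data log-likelihood from a random subset, with control variates (the difference estimator) keeping the estimator's variance manageable, and plugging the estimate into a Metropolis-Hastings ratio — can be sketched on a toy logistic regression. Everything below (the model, the subsample size, the quadratic Taylor control variates around a one-time reference fit, and the flat prior) is an assumption for illustration, not the thesis's estimator; in particular, the thesis works in a pseudo-marginal/delayed-acceptance framework with explicit bias control, which this bare sketch omits.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "tall" data set: logistic regression with large n
n, m = 50_000, 500
x = rng.normal(0, 1, n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([0.3, 0.8])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))

def loglik_terms(beta, Xs, ys):
    eta = Xs @ beta
    return ys * eta - np.logaddexp(0.0, eta)   # stable log(1 + e^eta)

# One-time pass over the full data: reference fit plus the aggregated Taylor
# (control-variate) terms used by a difference estimator.
beta_ref = np.zeros(2)
for _ in range(10):                            # a few Newton steps
    p = 1 / (1 + np.exp(-(X @ beta_ref)))
    grad = X.T @ (y - p)
    hess = -(X * (p * (1 - p))[:, None]).T @ X
    beta_ref = beta_ref - np.linalg.solve(hess, grad)
p_ref = 1 / (1 + np.exp(-(X @ beta_ref)))
L_ref = loglik_terms(beta_ref, X, y).sum()
G_ref = X.T @ (y - p_ref)
H_ref = -(X * (p_ref * (1 - p_ref))[:, None]).T @ X

def loglik_estimate(beta, idx):
    """Difference estimator: exact sum of the quadratic surrogates plus an
    expansion estimate of the surrogate residuals on a random subset."""
    d = beta - beta_ref
    q_sum = L_ref + G_ref @ d + 0.5 * d @ H_ref @ d
    Xs, ys = X[idx], y[idx]
    ps = 1 / (1 + np.exp(-(Xs @ beta_ref)))
    q_i = (loglik_terms(beta_ref, Xs, ys) + (Xs * (ys - ps)[:, None]) @ d
           - 0.5 * ((Xs @ d) ** 2) * ps * (1 - ps))
    resid = loglik_terms(beta, Xs, ys) - q_i
    return q_sum + (n / m) * resid.sum()

beta = beta_ref.copy()
ll = loglik_estimate(beta, rng.choice(n, m, replace=False))
chain = []
for _ in range(5000):
    prop = beta + 0.01 * rng.standard_normal(2)
    idx = rng.choice(n, m, replace=False)      # fresh subsample each iteration
    ll_prop = loglik_estimate(prop, idx)
    if np.log(rng.uniform()) < ll_prop - ll:   # flat prior on beta
        beta, ll = prop, ll_prop
    chain.append(beta.copy())
print("subsampled-MCMC posterior mean:", np.round(np.mean(chain[1000:], axis=0), 3))
```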
156

Particle filters and Markov chains for learning of dynamical systems

Lindsten, Fredrik January 2013 (has links)
Sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) methods provide computational tools for systematic inference and learning in complex dynamical systems, such as nonlinear and non-Gaussian state-space models. This thesis builds upon several methodological advances within these classes of Monte Carlo methods. Particular emphasis is placed on the combination of SMC and MCMC in so-called particle MCMC algorithms. These algorithms rely on SMC for generating samples from the often highly autocorrelated state trajectory. A specific particle MCMC algorithm, referred to as particle Gibbs with ancestor sampling (PGAS), is suggested. By making use of backward sampling ideas, albeit implemented in a forward-only fashion, PGAS enjoys good mixing even when using seemingly few particles in the underlying SMC sampler. This results in a computationally competitive particle MCMC algorithm. As illustrated in this thesis, PGAS is a useful tool for both Bayesian and frequentist parameter inference as well as for state smoothing. The PGAS sampler is successfully applied to the classical problem of Wiener system identification, and it is also used for inference in the challenging class of non-Markovian latent variable models. Many nonlinear models encountered in practice contain some tractable substructure. As a second problem considered in this thesis, we develop Monte Carlo methods capable of exploiting such substructures to obtain more accurate estimators than would otherwise be available. For the filtering problem, this can be done by using the well-known Rao-Blackwellized particle filter (RBPF). The RBPF is analysed in terms of asymptotic variance, resulting in an expression for the performance gain offered by Rao-Blackwellization. Furthermore, a Rao-Blackwellized particle smoother is derived, capable of addressing the smoothing problem in so-called mixed linear/nonlinear state-space models. The idea of Rao-Blackwellization is also used to develop an online algorithm for Bayesian parameter inference in nonlinear state-space models with affine parameter dependencies. / CNDM / CADICS
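The SMC ingredient that particle MCMC methods such as PGAS build on is a particle filter run on the state-space model. The sketch below is only the plain bootstrap particle filter on a standard nonlinear benchmark model (not PGAS itself, which additionally conditions on a reference trajectory and performs ancestor sampling); the model, particle count, and resampling scheme are assumptions for illustration, and the returned log-likelihood estimate is the quantity a particle MCMC algorithm would plug into its acceptance ratio.

```python
import numpy as np

rng = np.random.default_rng(4)

# Standard nonlinear benchmark state-space model:
#   x_t = 0.5 x_{t-1} + 25 x_{t-1}/(1 + x_{t-1}^2) + 8 cos(1.2 t) + v_t,  v_t ~ N(0, 10)
#   y_t = x_t^2 / 20 + e_t,                                               e_t ~ N(0, 1)
T = 100
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = (0.5 * x[t - 1] + 25 * x[t - 1] / (1 + x[t - 1] ** 2)
            + 8 * np.cos(1.2 * t) + rng.normal(0, np.sqrt(10)))
    y[t] = x[t] ** 2 / 20 + rng.normal(0, 1)

def bootstrap_pf(y, n_part=500):
    """Bootstrap SMC filter; returns filtered state means and the log-likelihood
    estimate that a particle MCMC algorithm would use in its acceptance ratio."""
    parts = rng.normal(0, 2, n_part)
    loglik, means = 0.0, []
    for t in range(1, len(y)):
        # propagate particles through the state equation
        parts = (0.5 * parts + 25 * parts / (1 + parts ** 2)
                 + 8 * np.cos(1.2 * t) + rng.normal(0, np.sqrt(10), n_part))
        # weight by the observation density
        logw = -0.5 * (y[t] - parts ** 2 / 20) ** 2 - 0.5 * np.log(2 * np.pi)
        w = np.exp(logw - logw.max())
        loglik += logw.max() + np.log(w.mean())
        # multinomial resampling (weights reset to uniform afterwards)
        parts = parts[rng.choice(n_part, n_part, p=w / w.sum())]
        means.append(parts.mean())
    return np.array(means), loglik

means, loglik = bootstrap_pf(y)
print("particle-filter log-likelihood estimate:", round(loglik, 2))
```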
157

Statistical Inference for Models with Intractable Normalizing Constants

Jin, Ick Hoon 16 December 2013 (has links)
In this dissertation, we have proposed two new algorithms for statistical inference in models with intractable normalizing constants: the Monte Carlo Metropolis-Hastings (MCMH) algorithm and the Bayesian stochastic approximation Monte Carlo (BSAMC) algorithm. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. At each iteration, it replaces the unknown normalizing constant ratio by a Monte Carlo estimate. Although the algorithm violates the detailed balance condition, it still converges, as shown in this work, to the desired target distribution under mild conditions. The BSAMC algorithm works by simulating from a sequence of approximated distributions using the SAMC algorithm. A strong law of large numbers has been established for BSAMC estimators under mild conditions. One significant advantage of our algorithms over auxiliary-variable MCMC methods is that they avoid the requirement for perfect samples, and thus can be applied to many models for which perfect sampling is unavailable or very expensive. In addition, although a normalizing constant approximation is also involved in BSAMC, BSAMC is very robust to initial guesses of the parameters, owing to the powerful sample-space exploration ability of SAMC. BSAMC also provides a general framework for approximate Bayesian inference in models whose likelihood function is intractable: sampling from a sequence of approximated distributions whose average converges to the target distribution. With these two algorithms, we have demonstrated how the SAMCMC method can be applied to estimate the parameters of exponential random graph models (ERGMs), a typical example of statistical models with intractable normalizing constants. We showed that the resulting estimator is consistent, asymptotically normal and asymptotically efficient. Compared to the MCMLE and SSA methods, a significant advantage of SAMCMC is that it overcomes the model degeneracy problem. The strength of SAMCMC comes from its varying truncation mechanism, which enables it to avoid model degeneracy through re-initialization. MCMLE and SSA do not possess this re-initialization mechanism and tend to converge to a solution near the starting point, so they often fail for models that suffer from the model degeneracy problem.
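The MCMH mechanism — replacing the intractable ratio of normalizing constants in the Metropolis-Hastings acceptance probability with a Monte Carlo estimate computed from auxiliary draws — can be shown on a toy exponential-family model where the constant is actually known in closed form, so the shortcut is easy to verify. Everything below (the model on [0, 1], the auxiliary sample size, the flat prior) is an assumption for illustration; in the dissertation the same idea is applied to models such as ERGMs, where the auxiliary draws come from a model simulation step rather than an inverse CDF.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy exponential-family model on [0, 1]: p(x | theta) = exp(theta * x) / Z(theta).
# We pretend Z(theta) is unknown and estimate the ratio Z(theta') / Z(theta)
# by Monte Carlo with auxiliary draws from the current model.
theta_true = 2.0
N = 200

def sample_model(theta, size):
    """Exact draws from p(x | theta) by inverse CDF (stands in for the auxiliary
    simulation step, e.g. a Gibbs sweep for an ERGM)."""
    u = rng.uniform(size=size)
    return np.log1p(u * np.expm1(theta)) / theta

x_data = sample_model(theta_true, N)
s_data = x_data.sum()                          # sufficient statistic

def mcmh(n_iter=10000, m_aux=500, step=0.3):
    theta = 1.0
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal()
        # Monte Carlo estimate of Z(prop) / Z(theta), using
        #   E_{x ~ p(.|theta)}[exp((prop - theta) x)] = Z(prop) / Z(theta)
        aux = sample_model(theta, m_aux)
        z_ratio_hat = np.mean(np.exp((prop - theta) * aux))
        # plug the estimate into the usual MH ratio (flat prior on theta)
        log_alpha = (prop - theta) * s_data - N * np.log(z_ratio_hat)
        if np.log(rng.uniform()) < log_alpha:
            theta = prop
        chain.append(theta)
    return np.array(chain)

chain = mcmh()
print("posterior mean of theta:", round(chain[2000:].mean(), 3))
```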
158

Automatic Markov Chain Monte Carlo Procedures for Sampling from Multivariate Distributions

Karawatzki, Roman, Leydold, Josef, Pötzelberger, Klaus January 2005 (has links) (PDF)
Generating samples from multivariate distributions efficiently is an important task in Monte Carlo integration and many other stochastic simulation problems. Markov chain Monte Carlo methods have been shown to be very efficient compared to "conventional methods", especially when many dimensions are involved. In this article we propose a Hit-and-Run sampler in combination with the Ratio-of-Uniforms method. We show that it is well suited as an algorithm for generating points from quite arbitrary distributions, including all log-concave distributions. The algorithm works automatically in the sense that only the mode (or an approximation of it) and an oracle are required, i.e., a subroutine that returns the value of the density function at any point x. We show that the number of density evaluations increases only slowly with dimension. An implementation of these algorithms in C is available from the authors at http://statmath.wu-wien.ac.at/software/hitro/. (author's abstract) / Series: Research Report Series / Department of Statistics and Mathematics
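The combination can be sketched as follows: the Ratio-of-Uniforms transform turns sampling from a density f into sampling uniformly from the region A = {(u, v): 0 < u <= sqrt(f(v/u))}, and Hit-and-Run is a natural sampler for a uniform distribution on such a region because each step only needs membership tests along a random chord. The code below is a minimal 1-D illustration, not the authors' C implementation; the bisection-based chord search and the standard normal target are assumptions, and convexity of A (which holds for log-concave f) is relied on.

```python
import numpy as np

rng = np.random.default_rng(6)

def f(x):
    """Unnormalised target density (standard normal here)."""
    return np.exp(-0.5 * x * x)

def in_region(z):
    """Membership in the ratio-of-uniforms region A = {(u, v): 0 < u <= sqrt(f(v/u))}."""
    u, v = z
    return (u > 0) and (u <= np.sqrt(f(v / u)))

def chord_end(z, d, tol=1e-8):
    """Distance from z to the boundary of A along direction d
    (A is assumed convex, which holds for log-concave f)."""
    lo, hi = 0.0, 1.0
    while in_region(z + hi * d):      # expand until we leave the region
        lo, hi = hi, 2.0 * hi
    while hi - lo > tol:              # bisection for the boundary
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if in_region(z + mid * d) else (lo, mid)
    return lo

def hitro(n_iter=10000):
    z = np.array([0.5, 0.0])          # a point inside A (u = 0.5, v = 0)
    xs = []
    for _ in range(n_iter):
        ang = rng.uniform(0, 2 * np.pi)
        d = np.array([np.cos(ang), np.sin(ang)])
        # uniform draw on the chord through z in direction d
        lam = rng.uniform(-chord_end(z, -d), chord_end(z, d))
        z = z + lam * d
        xs.append(z[1] / z[0])        # map (u, v) back to x = v / u
    return np.array(xs)

xs = hitro()
print("sample mean and variance:", round(xs.mean(), 3), round(xs.var(), 3))
```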
159

Bayesian Inference on Mixed-effects Models with Skewed Distributions for HIV Longitudinal Data

Chen, Ren 01 January 2012 (has links)
Statistical models have greatly improved our understanding of the pathogenesis of HIV-1 infection and have guided the treatment of AIDS patients and the evaluation of antiretroviral (ARV) therapies. Although various statistical modeling and analysis methods have been applied to estimate the parameters of HIV dynamics via mixed-effects models, the random errors and random effects are commonly assumed to be normally distributed. This assumption may lack robustness against departures from normality and so may lead to misleading or biased inference. Moreover, covariates such as CD4 cell count are often measured with substantial error. Bivariate clustered (correlated) data are also commonly encountered in HIV dynamics studies, and such data frequently exhibit skewness and heavy tails. In the literature, there has been considerable interest in comparing, with tractable computational methods, different proposed models of HIV dynamics, in accommodating skewness (in the univariate case) and covariate measurement errors, and in modeling skewness in multivariate outcomes observed in longitudinal studies. However, few studies address these issues simultaneously. One way to incorporate skewness is to use a more general distribution family that provides flexibility in the distributional assumptions on the random effects and random errors, producing robust parameter estimates. In this research, we developed Bayesian hierarchical models in which skewness is incorporated through the skew-elliptical (SE) distribution, and all inference is carried out through a Bayesian approach via Markov chain Monte Carlo (MCMC). Two real data sets from HIV/AIDS clinical trials were used to illustrate the proposed models and methods. This dissertation explores three topics. First, under an SE distributional assumption, we compared models with different time-varying viral decay rate functions and evaluated the effect of skewness on the model fit. The associations between the estimated decay rates from the best-fitting model and clinically related variables such as baseline HIV viral load, CD4 cell count and long-term response status were also evaluated. Second, by joint modeling via a Bayesian approach, we simultaneously addressed the issues of a skewed outcome and a covariate process with measurement errors. We also investigated how the parameter estimates change under linear, nonlinear and semiparametric mixed-effects models. Third, in order to accommodate clustering within subjects as well as the correlation between bivariate measurements such as CD4 and CD8 cell counts measured during ARV therapy, bivariate linear mixed-effects models with skewed distributions were investigated; the underlying normality assumption was extended to an SE distributional assumption. The impacts of different distributions in the SE family on the model fit were also evaluated and compared. Real data sets from AIDS clinical trial studies were used to illustrate the proposed methodologies for the three topics and to compare various candidate models with different distributional specifications. The results may be important for HIV/AIDS studies in providing guidance to better understand virologic responses to antiretroviral treatment. Although this research is motivated by HIV/AIDS studies, the basic concepts of the methods developed here have broader applications in other fields as long as the relevant technical specifications are met. In addition, the proposed methods can be easily implemented using the publicly available WinBUGS package, which makes our approach quite accessible to practicing statisticians in the field.
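The skewness mechanism used throughout — a skew-normal (the simplest member of the skew-elliptical family) in place of a normal for the random effects or errors — has a simple stochastic representation, shown below. This is a minimal simulation sketch, not the dissertation's WinBUGS models; the mixed-effects structure, parameter values, and variable names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def rskew_normal(size, loc=0.0, scale=1.0, alpha=3.0):
    """Skew-normal draws via the stochastic representation
    Z = delta*|Z0| + sqrt(1 - delta^2)*Z1 with delta = alpha / sqrt(1 + alpha^2);
    the skew-elliptical family generalises this construction to
    heavier-tailed elliptical bases."""
    delta = alpha / np.sqrt(1.0 + alpha ** 2)
    z0 = np.abs(rng.standard_normal(size))
    z1 = rng.standard_normal(size)
    return loc + scale * (delta * z0 + np.sqrt(1.0 - delta ** 2) * z1)

# Toy mixed-effects structure: subject-level random intercepts are skewed,
# observations follow a simple linear time trend (hypothetical values throughout).
n_subj, n_obs = 50, 6
b = rskew_normal(n_subj, alpha=4.0)                       # skewed random effects
t = np.tile(np.arange(n_obs), (n_subj, 1))
response = 5.0 - 0.4 * t + b[:, None] + 0.3 * rng.standard_normal((n_subj, n_obs))

skew = np.mean((b - b.mean()) ** 3) / b.std() ** 3
print("sample skewness of the random effects:", round(float(skew), 2))
```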
160

Bayesian hierarchical models for spatial count data with application to fire frequency in British Columbia

Li, Hong 16 December 2008 (has links)
This thesis develops hierarchical spatial models for the analysis of correlated and overdispersed count data based on the negative binomial distribution. Model development is motivated by a large-scale study of fire frequency in British Columbia, conducted by the Pacific Forestry Service. The main focus of our analysis is the interaction between wildfire and forest insect outbreaks; in particular, we wish to relate the frequency of wildfire to the severity of mountain pine beetle (MPB) outbreaks in the province. There is a widespread belief that forest insect outbreaks lead to an increased frequency of wildfires; however, empirical evidence to date has been limited, and a greater understanding of the association is therefore required. This is critically important, as British Columbia is currently experiencing a historically unprecedented MPB outbreak. We specify regression models for fire frequency incorporating random effects in a generalized linear mixed modeling framework. Within such a framework, both spatial correlation and extra-Poisson variation can be accommodated through random effects that are incorporated into the linear predictor of a generalized linear model. We consider a range of models, and conduct model selection and inference within the Bayesian framework, with implementation based on Markov chain Monte Carlo.
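A stripped-down version of such a model — negative binomial counts with a log link, one covariate, and a region-level random effect sampled by MCMC — is sketched below. The variable names, the exchangeable normal prior on the random effects (a conditional autoregressive prior would be the usual spatially structured choice), the fixed overdispersion and prior variance, and the crude block Metropolis updates are all assumptions for illustration, not the models fitted in the thesis.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(8)

# Toy fire-count data: counts per cell with one covariate (e.g. MPB severity)
# and a region-level random effect capturing residual spatial variation.
n_cells, n_regions = 400, 20
region = rng.integers(0, n_regions, n_cells)
mpb = rng.normal(0, 1, n_cells)
b_true = rng.normal(0, 0.5, n_regions)
mu_true = np.exp(0.5 + 0.4 * mpb + b_true[region])
r_disp = 2.0                                   # NB overdispersion (size) parameter
counts = rng.negative_binomial(r_disp, r_disp / (r_disp + mu_true))

def nb_loglik(beta, b):
    mu = np.exp(beta[0] + beta[1] * mpb + b[region])
    return np.sum(gammaln(counts + r_disp) - gammaln(r_disp) - gammaln(counts + 1)
                  + r_disp * np.log(r_disp / (r_disp + mu))
                  + counts * np.log(mu / (r_disp + mu)))

# Metropolis-within-Gibbs: update the fixed effects, then the random effects,
# with an exchangeable N(0, tau^2) prior on b; overdispersion and tau are
# held fixed here for brevity, although a full analysis would assign them priors.
beta, b, tau = np.zeros(2), np.zeros(n_regions), 0.5
ll = nb_loglik(beta, b)
for it in range(5000):
    prop = beta + 0.05 * rng.standard_normal(2)
    ll_p = nb_loglik(prop, b)
    if np.log(rng.uniform()) < ll_p - ll:
        beta, ll = prop, ll_p
    b_prop = b + 0.1 * rng.standard_normal(n_regions)
    ll_p = nb_loglik(beta, b_prop)
    log_prior = -0.5 * np.sum(b_prop ** 2 - b ** 2) / tau ** 2
    if np.log(rng.uniform()) < ll_p - ll + log_prior:
        b, ll = b_prop, ll_p
print("estimated fixed effects:", np.round(beta, 2))
```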
