221

Additive Latent Variable (ALV) Modeling: Assessing Variation in Intervention Impact in Randomized Field Trials

Toyinbo, Peter Ayo 23 October 2009 (has links)
In order to personalize or tailor treatments to maximize impact among different subgroups, there is a need to model not only the main effects of intervention but also the variation in intervention impact by baseline individual-level risk characteristics. A suitable statistical model allows researchers to answer a major research question: who benefits from, or is harmed by, this intervention program? Commonly in social and psychological research, the baseline risk may be unobservable and must be estimated from observed indicators that are measured with error; it may also have a nonlinear relationship with the outcome. Most existing nonlinear structural equation models (SEMs) developed to address such problems employ polynomial or fully parametric nonlinear functions to define the structural equations. These methods are limited because they require functional forms to be specified beforehand, and even when the models include higher-order polynomials there may be problems when the focus of interest relates to the function over its whole domain. The goal of this work is to develop a more flexible statistical modeling technique for assessing complex relationships between a proximal/distal outcome and 1) baseline characteristics measured with error, and 2) the baseline-treatment interaction, such that the shapes of these relationships are data-driven and need not be determined a priori. In the ALV model structure the nonlinear components of the regression equations are represented as a generalized additive model (GAM) or a generalized additive mixed-effects model (GAMM). Replication study results show that the ALV model estimates of the underlying relationships in the data are sufficiently close to the true pattern. The ALV modeling technique allows researchers to assess how an intervention affects individuals differently as a function of baseline risk that is itself measured with error, and to uncover complex relationships in the data that might otherwise be missed. Although the ALV approach is computationally intensive, it relieves its users from the need to decide functional forms before the model is run. It can be extended to examine complex nonlinearity between growth factors and distal outcomes in a longitudinal study.
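
As a rough illustration of the idea above (not the author's estimator), the sketch below replaces the latent-variable measurement model with a crude average of error-prone indicators and fits a ridge-penalized truncated-power spline basis in place of a full GAM/GAMM, so the estimated treatment impact can vary smoothly with the baseline risk score. All names and parameter values are hypothetical.

```python
# Sketch only: data-driven baseline-by-treatment interaction via penalized splines.
import numpy as np

rng = np.random.default_rng(0)
n = 500
true_risk = rng.normal(size=n)
indicators = true_risk[:, None] + rng.normal(scale=0.7, size=(n, 3))  # error-prone indicators
treat = rng.integers(0, 2, size=n)                                    # randomized assignment
y = np.sin(true_risk) + treat * 0.8 * np.tanh(true_risk) + rng.normal(scale=0.3, size=n)

risk_score = indicators.mean(axis=1)   # crude stand-in for the latent-variable measurement model

def tp_basis(x, knots):
    """Truncated-power spline basis: [1, x, (x - k)_+ for each knot]."""
    cols = [np.ones_like(x), x] + [np.clip(x - k, 0.0, None) for k in knots]
    return np.column_stack(cols)

knots = np.quantile(risk_score, np.linspace(0.1, 0.9, 8))
B = tp_basis(risk_score, knots)
X = np.column_stack([B, treat[:, None] * B])   # baseline smooth + treatment-by-baseline smooth
penalty = np.eye(X.shape[1])                   # ridge penalty as a crude smoother
penalty[0, 0] = penalty[1, 1] = 0.0            # leave intercept/linear baseline terms unpenalized
beta = np.linalg.solve(X.T @ X + penalty, X.T @ y)

grid = np.linspace(risk_score.min(), risk_score.max(), 5)
impact = tp_basis(grid, knots) @ beta[B.shape[1]:]   # estimated treatment impact vs. baseline risk
print(np.round(impact, 2))
```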
222

A computational framework for the solution of infinite-dimensional Bayesian statistical inverse problems with application to global seismic inversion

Martin, James Robert, Ph. D. 18 September 2015 (has links)
Quantifying uncertainties in large-scale forward and inverse PDE simulations has emerged as a central challenge facing the field of computational science and engineering. The promise of modeling and simulation for prediction, design, and control cannot be fully realized unless uncertainties in models are rigorously quantified, since this uncertainty can potentially overwhelm the computed result. While statistical inverse problems can be solved today for smaller models with a handful of uncertain parameters, this task is computationally intractable using contemporary algorithms for complex systems characterized by large-scale simulations and high-dimensional parameter spaces. In this dissertation, I address issues regarding the theoretical formulation, numerical approximation, and algorithms for solution of infinite-dimensional Bayesian statistical inverse problems, and apply the entire framework to a problem in global seismic wave propagation. Classical (deterministic) approaches to solving inverse problems attempt to recover the “best-fit” parameters that match given observation data, as measured in a particular metric. In the statistical inverse problem, we go one step further and return not only a point estimate of the best medium properties, but also a complete statistical description of the uncertain parameters. The result is a posterior probability distribution that describes our state of knowledge after learning from the available data, and provides a complete description of parameter uncertainty. In this dissertation, a computational framework for such problems is described that wraps around an existing, appropriately equipped forward solver for a given physical problem. A collection of tools, insights, and numerical methods may then be applied to solve the problem and interrogate the resulting posterior distribution, which describes our final state of knowledge. We demonstrate the framework with numerical examples, including inference of a heterogeneous compressional wavespeed field for a problem in global seismic wave propagation with 10⁶ parameters.
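
The statistical formulation is easiest to see on a toy linear forward model, where the Bayesian inverse problem has a closed-form Gaussian posterior. The sketch below covers only that toy case (the names G, m, d and all sizes are illustrative); the dissertation's framework targets PDE forward solvers and parameter spaces far too large for explicit covariance matrices.

```python
# Toy linear Bayesian inverse problem: Gaussian prior + Gaussian noise -> Gaussian posterior.
import numpy as np

rng = np.random.default_rng(1)
n_param, n_obs = 20, 10
G = rng.normal(size=(n_obs, n_param))            # discretized forward operator (illustrative)
m_true = rng.normal(size=n_param)
noise_std = 0.1
d = G @ m_true + noise_std * rng.normal(size=n_obs)

# posterior covariance = (G^T Gamma_noise^{-1} G + Gamma_prior^{-1})^{-1}
# posterior mean       = cov_post @ G^T Gamma_noise^{-1} d   (prior mean is zero here)
prior_var = 1.0
H_misfit = G.T @ G / noise_std**2                # data-misfit Hessian
cov_post = np.linalg.inv(H_misfit + np.eye(n_param) / prior_var)
m_post = cov_post @ (G.T @ d / noise_std**2)     # posterior mean, equal to the MAP point here

# pointwise posterior standard deviations quantify the remaining parameter uncertainty
print(np.round(np.sqrt(np.diag(cov_post))[:5], 3))
```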
223

Assessing the Effect of Prior Distribution Assumption on the Variance Parameters in Evaluating Bioequivalence Trials

Ujamaa, Dawud A. 02 August 2006 (has links)
Bioequivalence testing determines whether two drug formulations are essentially alike. The three kinds of bioequivalence are average, population, and individual bioequivalence, and these criteria can be evaluated using aggregate and disaggregate methods. Considerable work exists on assessing bioequivalence within a frequentist framework, but the advantages of Bayesian methods for bioequivalence have only recently been explored. Variance parameters are essential to any of these existing Bayesian bioequivalence metrics. Usually, the prior distributions for model parameters are either informative or vague. The bioequivalence inference may be sensitive to the prior distribution on the variances, and there have recently been questions about the routine use of inverse-gamma priors for variance parameters. In this paper we examine the effect that changing the prior distribution of the variance parameters has on Bayesian models for assessing bioequivalence and the carry-over effect. We explore our method with real data sets from the FDA.
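
The prior-sensitivity question can be illustrated on a toy normal model rather than the bioequivalence model used in the thesis: the sketch below compares the posterior for a variance under an inverse-gamma prior and under a half-Cauchy prior, using a small random-walk Metropolis sampler. All data and settings are hypothetical.

```python
# Toy prior-sensitivity check for a variance parameter (not the bioequivalence model).
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(loc=0.0, scale=1.5, size=15)      # deliberately small sample

def log_post(log_sigma, prior):
    sigma = np.exp(log_sigma)
    loglik = -len(y) * log_sigma - 0.5 * np.sum(y**2) / sigma**2
    if prior == "inv_gamma":                     # IG(0.001, 0.001) prior on sigma^2
        logprior = -(0.001 + 1) * 2 * log_sigma - 0.001 / sigma**2
        jacobian = 2 * log_sigma                 # transform: d(sigma^2)/d(log sigma) = 2 sigma^2
    else:                                        # half-Cauchy(1) prior on sigma
        logprior = -np.log1p(sigma**2)
        jacobian = log_sigma                     # transform: d(sigma)/d(log sigma) = sigma
    return loglik + logprior + jacobian

def metropolis(prior, n_iter=20000, step=0.3):
    draws, cur = [], 0.0
    cur_lp = log_post(cur, prior)
    for _ in range(n_iter):
        prop = cur + step * rng.normal()
        prop_lp = log_post(prop, prior)
        if np.log(rng.uniform()) < prop_lp - cur_lp:
            cur, cur_lp = prop, prop_lp
        draws.append(cur)
    return np.exp(np.array(draws[5000:]))        # discard burn-in, return sigma draws

for prior in ("inv_gamma", "half_cauchy"):
    print(prior, "posterior mean of sigma:", round(metropolis(prior).mean(), 3))
```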
224

Parameter Estimation for Nonlinear State Space Models

Wong, Jessica 23 April 2012 (has links)
This thesis explores methodology for state and, in particular, parameter estimation for time series datasets. Various approaches suitable for nonlinear models and non-Gaussian observations are investigated using state space models. The methodologies are applied to a dataset consisting of the historical lynx and hare populations, typically modeled by the Lotka-Volterra equations. With this model and the observed dataset, particle filtering and parameter estimation methods are implemented as a way to better predict the state of the system. Methods for parameter estimation considered include maximum likelihood estimation, state-augmented particle filtering, multiple iterative filtering and particle Markov chain Monte Carlo (PMCMC) methods. The specific advantages and disadvantages of each technique are discussed. In most cases, however, PMCMC is the preferred parameter estimation solution: its advantage over the other approaches is that it can closely approximate any posterior distribution from which inference can be made. / Master's thesis
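
A bootstrap particle filter, the building block underlying the filtering and PMCMC methods mentioned above, can be sketched in a few lines. The example below uses a standard univariate nonlinear benchmark model rather than the Lotka-Volterra system analysed in the thesis; all settings are illustrative.

```python
# Bootstrap particle filter on a standard nonlinear benchmark state space model.
import numpy as np

rng = np.random.default_rng(3)
T, n_particles = 50, 1000
proc_std, obs_std = np.sqrt(10.0), 1.0

def transition(x, t):
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t)

# simulate one latent trajectory and its noisy observations
x_true, y_obs = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x_true[t] = transition(x_true[t - 1], t) + proc_std * rng.normal()
    y_obs[t] = x_true[t]**2 / 20 + obs_std * rng.normal()

# propagate, weight by the observation likelihood, resample
particles = rng.normal(scale=2.0, size=n_particles)
x_filt = np.zeros(T)
for t in range(1, T):
    particles = transition(particles, t) + proc_std * rng.normal(size=n_particles)
    logw = -0.5 * ((y_obs[t] - particles**2 / 20) / obs_std) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    x_filt[t] = np.sum(w * particles)                            # filtered mean E[x_t | y_1:t]
    particles = particles[rng.choice(n_particles, size=n_particles, p=w)]  # multinomial resampling

print("mean absolute filtering error:", round(np.mean(np.abs(x_filt - x_true)), 2))
```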
225

Bayesian assessment of reliability dynamics for age-dependent systems

Iešmantas, Tomas 31 August 2011 (has links)
Age-dependent, highly reliable systems provide only small amounts of statistical information, so classical frequentist methods, which rest on asymptotic assumptions, cannot be applied. Bayesian methods, by contrast, naturally combine all sources of information (including subjective expert opinion) and do not rely on asymptotics, which makes them an attractive approach to small-sample problems in age-dependent reliability modelling. This thesis presents the Bayesian paradigm and its applicability, and develops a general methodology for analysing the problem described above. The methodology was successfully applied to two real data samples: failures in the European natural gas grid and in electrical Instrumentation and Control components used in nuclear power plants. It was concluded that the presented approach can readily handle small samples in nonlinear age-dependent models. The analysis also showed that different goodness-of-fit approaches can lead to different inferences about the models and can sometimes fail because of nonlinearities and heteroscedasticity in the data. For that reason a Bayesian posterior model-averaging procedure was applied, and it was concluded that it gives more reliable and better calibrated results than a single-model analysis. The superiority of the adaptive Metropolis algorithm over the classical Metropolis-Hastings algorithm for highly correlated parameters and nonlinear models was also validated.
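
The contrast between classical and adaptive Metropolis that the thesis validates can be illustrated on a strongly correlated two-dimensional Gaussian target. The sketch below uses Haario-style adaptation with illustrative settings only; it is not the reliability model itself.

```python
# Fixed-proposal random-walk Metropolis vs. Haario-style adaptive Metropolis (sketch).
import numpy as np

rng = np.random.default_rng(4)
target_cov = np.array([[1.0, 0.98], [0.98, 1.0]])   # strongly correlated target
target_prec = np.linalg.inv(target_cov)

def log_target(x):
    return -0.5 * x @ target_prec @ x

def run_chain(adaptive, n_iter=20000):
    x = np.zeros(2)
    lp = log_target(x)
    prop_cov = 0.1 * np.eye(2)
    chain = []
    for i in range(n_iter):
        if adaptive and i > 1000 and i % 100 == 0:
            # adapt the proposal to the scaled empirical covariance of the chain so far
            prop_cov = 2.38**2 / 2 * (np.cov(np.array(chain).T) + 1e-6 * np.eye(2))
        prop = rng.multivariate_normal(x, prop_cov)
        prop_lp = log_target(prop)
        if np.log(rng.uniform()) < prop_lp - lp:
            x, lp = prop, prop_lp
        chain.append(x)
    return np.array(chain)

for label, adapt in (("fixed-proposal MH", False), ("adaptive Metropolis", True)):
    draws = run_chain(adapt)[5000:]
    print(label, "sample variance of first coordinate:", round(draws[:, 0].var(), 2), "(target 1.0)")
```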
226

Bayesian Spatial Modeling of Complex and High Dimensional Data

Konomi, Bledar December 2011 (has links)
The main objective of this dissertation is to apply Bayesian modeling to different complex and high-dimensional spatial data sets. I develop Bayesian hierarchical spatial models for both the observed locations and the observation variable. Throughout this dissertation I carry out inference on the posterior distributions using Markov chain Monte Carlo, developing computational strategies that reduce the computational cost. I start with a "high level" image analysis by modeling the pixels with a Gaussian process and the objects with a marked point process. The proposed method is an automatic image segmentation and classification procedure which simultaneously detects the boundaries and classifies the objects in the image into one of the predetermined shape families. Next, I turn my attention to piecewise non-stationary Gaussian process models and their computational challenges for very large data sets. I simultaneously model the non-stationarity and reduce the computational cost by using the full-scale approximation technique. I successfully demonstrate the proposed reduction technique on Total Ozone Mapping Spectrometer (TOMS) data. Furthermore, I extend the reduction method for non-stationary Gaussian process models to a dynamic partition of the space by using a modified treed Gaussian model. This modification is based on the use of a non-stationary function and the full-scale approximation. The proposed model can deal with piecewise non-stationary geostatistical data with unknown partitions. Finally, I apply the method to the TOMS data to explore the non-stationary nature of the data.
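
A minimal sketch of the low-rank ("predictive process") ingredient that the full-scale approximation builds on is given below: a dense Gaussian-process covariance is approximated through a small set of knots to cut the O(n³) cost. The block/taper residual correction that completes the full-scale approximation, and the non-stationary extensions, are omitted; all settings are illustrative.

```python
# Low-rank (predictive process) approximation of a Gaussian-process covariance via knots.
import numpy as np

rng = np.random.default_rng(5)
n, m = 2000, 50                                   # observation locations vs. knots

def exp_cov(a, b, range_=0.2, sill=1.0):
    return sill * np.exp(-np.abs(a[:, None] - b[None, :]) / range_)

s = np.sort(rng.uniform(size=n))                  # 1-D locations for simplicity
knots = np.linspace(0.0, 1.0, m)

C_kk = exp_cov(knots, knots) + 1e-8 * np.eye(m)   # knot covariance (jitter for stability)
C_sk = exp_cov(s, knots)                          # cross-covariance, n x m

# low-rank approximation: C(s, s') ~= C_sk C_kk^{-1} C_ks, so only m x m factorizations are needed
L = np.linalg.cholesky(C_kk)
A = np.linalg.solve(L, C_sk.T)                    # m x n factor; C_low = A.T @ A

# compare a few entries of the dense and low-rank covariances
C_dense = exp_cov(s[:5], s[:5])
C_low = A[:, :5].T @ A[:, :5]
print(np.round(C_dense - C_low, 3))               # discrepancy a residual/taper term would absorb
```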
227

Improved Perfect Slice Sampling

Hörmann, Wolfgang, Leydold, Josef January 2003 (has links) (PDF)
Perfect slice sampling is a method to turn Markov Chain Monte Carlo (MCMC) samplers into exact generators for independent random variates. The originally proposed method is rather slow and thus several improvements have been suggested. However, two of them are erroneous. In this article we give a short introduction to perfect slice sampling, point out incorrect methods, and give a new improved version of the original algorithm. (author's abstract) / Series: Preprint Series / Department of Applied Statistics and Data Processing
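
For context, the sketch below implements ordinary univariate slice sampling with stepping-out and shrinkage; it does not reproduce the paper's contribution, which is the perfect (coupling-from-the-past) variant of the sampler. The target density is an arbitrary illustrative mixture.

```python
# Ordinary univariate slice sampling with stepping-out and shrinkage (sketch).
import numpy as np

rng = np.random.default_rng(6)

def log_density(x):                               # unnormalized target: a two-component mixture
    return np.logaddexp(-0.5 * x**2, -0.5 * (x - 3) ** 2 - 1.0)

def slice_sample(x0, n_iter=5000, w=1.0):
    x, out = x0, []
    for _ in range(n_iter):
        log_u = log_density(x) + np.log(rng.uniform())   # vertical level under the density
        left = x - w * rng.uniform()
        right = left + w
        while log_density(left) > log_u:                 # step out to the left
            left -= w
        while log_density(right) > log_u:                # step out to the right
            right += w
        while True:                                      # shrink the bracket until acceptance
            prop = rng.uniform(left, right)
            if log_density(prop) > log_u:
                x = prop
                break
            if prop < x:
                left = prop
            else:
                right = prop
        out.append(x)
    return np.array(out)

draws = slice_sample(0.0)
print("sample mean and sd:", round(draws.mean(), 2), round(draws.std(), 2))
```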
228

A Complete Framework for Modelling Workload Volatility of VoD System - a Perspective to Probabilistic Management

Roy, Shubhabrata 18 June 2014 (has links) (PDF)
There are new challenges in system administration and design when optimizing resource management for a cloud-based application. Some applications demand stringent performance requirements (e.g. delay and jitter bounds), while others exhibit bursty (volatile) workloads. This thesis proposes an epidemic-model-inspired, continuous-time Markov chain based framework that can reproduce workload volatility, namely the "buzz effect" (a sudden increase in a content's popularity), of a Video on Demand (VoD) system. Two estimation procedures (a heuristic and a Markov Chain Monte Carlo (MCMC) based approach) are also proposed to calibrate the model against workload traces. The model parameters obtained from the calibration procedures reveal some interesting properties of the model. Based on numerical simulations, the precision of both procedures has been analyzed, showing that both perform reasonably well, although the MCMC procedure outperforms the heuristic approach. This thesis also compares the proposed model with other existing models by examining the goodness-of-fit of some statistical properties of real workload traces. Finally, this work suggests a probabilistic resource provisioning approach based on a Large Deviation Principle (LDP). The LDP statistically characterizes the buzz effects that cause extreme workload volatility. The analysis exploits the information obtained from the LDP of the VoD system to define resource management policies, which may be of interest to all stakeholders in the emerging context of cloud networking.
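
The buzz-generating mechanism can be caricatured by a simple continuous-time Markov chain simulated with the Gillespie algorithm, as in the sketch below; the states and rates are illustrative and are not the model calibrated in the thesis.

```python
# Gillespie simulation of an epidemic-like CTMC producing a "buzz" in active viewers (sketch).
import numpy as np

rng = np.random.default_rng(7)
N = 500                                    # potential audience size
beta, gamma, background = 0.8, 0.5, 0.02   # word-of-mouth, churn, and exogenous arrival rates

t, active = 0.0, 10
times, counts = [t], [active]
while t < 30.0:
    rate_up = beta * active * (N - active) / N + background * (N - active)
    rate_down = gamma * active
    total = rate_up + rate_down
    t += rng.exponential(1.0 / total)      # waiting time to the next transition
    if rng.uniform() < rate_up / total:
        active += 1                        # a new viewer joins (content spreads)
    else:
        active -= 1                        # a viewer leaves
    times.append(t)
    counts.append(active)

peak = int(np.argmax(counts))
print("peak active viewers:", counts[peak], "at t ~", round(times[peak], 1))
```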
229

Bayesian Inference for Bivariate Conditional Copula Models with Continuous or Mixed Outcomes

Sabeti, Avideh 12 August 2013 (has links)
The main goal of this thesis is to develop Bayesian models for studying the influence of covariates on the dependence between random variables. Conditional copula models are flexible tools for modelling complex dependence structures. We construct Bayesian inference for the conditional copula model adapted to regression settings in which the bivariate outcome is continuous or mixed (binary and continuous) and the copula parameter varies with covariate values. The functional relationship between the copula parameter and the covariate is modelled using cubic splines. We also extend our work to additive models, which allow us to handle more than one covariate while keeping the computational burden within reasonable limits. We perform the proposed joint Bayesian inference via adaptive Markov chain Monte Carlo sampling. The deviance information criterion and the cross-validated marginal log-likelihood criterion are employed for three model selection problems: 1) choosing the copula family that best fits the data, 2) selecting the calibration function, i.e., checking whether a parametric form for the copula parameter is suitable, and 3) determining the number of independent variables in the additive model. The performance of the estimation and model selection techniques is investigated via simulations and demonstrated on two data sets: 1) Matched Multiple Birth and 2) Burn Injury. In the former, the interest is in the influence of gestational age and maternal age on twin birth weights, whereas in the latter we investigate how a patient's age affects the severity of burn injury and the probability of death.
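
The flavour of a conditional copula can be sketched with a Clayton copula whose parameter depends on a covariate through a log-linear calibration function, fit here by maximum likelihood; the thesis instead uses cubic-spline calibrations and full Bayesian inference via adaptive MCMC. All simulated values and settings below are hypothetical.

```python
# Clayton copula with a covariate-dependent parameter, fit by maximum likelihood (sketch).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
n = 1000
x = rng.uniform(-1.0, 1.0, size=n)                # covariate
theta_true = np.exp(0.3 + 1.2 * x)                # true calibration: log theta linear in x

# simulate Clayton-dependent uniforms by conditional inversion
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = (u**(-theta_true) * (w**(-theta_true / (1 + theta_true)) - 1) + 1) ** (-1 / theta_true)

def clayton_logpdf(u, v, theta):
    return (np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(u**(-theta) + v**(-theta) - 1))

def neg_loglik(coef):
    theta = np.clip(np.exp(coef[0] + coef[1] * x), 1e-3, 30.0)   # calibration on the log scale
    return -np.sum(clayton_logpdf(u, v, theta))

fit = minimize(neg_loglik, x0=np.zeros(2), method="Nelder-Mead")
print("estimated calibration coefficients:", np.round(fit.x, 2))  # should be near [0.3, 1.2]
```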
