
Chinese Basic Pension Substitution Rate: A Monte Carlo Demonstration of the Individual Account Model

Dong, Bei, Zhang, Ling, Lu, Xuan January 2008 (has links)
At the end of 2005, the State Council of China passed "The Decision on Adjusting the Individual Account of the Basic Pension System", which adjusted the individual account established under the 1997 basic pension system. In this essay, we analyze this adjustment and use life annuity actuarial theory to establish a basic pension substitution rate model. Monte Carlo simulation is then used to verify the rationality of the model. Finally, some suggestions regarding the substitution rate are put forward in light of current policy.
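The abstract does not give the model's details, but the basic mechanics of a Monte Carlo substitution-rate estimate can be sketched: accumulate individual-account contributions under stochastic investment returns, annuitize the balance at retirement, and divide the resulting monthly pension by the final monthly wage. All parameter values below (contribution rate, annuity divisor, return and wage-growth assumptions) are illustrative, not taken from the thesis.

```python
import random

def simulate_substitution_rate(years=35, monthly_wage0=1.0, wage_growth=0.05,
                               contrib_rate=0.08, mean_ret=0.03, sd_ret=0.02,
                               annuity_months=139, n_sims=2000, seed=42):
    """Monte Carlo estimate of the individual-account substitution rate:
    first monthly pension divided by the final monthly wage.
    All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        balance, wage = 0.0, monthly_wage0
        for _ in range(years):
            # annual account growth at a random return, plus a year of contributions
            balance = balance * (1.0 + rng.gauss(mean_ret, sd_ret)) + contrib_rate * wage * 12
            wage *= 1.0 + wage_growth
        monthly_pension = balance / annuity_months
        final_monthly_wage = wage / (1.0 + wage_growth)  # wage in the last working year
        total += monthly_pension / final_monthly_wage
    return total / n_sims
```

Averaging over many simulated careers gives a point estimate of the substitution rate; the same draws can also be kept individually to describe its spread under return uncertainty.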

Uncertainty Analysis in Upscaling Well Log Data by the Markov Chain Monte Carlo Method

Hwang, Kyubum 16 January 2010 (has links)
Exploring economically valuable reservoirs is expected to become more difficult, because most easily accessible reservoirs have already been developed since the beginning of seismic exploration of the subsurface. In order to efficiently analyze heterogeneous fine-scale properties in subsurface layers, one ongoing challenge is accurately upscaling fine-scale (high-frequency) logging measurements to coarse-scale data, such as surface seismic images. In addition, numerically efficient modeling cannot use models defined on the scale of log data. We therefore need an upscaling method that replaces the small-scale data with simple large-scale models. However, numerous unavoidable uncertainties still exist in the upscaling process, and these problems have been an important emphasis in geophysics for years. There are both predictable and unpredictable uncertainties in upscaling, arising from the choice of averaging method, the upscaling algorithm, the analysis of results, and so forth. To minimize these uncertainties, a Bayesian framework is a useful tool: it provides posterior information that gives a better estimate for a chosen model through a conditional probability. In addition, the likelihood in a Bayesian framework plays an important role in quantifying misfits between the measured data and the calculated parameters. Therefore, Bayesian methodology can provide a good solution for the quantification of uncertainties in upscaling. When analyzing the many uncertainties in the porosities, wave velocities, densities, and thicknesses of rocks while upscaling well log data, the Markov chain Monte Carlo (MCMC) method is a beneficial tool that uses randomly generated parameters within a Bayesian framework to produce the posterior information. The method also provides reliable model parameters for estimating the economic value of hydrocarbon reservoirs, even though log data include numerous unknown factors due to geological heterogeneity.
In this thesis, finely layered well log data from the North Sea, over a depth range of 1600 m to 1740 m, were selected for upscaling using an MCMC implementation. The results allow us to automatically identify the important depths where interfaces should be located, along with quantitative estimates of the uncertainty in the data. Specifically, interfaces in the example are required near depths of 1650 m, 1695 m, 1710 m, and 1725 m. The number and location of blocked layers can therefore be effectively quantified in spite of the uncertainties in upscaling log data.
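As an illustration of the idea (not the thesis implementation), a Metropolis sampler can infer the depth of a single interface separating two constant-property layers in a noisy log; the posterior samples of the interface depth quantify both its location and its uncertainty. The noise level `sigma` and the proposal step are assumptions.

```python
import math, random

def sample_interface(depths, log_vals, n_iter=5000, sigma=0.5, step=2.0, seed=1):
    """Metropolis sampler for the depth of a single interface separating two
    constant-property layers in a well log (an illustration, not the thesis code)."""
    rng = random.Random(seed)

    def log_like(z):
        above = [v for d, v in zip(depths, log_vals) if d < z]
        below = [v for d, v in zip(depths, log_vals) if d >= z]
        if not above or not below:
            return -math.inf  # interface must split the logged interval
        ll = 0.0
        for seg in (above, below):
            mean = sum(seg) / len(seg)
            ll -= sum((v - mean) ** 2 for v in seg) / (2.0 * sigma ** 2)
        return ll

    z = 0.5 * (depths[0] + depths[-1])
    ll = log_like(z)
    samples = []
    for _ in range(n_iter):
        z_prop = z + rng.gauss(0.0, step)
        ll_prop = log_like(z_prop)
        if math.log(rng.random()) < ll_prop - ll:  # symmetric proposal: plain Metropolis
            z, ll = z_prop, ll_prop
        samples.append(z)
    return samples
```

The spread of the retained samples around the posterior mode plays the role of the "quantitative estimate of uncertainty" described in the abstract; a multi-interface version would add a dimension per interface.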

Bayesian model-based approaches with MCMC computation to some bioinformatics problems

Bae, Kyounghwa 29 August 2005 (has links)
Bioinformatics applications can address the transfer of information at several stages of the central dogma of molecular biology, including transcription and translation. This dissertation focuses on using Bayesian models to interpret biological data in bioinformatics, with Markov chain Monte Carlo (MCMC) as the inference method. First, we use our approach to interpret data at the transcription level. We propose a two-level hierarchical Bayesian model for variable selection on cDNA microarray data. A cDNA microarray quantifies the mRNA levels of thousands of genes simultaneously in a single sample. By observing the expression patterns of genes under various treatment conditions, important clues about gene function can be obtained. We consider a multivariate Bayesian regression model and assign priors that favor sparseness in terms of the number of variables (genes) used. We introduce the use of different priors to promote different degrees of sparseness within a unified two-level hierarchical Bayesian model. Second, we apply our method to a problem at the translation level. We develop hidden Markov models of linker/non-linker sequence regions in a protein sequence, using a linker index to exploit differences in amino acid composition between regions from sequence information alone. A goal of protein structure prediction is to take an amino acid sequence (represented as a sequence of letters) and predict its tertiary structure; the identification of linker regions in a protein sequence is valuable in predicting this three-dimensional structure. Because of the complexity of both models encountered in practice, we employ MCMC, particularly Gibbs sampling (Gelfand and Smith, 1990), for parameter estimation.
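Gibbs sampling draws each parameter from its full conditional distribution in turn. A minimal self-contained example, unrelated to the dissertation's specific models, is the classic bivariate normal case, where both full conditionals are known in closed form:

```python
import random

def gibbs_bivariate_normal(rho, n_iter=10000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.
    Each full conditional is N(rho * other, 1 - rho**2)."""
    rng = random.Random(seed)
    x = y = 0.0
    xs, ys = [], []
    cond_sd = (1.0 - rho ** 2) ** 0.5
    for _ in range(n_iter):
        x = rng.gauss(rho * y, cond_sd)  # draw x | y
        y = rng.gauss(rho * x, cond_sd)  # draw y | x
        xs.append(x)
        ys.append(y)
    return xs, ys
```

The same alternating structure carries over to the dissertation's hierarchical models, with the closed-form conditionals replaced by those of the regression coefficients and hidden Markov states.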

白銀期貨的價格限制-以馬可夫鏈蒙地卡羅方法分析 / Price Limits in the Silver Futures Market: An MCMC Approach

鄭仲均 Unknown Date (has links)
In this paper, we implement the MCMC method to simulate the price of silver futures in the absence of price limits. We then compute the Value-at-Risk (VaR) using the FIGARCH model, motivated by the long-memory properties of our data, in order to evaluate the estimation results. VaR is computed under three different error distributions for the FIGARCH model, and the empirical results show that the silver futures series without price limits performs better in computing in-sample VaR.
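For context, the simplest (model-free) version of the VaR computation is the empirical lower quantile of historical returns; the FIGARCH-based VaR in the thesis replaces this with a quantile of the model's conditional return distribution. This helper is a generic sketch, not the thesis code:

```python
def empirical_var(returns, alpha=0.01):
    """Historical-simulation Value-at-Risk: the empirical lower alpha-quantile
    of the return sample, reported as a positive loss."""
    s = sorted(returns)
    k = max(0, int(alpha * len(s)) - 1)
    return -s[k]
```

Backtesting then counts how often realized losses exceed the reported VaR; a well-calibrated model at level alpha should see violations on roughly an alpha fraction of days.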

Computational Gains Via a Discretization of the Parameter Space in Individual Level Models of Infectious Disease

Fang, Xuan 13 January 2012 (has links)
The Bayesian Markov chain Monte Carlo (MCMC) approach to inference is commonly used to estimate the parameters in spatial infectious disease models. However, such MCMC analyses can pose a hefty computational burden. Here we present a new method to reduce the computing cost of such MCMC analyses and study its usefulness. The method is based around the discretization of the spatial parameters in the infectious disease model. A normal approximation of the posterior density of the output from the original model is compared to that of the modified model, using the Kullback-Leibler (KL) divergence measure.
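The comparison step can be sketched directly: for univariate normal approximations of the two posteriors, the KL divergence has a closed form. (The thesis presumably works with multivariate approximations; the univariate case shown here is the simplest illustration.)

```python
import math

def kl_normal(mu0, sd0, mu1, sd1):
    """KL(N(mu0, sd0**2) || N(mu1, sd1**2)) in closed form."""
    return math.log(sd1 / sd0) + (sd0 ** 2 + (mu0 - mu1) ** 2) / (2.0 * sd1 ** 2) - 0.5
```

A small KL value between the original and discretized posteriors is the evidence that the cheaper discretized model sacrifices little inferential accuracy.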

Continuous Model Updating and Forecasting for a Naturally Fractured Reservoir

Almohammadi, Hisham 16 December 2013 (has links)
Recent developments in instrumentation, communication and software have enabled the integration of real-time data into the decision-making process of hydrocarbon production. Applications of real-time data integration in drilling operations and horizontal-well lateral placement are becoming industry common practice. In reservoir management, the use of real-time data has been shown to be advantageous in tasks such as improving smart-well performance and in pressure-maintenance programs. Such capabilities allow for a paradigm change in which reservoir management can be viewed as a strategy enabling a semi-continuous process of model updates and decision optimizations, instead of being periodic or reactive. This is referred to as closed-loop reservoir management (CLRM). Due to the complexity of the dynamic physical processes, the large model sizes, and the huge uncertainties associated with reservoir description, continuous model updating is a large-scale problem with a high-dimensional parameter space and high computational costs. The need for an algorithm that is both feasible for practical applications and capable of generating reliable estimates of reservoir uncertainty is a key element in CLRM. This thesis investigates the validity of Markov chain Monte Carlo (MCMC) sampling used in a Bayesian framework as an uncertainty quantification and model-updating tool suitable for real-time applications. A 3-phase, dual-porosity, dual-permeability reservoir model is used in a synthetic experiment. Continuous probability density functions of cumulative oil production for two cases with different model updating frequencies and reservoir maturity levels are generated and compared to a case with a known geology, i.e., the truth case. Results show continuously narrowing ranges for cumulative oil production, with mean values approaching the truth case as model updating advances and the reservoir becomes more mature.
To deal with the sensitivity of MCMC sampling to an increasing number of observed measurements, as in real-time applications, a new formulation of the likelihood function is proposed. Changing the likelihood function significantly improved chain convergence, chain mixing, and forecast uncertainty quantification. Further, methods to validate the sampling quality and to judge the prior model for the MCMC process in real applications are recommended.
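The abstract does not state the new likelihood formulation, but the underlying problem can be illustrated: with an independent-Gaussian likelihood, the log-misfit grows linearly in the number of observations, so acceptance probabilities collapse as real-time data accumulate. One hypothetical remedy, shown only to make the issue concrete and not claimed to be the thesis's formulation, is to normalize by the number of observations:

```python
def log_like_standard(obs, pred, sigma):
    """Independent-Gaussian log-likelihood (up to a constant): the misfit term
    grows with the number of observations."""
    return -sum((o - p) ** 2 for o, p in zip(obs, pred)) / (2.0 * sigma ** 2)

def log_like_normalized(obs, pred, sigma):
    """A hypothetical normalized variant: dividing by the number of observations
    keeps the misfit on a fixed scale as data accumulate."""
    return log_like_standard(obs, pred, sigma) / len(obs)
```

With a fixed per-point residual, the standard form scales with the data count while the normalized form does not, which is why reformulating the likelihood can restore reasonable acceptance rates in long-running chains.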

Statistical Estimation of Physiologically-based Pharmacokinetic Models: Identifiability, Variation, and Uncertainty with an Illustration of Chronic Exposure to Dioxin and Dioxin-like Compounds

Thompson, Zachary John 01 January 2012 (has links)
Assessment of human exposure to environmental chemicals is inherently subject to uncertainty and variability. There are data gaps concerning the inventory, source, duration, and intensity of exposure, as well as knowledge gaps regarding pharmacokinetics in general. These gaps result in uncertainties in exposure assessment, and the uncertainties compound further with variability across populations in stage of life, lifestyle, susceptibility, and so on. Use of physiologically based pharmacokinetic (PBPK) models promises to reduce these uncertainties and to enhance extrapolation between species, between routes, from high to low dose, and from acute to chronic exposure. However, fitting PBPK models is challenging because of the large number of biochemical and physiological parameters to be estimated. Many of these model parameters are non-identifiable, in that their estimates cannot be uniquely determined using statistical criteria. In practice, some parameters are fixed in value and others are determined through mathematical calibration or computer simulation; these estimated values are subject to substantial uncertainty. The first part of this work illustrates the use of iteratively reweighted nonlinear least squares for fitting pharmacokinetic (PK) models, highlighting some common difficulties in obtaining statistical estimates of non-identifiable parameters, and uses bootstrap confidence intervals to quantify uncertainty. Statistical estimation of parameters in PBPK models is a relatively new area of research. Over the past decade or so, PBPK models have become important and valuable tools in risk assessment, as they are used to describe the absorption, distribution, metabolism, and excretion of xenobiotics in a biological system such as the human or the rat.
Because these models incorporate information on biological processes, they are well equipped to describe the kinetic behavior of chemicals and are useful for extrapolation across dose routes, between species, from high to low doses, and across exposure scenarios. A PBPK model, based on published models in the literature, has been developed to describe the absorption, distribution, metabolism, and excretion of dioxin and dioxin-like compounds (DLCs) in the rat. Data from the National Toxicology Program (NTP) two-year experiment TR-526 are used to illustrate model fitting and statistical estimation of the parameters. Integrating statistical methods into risk assessment is an efficient way to characterize the variation in parameter values. In this dissertation, a Markov chain Monte Carlo (MCMC) method is used to estimate selected parameters of the system and to describe their variation.
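A full PBPK model is far too large to reproduce here, but the flavor of the forward model being calibrated can be conveyed by its simplest relative, a one-compartment model with first-order elimination (purely a toy stand-in for the dissertation's multi-compartment model):

```python
import math

def one_compartment_conc(dose, volume, k_elim, times):
    """Concentration-time curve for a one-compartment model with first-order
    elimination: C(t) = (dose / volume) * exp(-k_elim * t)."""
    return [dose / volume * math.exp(-k_elim * t) for t in times]
```

MCMC estimation then treats parameters such as `volume` and `k_elim` as unknowns: each proposed parameter set is run through the forward model and scored against observed concentration data, exactly as the (much larger) PBPK model is scored against the TR-526 data.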

Entry and exit dynamics in the Austrian manufacturing industries

Hölzl, Werner, Soegner, Leopold January 2004 (has links) (PDF)
This article investigates the determinants of entry and exit in the Austrian manufacturing sector based on data from 1981 to 1994. We study the response of entry, exit, and other indicators of firm dynamics to changes in average plant size, size heterogeneity, concentration, incentives, and vertical integration. Applying Bayesian simulation methods, we estimate random coefficient models and study the symmetry of the determinants of entry and exit. Our empirical analysis shows that entry and exit rates are driven by the same determinants, and that the impacts of these determinants are nearly homogeneous for entry rates and exit rates alike. Moreover, we find (i) that changes in average plant size, size heterogeneity and concentration are not symmetric with respect to entry and exit, (ii) that changes in the growth of sales are weakly symmetric, and (iii) that the growth rate of employment is strongly asymmetric across industries in Austrian manufacturing. Furthermore, we infer from the data that the turnover of firms influences changes in the number of competitors: low entry rates go hand in hand with low net entry rates and low turnover. (author's abstract) / Series: Working Papers Series "Growth and Employment in Europe: Sustainability and Competitiveness"

Bayesian Analysis of Switching ARCH Models

Kaufmann, Sylvia, Frühwirth-Schnatter, Sylvia January 2000 (has links) (PDF)
We consider a time series model with autoregressive conditional heteroskedasticity that is subject to changes in regime. The regimes evolve according to a multistate latent Markov switching process with unknown transition probabilities, and it is the constant in the variance process of the innovations that is subject to regime shifts. The joint estimation of the latent process and all model parameters is performed within a Bayesian framework using Markov chain Monte Carlo simulation. We perform model selection with respect to the number of states and the number of autoregressive parameters in the variance process using Bayes factors and model likelihoods. To this end, the model likelihood is estimated by combining the candidate's formula with importance sampling. The usefulness of the sampler is demonstrated by applying it to the dataset previously used by Hamilton and Susmel, who investigated models with switching autoregressive conditional heteroskedasticity using maximum likelihood methods. The paper concludes with some issues related to maximum likelihood methods, to classical model selection, and to potential straightforward extensions of the model presented here. (author's abstract) / Series: Forschungsberichte / Institut für Statistik
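The marginal (model) likelihood estimation step can be illustrated on a toy conjugate model where the answer is exact: for y_i ~ N(theta, 1) with prior theta ~ N(0, 1), importance sampling with the posterior as proposal recovers the marginal likelihood. In this conjugate case the importance weights are constant, so the estimate is exact; the switching ARCH model needs the candidate's formula precisely because its posterior is not available in closed form.

```python
import math, random

def log_marginal_is(y, n_draws=2000, seed=3):
    """Importance-sampling estimate of the log marginal likelihood for the
    conjugate model y_i ~ N(theta, 1), theta ~ N(0, 1), using the exact
    posterior as the importance proposal."""
    rng = random.Random(seed)
    n = len(y)
    post_var = 1.0 / (n + 1.0)
    post_mean = sum(y) * post_var
    post_sd = post_var ** 0.5

    def log_norm(x, m, s):
        return -0.5 * math.log(2.0 * math.pi * s * s) - (x - m) ** 2 / (2.0 * s * s)

    logs = []
    for _ in range(n_draws):
        th = rng.gauss(post_mean, post_sd)
        log_joint = log_norm(th, 0.0, 1.0) + sum(log_norm(yi, th, 1.0) for yi in y)
        logs.append(log_joint - log_norm(th, post_mean, post_sd))
    m = max(logs)  # log-sum-exp for numerical stability
    return m + math.log(sum(math.exp(l - m) for l in logs) / n_draws)
```

Bayes factors are then ratios of such marginal likelihoods across candidate models, which is how the number of regimes and variance parameters is selected in the paper.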

Bayesian Estimation of Material Properties in Case of Correlated and Insufficient Data

Giugno, Matteo 02 October 2013 (has links)
Identification of material properties has been widely discussed in recent times, thanks to better technology availability and its application to the field of experimental mechanics. Bayesian approaches such as Markov chain Monte Carlo (MCMC) methods have been demonstrated to be reliable and suitable tools for processing data, describing probability distributions and uncertainty bounds for the investigated parameters in the absence of explicit inverse analytical expressions. Though it is necessary to repeat experiments multiple times for good estimation, this might not always be feasible due to practical limitations. This thesis therefore addresses the problem of material property estimation in the presence of correlated and insufficient data, which leads to multivariate error modeling and high instability of the sample covariance matrix. To recover from the lack of information about the true covariance, we analyze two different methodologies: first, hierarchical covariance modeling is investigated; then a method based on covariance shrinkage is employed. A numerical study comparing both approaches and employing finite element analysis within the MCMC iterations is presented, showing that the method based on covariance shrinkage is more suitable for post-processing data in the range of problems under investigation.
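Covariance shrinkage replaces the unstable sample covariance with a convex combination of it and a structured target. A minimal sketch with a scaled-identity target and a hand-chosen shrinkage intensity (a Ledoit-Wolf-style estimator would choose `delta` from the data; that choice is omitted here):

```python
def shrink_covariance(S, delta):
    """Linear shrinkage toward a scaled identity: (1 - delta) * S + delta * mu * I,
    where mu is the average diagonal element of S."""
    n = len(S)
    mu = sum(S[i][i] for i in range(n)) / n
    return [[(1.0 - delta) * S[i][j] + (delta * mu if i == j else 0.0)
             for j in range(n)] for i in range(n)]
```

Shrinking toward a well-conditioned target pulls extreme sample eigenvalues inward, which is what stabilizes the error model when the number of repeated experiments is small relative to the matrix dimension.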
