41

Chinese Basic Pension Substitution Rate: A Monte Carlo Demonstration of the Individual Account Model

Dong, Bei, Zhang, Ling, Lu, Xuan January 2008 (has links)
At the end of 2005, the State Council of China passed "The Decision on Adjusting the Individual Account of the Basic Pension System", which revised the individual account set up under the 1997 basic pension system. In this essay, we analyze that adjustment and use life annuity actuarial theory to build a model of the basic pension substitution rate. Monte Carlo simulation is then used to examine the reasonableness of the model, and some suggestions concerning the substitution rate under the current policy are put forward.
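For readers unfamiliar with this kind of calculation, the following is a minimal Python sketch of a Monte Carlo estimate of an individual-account substitution rate (the monthly pension paid from the account divided by the final monthly wage). It is not the authors' model: the contribution rate, the 139-month annuity divisor, and the wage-growth and return distributions below are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_substitution_rate(years=35, contribution_rate=0.08,
                                   months_divisor=139,
                                   wage_growth=(0.06, 0.02),
                                   account_return=(0.03, 0.02)):
        # Accumulate one simulated career's individual account balance.
        wage, balance = 1.0, 0.0          # annual wage normalised to 1 at labour-market entry
        for _ in range(years):
            balance *= 1.0 + rng.normal(*account_return)  # credited return on the account
            balance += contribution_rate * wage           # this year's contribution
            wage *= 1.0 + rng.normal(*wage_growth)        # stochastic wage growth
        monthly_pension = balance / months_divisor        # annuity conversion of the balance
        return monthly_pension / (wage / 12.0)            # substitution rate vs. final monthly wage

    rates = np.array([simulate_substitution_rate() for _ in range(10_000)])
    print(f"mean substitution rate: {rates.mean():.3f}, "
          f"5th-95th percentile: {np.percentile(rates, [5, 95]).round(3)}")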
42

Discrete Preisach Model for the Superelastic Response of Shape Memory Alloys

Doraiswamy, Srikrishna December 2010 (has links)
The aim of this work is to present a model for the superelastic response of Shape Memory Alloys (SMAs) by developing a Preisach model with a thermodynamic basis. The special features of the SMA superelastic response are useful in a variety of applications (e.g., seismic dampers and arterial stents). For example, under seismic loads the SMA dampers undergo rapid loading-unloading cycles, thus going through a number of internal hysteresis loops, which are responsible for dissipating the vibration energy. The design for such applications therefore requires the ability to predict the response, particularly the internal loops. It is thus intended to develop a model for the superelastic response which is simple, computationally fast, and can predict internal loops. The key idea here is to separate the elastic response of SMAs from the dissipative response and apply a Preisach model to the dissipative response, as opposed to the popular notion of applying the Preisach model to the stress-strain response directly. Such a separation allows better prediction of internal hysteresis, avoids issues due to flat or negative slopes in the stress-strain plot, and shows good agreement with experimental data, even when minimal input is given to the model. The model is developed from a Gibbs potential, which allows us to compute a driving force for the underlying phase transformation in the superelastic response. The hysteresis between the driving force for transformation and the extent of transformation (volume fraction of martensite) is then used with a Preisach model. The Preisach model parameters are identified using a least squares approach. The ASTM standard for the testing of NiTi wires (F2516-07ε2) is used for the identification of the parameters in the Gibbs potential. The simulations are run using MATLAB. Results under different input conditions are discussed. It is shown that the predicted response shows good agreement with the experimental data. A couple of attempts at extending the model to bending and to more complex responses of SMAs are also discussed.
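As a rough illustration of the general idea (not the identified model from the thesis), a discrete Preisach operator can be written as a weighted grid of two-state relay hysterons. The Python sketch below uses uniform, purely hypothetical weights and a generic scalar input, whereas the thesis identifies the weights by least squares and drives the operator with a thermodynamic driving force rather than the stress directly.

    import numpy as np

    class DiscretePreisach:
        # Minimal discrete Preisach operator: a weighted grid of relay hysterons
        # with 'up' threshold alpha and 'down' threshold beta <= alpha.
        def __init__(self, n=30, lo=0.0, hi=1.0):
            a, b = np.meshgrid(np.linspace(lo, hi, n), np.linspace(lo, hi, n))
            mask = b <= a                                  # admissible half-plane beta <= alpha
            self.alpha, self.beta = a[mask], b[mask]
            self.weights = np.full(self.alpha.shape, 1.0 / mask.sum())  # illustrative uniform weights
            self.state = -np.ones_like(self.alpha)         # all relays start in the 'down' state

        def apply(self, u):
            # Switch each relay according to the scalar input u; relays inside
            # their hysteresis band keep their previous state.
            self.state = np.where(u >= self.alpha, 1.0, self.state)
            self.state = np.where(u <= self.beta, -1.0, self.state)
            return float(self.weights @ self.state)        # weighted sum over all hysterons

    model = DiscretePreisach()
    drive = np.concatenate([np.linspace(0, 1, 50),         # loading
                            np.linspace(1, 0.3, 30),       # partial unloading
                            np.linspace(0.3, 0.8, 30)])    # reloading -> traces an internal loop
    response = [model.apply(u) for u in drive]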
43

Gibbs sampling's application in censored regression model and unit root test

Wu, Wei-Lun 02 September 2005 (has links)
Abstract: Generally speaking, analysis is limited when the available data are incomplete or partly hidden, and such gaps distort the resulting statistics. This thesis adopts an analysis based on Gibbs sampling to recover the hidden part of the data. When testing whether a time series contains a unit root, the simulated (recovered) series behaves similarly to the true series. Comparing unit root tests on the hidden data with tests on the recovered data, we find that the hidden data lead to a larger test size and weaker power than the recovered data. Finally, as an example, we analyze unsecured loans in the Japanese money market from January 1999 to July 2004, a period in which the loan volume was zero in several months. If we keep the zero-loan observations and apply a conventional unit root test without including a mean term in the model, the result is I(0); if we instead impute the hidden data by Gibbs sampling and apply the same test without a mean term, the result is again I(0). However, once a mean term is included in the model, the test on the observed data indicates that the Japanese money market series is I(1), and the test on the data imputed by Gibbs sampling also indicates I(1).
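The data-augmentation idea behind this kind of analysis can be sketched in a few lines. The example below is a generic Gibbs sampler for a left-censored (Tobit-type) regression with a flat prior on the coefficients; it is only meant to show how the hidden observations are imputed inside the sampler, not to reproduce the thesis's unit-root procedure or its Japanese money-market data.

    import numpy as np
    from scipy.stats import truncnorm, invgamma

    rng = np.random.default_rng(1)

    # Simulated left-censored data: observe y = max(y*, 0), with y* = X @ beta + noise.
    n, beta_true, sigma_true = 200, np.array([0.5, 1.2]), 1.0
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y_star_true = X @ beta_true + rng.normal(scale=sigma_true, size=n)
    y = np.maximum(y_star_true, 0.0)
    censored = y == 0.0

    beta, sigma2, draws = np.zeros(2), 1.0, []
    for it in range(2000):
        # 1) Impute the hidden (censored) observations from a truncated normal.
        mu_c = X[censored] @ beta
        y_star = y.copy()
        y_star[censored] = truncnorm.rvs(-np.inf, (0.0 - mu_c) / np.sqrt(sigma2),
                                         loc=mu_c, scale=np.sqrt(sigma2), random_state=rng)
        # 2) Draw beta given the completed data (flat prior assumed).
        XtX_inv = np.linalg.inv(X.T @ X)
        beta = rng.multivariate_normal(XtX_inv @ X.T @ y_star, sigma2 * XtX_inv)
        # 3) Draw the error variance from its inverse-gamma full conditional.
        resid = y_star - X @ beta
        sigma2 = invgamma.rvs(a=n / 2.0, scale=resid @ resid / 2.0, random_state=rng)
        if it >= 500:                      # discard burn-in draws
            draws.append(beta.copy())

    print("posterior mean of beta:", np.mean(draws, axis=0))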
44

Free energy functions in protein structural stability and folding kinetics /

Morozov, Alexandre V., January 2003 (has links)
Thesis (Ph. D.)--University of Washington, 2003. / Vita. Includes bibliographical references (p. 96-115).
45

A collection of Bayesian models of stochastic failure processes

Kirschenmann, Thomas Harold 06 November 2013 (has links)
Risk managers currently seek new advances in statistical methodology to better forecast and quantify uncertainty. This thesis comprises a collection of new Bayesian models and computational methods which collectively aim to better estimate parameters and predict observables when data arise from stochastic failure processes. Such data commonly arise in reliability theory and survival analysis to predict failure times of mechanical devices, compare medical treatments, and to ultimately make well-informed risk management decisions. The collection of models proposed in this thesis advances the quality of those forecasts by providing computational modeling methodology to aid quantitatively based decision makers. Through these models, a reliability expert will have the ability: to model how future decisions affect the process; to impose his prior beliefs on hazard rate shapes; to efficiently estimate parameters with MCMC methods; to incorporate exogenous information in the form of covariate data using Cox proportional hazard models; and to utilize nonparametric priors for enhanced model flexibility. Managers are often forced to make decisions that affect the underlying distribution of a stochastic process. They regularly make these choices while lacking a mathematical model for how the process may itself depend significantly on their decisions. The first model proposed in this thesis provides a method to capture this decision dependency; this is used to make an optimal decision policy in the future, utilizing the interactions of the sequences of decisions. The model and method in this thesis are the first to directly estimate decision dependency in a stochastic process with the flexibility and power of the Bayesian formulation. The model parameters are estimated using an efficient Markov chain Monte Carlo technique, leading to predictive probability densities for the stochastic process. Using the posterior distributions of the random parameters in the model, a stochastic optimization program is solved to determine the sequence of decisions that minimizes a cost-based objective function over a finite time horizon. The method is tested with artificial data and then used to model maintenance and failure time data from a condenser system at the South Texas Project Nuclear Operating Company (STPNOC). The second and third models proposed in this thesis offer a new way for survival analysts and reliability engineers to utilize their prior beliefs regarding the shape of hazard rate functions. Two generalizations of Weibull models have become popular recently, the exponentiated Weibull and the modified Weibull densities. The popularity of these models is largely due to the flexible hazard rate functions they can induce, such as bathtub, increasing, decreasing, and unimodal shaped hazard rates. These models are more complex than the standard Weibull, and without a Bayesian approach, one faces difficulties using traditional frequentist techniques to estimate the parameters. This thesis develops stylized families of prior distributions that should allow engineers to model their beliefs based on the context. Both models are first tested on artificial data and then compared when modeling a low pressure switch for a containment door at the STPNOC in Bay City, TX. Additionally, survival analysis is performed with these models using a well-known collection of censored data on leukemia treatments.
Two additional models are developed using the exponentiated and modified Weibull hazard functions as a baseline distribution to implement Cox proportional hazards models, allowing survival analysts to incorporate additional covariate information. Two nonparametric methods for estimating survival functions are compared using both simulated and real data from cancer treatment research. The quantile pyramid process is compared to Polya tree priors and is shown to have a distinct advantage due to the need for choosing a distribution upon which to center a Polya tree. The Polya tree and the quantile pyramid appear to have effectively the same accuracy when the Polya tree has a very well-informed choice of centering distribution. That is rarely the case, however, and one must conclude that the quantile pyramid process is at least as effective as Polya tree priors for modeling unknown situations. / text
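To make the appeal of these densities concrete, the exponentiated Weibull simply raises the ordinary Weibull CDF to a power, and different parameter combinations change the shape of the resulting hazard. The short Python sketch below evaluates that hazard with illustrative parameter values; it is not tied to the thesis's priors, identification procedure, or data.

    import numpy as np

    def exp_weibull_hazard(t, k, lam, a):
        # Exponentiated Weibull: F(t) = (1 - exp(-(t/lam)^k))^a, hazard h = f / (1 - F).
        z = (t / lam) ** k
        base_cdf = 1.0 - np.exp(-z)                                # ordinary Weibull CDF
        f = a * (k / lam) * (t / lam) ** (k - 1) * np.exp(-z) * base_cdf ** (a - 1)
        F = base_cdf ** a
        return f / (1.0 - F)

    t = np.linspace(0.01, 3.0, 300)
    bathtub_like = exp_weibull_hazard(t, k=3.0, lam=1.0, a=0.2)    # k > 1, k*a < 1: bathtub-like shape
    unimodal_like = exp_weibull_hazard(t, k=0.6, lam=1.0, a=4.0)   # k < 1, k*a > 1: unimodal shape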
46

Bayesian analysis of the heterogeneity model

Frühwirth-Schnatter, Sylvia, Tüchler, Regina, Otter, Thomas January 2002 (has links) (PDF)
In the present paper we consider Bayesian estimation of a finite mixture of models with random effects, which is also known as the heterogeneity model. First, we discuss the properties of various MCMC samplers that are obtained from full conditional Gibbs sampling by grouping and collapsing. Whereas full conditional Gibbs sampling turns out to be sensitive to the parameterization chosen for the mean structure of the model, the alternative sampler is robust in this respect. However, the logical extension of the approach to the sampling of the group variances does not further increase the efficiency of the sampler. Second, we deal with the identifiability problem due to the arbitrary labeling within the model. Finally, a case study involving metric conjoint analysis serves as a practical illustration. (author's abstract) / Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
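For orientation, the following is a minimal full conditional Gibbs sampler for a toy two-component normal mixture (known common variance, flat prior on the means, symmetric Dirichlet prior on the weights — all simplifying assumptions, not the heterogeneity model of the paper). It also illustrates the labeling problem mentioned above, since nothing in the sampler prevents the two components from swapping labels during the run.

    import numpy as np

    rng = np.random.default_rng(2)

    # Toy data from a two-component normal mixture with unit variance.
    y = np.concatenate([rng.normal(-2, 1, 150), rng.normal(2, 1, 150)])
    n, K, sigma2 = len(y), 2, 1.0

    mu = np.array([-1.0, 1.0])      # component means
    w = np.array([0.5, 0.5])        # component weights
    for it in range(1000):
        # 1) Allocation step: draw each observation's component indicator.
        logp = -0.5 * (y[:, None] - mu[None, :]) ** 2 / sigma2 + np.log(w)
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=row) for row in p])
        # 2) Weights: Dirichlet full conditional (symmetric Dirichlet(1) prior assumed).
        counts = np.bincount(z, minlength=K)
        w = rng.dirichlet(1.0 + counts)
        # 3) Means: normal full conditionals (flat prior on each mean assumed).
        for k in range(K):
            yk = y[z == k]
            mu[k] = rng.normal(yk.mean() if yk.size else 0.0,
                               np.sqrt(sigma2 / max(yk.size, 1)))

    # Without an identifiability constraint (e.g. mu[0] < mu[1]) or post-hoc relabeling,
    # the component labels are arbitrary and may switch between iterations.
    print("final draw of component means (sorted):", np.sort(mu))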
47

Propagation of Gibbsianness for infinite-dimensional gradient Brownian diffusions

Roelly, Sylvie, Dereudre, David January 2004 (has links)
We study the (strong-)Gibbsian character on R^(Z^d) of the law at time t of an infinite-dimensional gradient Brownian diffusion, when the initial distribution is Gibbsian.
48

Applied descriptive analysis of the preaching styles of three contemporary preachers

Seidler, Scott K. January 2006 (has links)
Thesis (D. Min.)--Trinity Evangelical Divinity School, 2006. / Abstract. Includes bibliographical references (leaves 175-176).
49

Computerintensive statistische Methoden : Gibbs Sampling in Regressionsmodellen / [Computer-intensive statistical methods: Gibbs sampling in regression models]

Krause, Andreas Eckhard. January 1994 (has links)
Dissertation (Staatswissenschaften)--University of Basel, 1994. / Includes index and bibliographical references.
50

Some new models for image compression

Aslam, Muhammad, January 1900 (has links)
Thesis (Ph. D.)--West Virginia University, 2006. / Title from document title page. Document formatted into pages; contains xiv, 85 p. : ill. Includes abstract. Includes bibliographical references (p. 83-85).
