281 |
Evolution on Arbitrary Fitness Landscapes when Mutation is Weak
McCandlish, David Martin January 2012 (has links)
<p>Evolutionary dynamics can be notoriously complex and difficult to analyze. In this dissertation I describe a population genetic regime where the dynamics are simple enough to allow a relatively complete and elegant treatment. Consider a haploid, asexual population, where each possible genotype has been assigned a fitness. When mutations enter a population sufficiently rarely, we can model the evolution of this population as a Markov chain where the population jumps from one genotype to another at the birth of each new mutant destined for fixation. Furthermore, if the mutation rates are assigned in such a manner that the Markov chain is reversible when all genotypes are assigned the same fitness, then it is still reversible when genotypes are assigned differing fitnesses. </p><p>The key insight is that this Markov chain can be analyzed using the spectral theory of finite-state, reversible Markov chains. I describe the spectral decomposition of the transition matrix and use it to build a general framework with which I address a variety of both classical and novel topics. These topics include a method for creating low-dimensional visualizations of fitness landscapes; a measure of how easy it is for the evolutionary process to 'find' a specific genotype or phenotype; the index of dispersion of the molecular clock and its generalizations; a definition for the neighborhood of a genotype based on evolutionary dynamics; and the expected fitness and number of substitutions that have occurred given that a population has been evolving on the fitness landscape for a given period of time. I apply these various analyses to both a simple one-codon fitness landscape and to a large neutral network derived from computational RNA secondary structure predictions.</p> / Dissertation
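The weak-mutation substitution chain described above can be sketched numerically. The snippet below is a minimal illustration, not the dissertation's construction: it assumes Moran-model fixation probabilities and an invented two-site biallelic landscape (the fitness values, population size, and mutation rate are all made up for the example), and it recovers the stationary distribution of the resulting reversible chain.

```python
import numpy as np

def fixation_prob(s, N):
    """Moran-model fixation probability of a single mutant with
    selection coefficient s in a population of size N."""
    if abs(s) < 1e-12:
        return 1.0 / N
    r = 1.0 + s
    return (1 - 1 / r) / (1 - r ** (-N))

# Toy landscape: genotypes 00, 01, 10, 11 with assigned fitnesses.
fitness = {"00": 1.00, "01": 1.02, "10": 0.98, "11": 1.05}
genos = sorted(fitness)
N, mu = 1000, 1e-6  # population size and per-site mutation rate (illustrative)

def neighbors(g):
    return [g[:i] + ("1" if g[i] == "0" else "0") + g[i + 1:] for i in range(len(g))]

# Substitution rate i -> j: (rate at which new mutants arise) x (fixation prob.).
Q = np.zeros((len(genos), len(genos)))
for i, gi in enumerate(genos):
    for gj in neighbors(gi):
        j = genos.index(gj)
        s = fitness[gj] / fitness[gi] - 1.0
        Q[i, j] = N * mu * fixation_prob(s, N)
    Q[i, i] = -Q[i].sum()

# Stationary distribution (left null vector of Q) concentrates on fit genotypes.
w, v = np.linalg.eig(Q.T)
pi = np.real(v[:, np.argmin(np.abs(w))])
pi /= pi.sum()
print(dict(zip(genos, np.round(pi, 3))))
```

With these invented numbers the chain spends almost all of its time at the fittest genotype, which is the qualitative behavior the spectral analysis above quantifies.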
|
282 |
Random Walks with Elastic and Reflective Lower Boundaries
Devore, Lucas Clay 01 December 2009 (has links)
No description available.
|
283 |
Quasi Importance Sampling
Hörmann, Wolfgang, Leydold, Josef January 2005 (has links) (PDF)
Two problems arise when the expectation of some function with respect to a nonuniform multivariate distribution has to be computed by (quasi-) Monte Carlo integration: the integrand can have singularities when the domain of the distribution is unbounded, and it can be very expensive or even impossible to sample points from a general multivariate distribution. We show that importance sampling is a simple method for overcoming both problems. (author's abstract) / Series: Preprint Series / Department of Applied Statistics and Data Processing
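The weighting idea can be sketched in a few lines. This is a plain Monte Carlo version under assumed distributions (target, proposal, and sample size are invented for the example; a quasi-Monte Carlo variant would replace the random draws with a low-discrepancy sequence mapped through the proposal's inverse CDF): the expectation under a distribution we pretend we cannot sample is computed by drawing from a heavier-tailed proposal and reweighting.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x):
    return np.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def t_pdf(x, df):
    c = math.gamma((df + 1) / 2) / (math.gamma(df / 2) * math.sqrt(df * math.pi))
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

# Target: E_f[x^2] under f = N(0,1) (true value 1), estimated using only
# draws from the heavier-tailed t(5) proposal g; the weights w = f/g stay
# bounded, so the estimator has finite variance.
n, df = 200_000, 5
x = rng.standard_t(df, size=n)
w = normal_pdf(x) / t_pdf(x, df)
est = np.mean(w * x ** 2)
print(round(est, 2))  # close to 1
```

Choosing a proposal with tails at least as heavy as the target's is what keeps the weights bounded; the reverse pairing (light-tailed proposal, heavy-tailed target) can make the estimator's variance infinite.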
|
284 |
Essays on econometric modeling of subjective perceptions of risks in environment and human health
Nguyen, To Ngoc 15 May 2009 (has links)
A large body of literature studies the option price and other ex-ante
welfare measures under microeconomic theory to value reductions of risks to
the environment and human health. However, it does not offer a careful discussion of how
to estimate risk reduction values from data, especially how to model and estimate
individual perceptions of risk within the econometric models. The central theme of
my dissertation is the set of approaches taken for the empirical estimation of probabilistic risks
under alternative assumptions about the individual perceptions of risk involved: the
objective probability, the Savage subjective probability, and subjective distributions
of probability. Each of these three risk specifications is covered in one of the
three essays.
The first essay addresses the problem of empirical estimation of individual
willingness to pay for recreation access to public land under uncertainty. In this essay I
developed an econometric model and applied it to the case of lottery-rationed hunting
permits. Empirically, the model correctly predicts the responses of
84% of the respondents in the Maine moose hunting survey.
The second essay addresses the estimation of a logit model for individual binary
choices that involve heterogeneity in subjective probabilities. For this problem, I
introduce the use of hierarchical Bayes methods to estimate, among other quantities, the
parameters of the distribution of subjective probabilities. A Monte Carlo study finds the
estimator to be asymptotically unbiased and efficient.
The third essay addresses the problem of modeling perceived mortality risks
from arsenic concentrations in drinking water. I estimated a formal model that allows for
ambiguity about risk. The empirical findings revealed that perceived risk was positively
associated with exposure levels and also related to individuating factors, in particular
smoking habits and one’s current health status. Further evidence was found that the
variance of the perceived risk distribution is non-zero.
In all, the three essays contribute methodological approaches and provide
empirical examples for developing empirical models and estimating the value of risk
reductions to the environment and human health, given the assumptions made about
individuals' perceptions of risk and, accordingly, the appropriate specifications of the
risks involved in the models.
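A toy simulation illustrates why the heterogeneity in subjective probabilities addressed in the second essay matters. This is not the dissertation's model: all numbers are invented, and the point is only that an analyst who plugs a single objective probability into a logit, when choices are actually driven by heterogeneous subjective probabilities, obtains a biased (here, attenuated) slope; a hierarchical specification of the probability distribution is one way to avoid this.

```python
import numpy as np

rng = np.random.default_rng(1)

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

# Each individual holds a *subjective* probability p_i ~ Beta(2, 2)
# (mean 0.5); binary choices follow a logit in p_i with slope beta_true.
n, beta_true = 100_000, 4.0
p = rng.beta(2.0, 2.0, size=n)
y = rng.random(n) < logistic(beta_true * p)

# An analyst who plugs in the common *objective* probability 0.5 for
# everyone fits logistic(beta * 0.5) = mean(y), so the MLE is closed-form.
p_bar = 0.5
beta_naive = np.log(y.mean() / (1 - y.mean())) / p_bar
print(round(beta_naive, 2))  # attenuated below the true slope of 4.0
```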
|
285 |
History matching and uncertainty quantification using sampling method
Ma, Xianlin 15 May 2009 (has links)
Uncertainty quantification involves sampling the reservoir parameters correctly from a
posterior probability function that is conditioned to both static and dynamic data.
Rigorous sampling methods like Markov Chain Monte Carlo (MCMC) are known to
sample from the distribution but can be computationally prohibitive for high resolution
reservoir models. Approximate sampling methods are more efficient but less rigorous for
nonlinear inverse problems. There is thus a need for an approach to uncertainty
quantification that is both efficient and rigorous for such problems.
First, we propose a two-stage MCMC approach using sensitivities for quantifying
uncertainty in history matching geological models. In the first stage, we compute the
acceptance probability for a proposed change in reservoir parameters based on a
linearized approximation to flow simulation in a small neighborhood of the previously
computed dynamic data. In the second stage, those proposals that passed a selected
criterion in the first stage are assessed by running full flow simulations, which
ensures rigor.
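The two-stage screening described above, sometimes called delayed-acceptance MCMC, can be sketched on a toy one-dimensional problem. Everything here is illustrative: the "expensive" posterior and the cheap surrogate are simple Gaussians standing in for a flow simulation and its linearized approximation. The stage-2 correction factor makes the chain exactly invariant for the true posterior even though the surrogate is deliberately wrong.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_post(m):   # "expensive" exact posterior: N(1.0, 0.5^2), unnormalized
    return -0.5 * ((m - 1.0) / 0.5) ** 2

def log_surr(m):   # cheap surrogate, deliberately offset: N(0.8, 0.6^2)
    return -0.5 * ((m - 0.8) / 0.6) ** 2

x, lsx, lpx = 0.0, log_surr(0.0), log_post(0.0)
chain, full_evals = [], 0
for _ in range(20_000):
    y = x + rng.normal(0, 0.8)          # symmetric random-walk proposal
    lsy = log_surr(y)
    # Stage 1: screen the proposal with the surrogate only (no full run).
    if np.log(rng.random()) < lsy - lsx:
        # Stage 2: run the full model; the correction (lpy-lpx)-(lsy-lsx)
        # keeps the chain invariant for the true posterior.
        lpy = log_post(y)
        full_evals += 1
        if np.log(rng.random()) < (lpy - lpx) - (lsy - lsx):
            x, lsx, lpx = y, lsy, lpy
    chain.append(x)
chain = np.array(chain[2000:])
print(round(chain.mean(), 2), full_evals)  # mean near 1.0; far fewer than 20000 full runs
```

The saving comes from stage 1: proposals the surrogate already dislikes never reach the expensive model, which is exactly how the sensitivity-based and response-surface variants above cut the number of flow simulations.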
Second, we propose a two-stage MCMC approach using response surface models for
quantifying uncertainty. The formulation allows us to history match three-phase flow
simultaneously. The constructed response surface exists independently of the expensive
flow simulation and provides efficient samples for the second-stage reservoir simulation
and MCMC.
Third, we propose a two-stage MCMC approach using upscaling and non-parametric
regressions for quantifying uncertainty. A coarse grid model acts as a surrogate for the
fine grid model by flow-based upscaling. The response correction of the coarse-scale
model is performed by error modeling via the non-parametric regression to approximate
the response of the computationally expensive fine-scale model.
Our proposed two-stage sampling approaches are computationally efficient and
rigorous with a significantly higher acceptance rate compared to traditional MCMC
algorithms.
Finally, we developed a coarsening algorithm to determine an optimal reservoir
simulation grid by grouping fine scale layers in such a way that the heterogeneity
measure of a defined static property is minimized within the layers. The optimal number
of layers is then selected based on a statistical analysis.
The power and utility of our approaches have been demonstrated using both
synthetic and field examples.
|
286 |
Thermo-Hydrological-Mechanical Analysis of a Clay Barrier for Radioactive Waste Isolation: Probabilistic Calibration and Advanced Modeling
Dontha, Lakshman 2012 May 1900 (has links)
The engineered barrier system is a basic element in the design of a repository to isolate high-level radioactive waste (HLW). In this system, the clay barrier plays a prominent role in dissipating the heat generated by the waste, reducing the flow of pore water from the host rock, and maintaining the structural stability of the waste canister. The compacted expansive clay (generally bentonite blocks) is initially in an unsaturated state. During the lifetime of the repository, the barrier will undergo different coupled thermal, hydrological, and mechanical (THM) phenomena due to heating (from the heat-emitting nuclear waste) and hydration (from the saturated host rock). The design of nuclear waste disposal requires the prediction of long-term barrier behavior (i.e., over hundreds or thousands of years), so numerical modeling is a basic component of the repository design. The numerical analyses are performed using a mathematical THM formulation and an associated numerical code. Constitutive models are an essential part of the numerical simulations; they represent the intrinsic behavior of the material for each individual physical phenomenon (i.e., thermal, hydraulic, and mechanical). Deterministic analyses have shown the potential of such mathematical formulations to describe the physical behavior of the engineered barrier system. However, the effect of the inherent uncertainties associated with the different constitutive models on the global behavior of the isolation system has not yet been explored.
The first part of this thesis concerns the application of recent probabilistic methods to understand and assess the impact of these uncertainties on the global THM model response. Experimental data from the FEBEX project have been adopted for the case study presented in this thesis. CODE_BRIGHT, a fully coupled THM finite element program, is used to perform the numerical THM analysis.
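The basic mechanics of propagating constitutive-model uncertainty can be sketched with a Monte Carlo loop. This is only a stand-in: the analytic one-dimensional heat-conduction formula below replaces the coupled THM simulator (a real study would call CODE_BRIGHT here), and the parameter values and distribution are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in forward model: steady 1-D temperature rise across a barrier of
# thickness L under heat flux q, T = T_rock + q * L / k.  In the actual
# study this would be a full coupled THM simulation.
def forward(k):
    q, L, T_rock = 1.5, 0.7, 12.0   # W/m^2, m, deg C (illustrative values)
    return T_rock + q * L / k

# Uncertain thermal conductivity k ~ lognormal around 1.0 W/(m K).
k = rng.lognormal(mean=0.0, sigma=0.15, size=5_000)
T = forward(k)
print(round(T.mean(), 1), round(T.std(), 2))  # output distribution summary
```

Repeating this for each uncertain constitutive parameter (and their combinations) is what turns a single deterministic THM run into a distribution over the global model response.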
The second part of this thesis focuses on the complex mechanical behavior observed in a barrier material subjected (during 5 years) to heating and hydration under actual repository conditions. The studied experiment is the (ongoing) full-scale in-situ FEBEX test at the Grimsel test site, Switzerland. A partial dismantling of this experiment has allowed the inspection of the barrier material subjected to varying stresses due to hydration and heating. The clay underwent both elastic and plastic volumetric deformations at different suction and temperature levels, with changes in the pre-consolidation pressure and void ratio that are difficult to explain with conventional models. In this thesis a double-structure elasto-plastic model is proposed to study the mechanical behavior of this barrier material. The numerical modeling was performed with CODE_BRIGHT. The study shows that the double-structure model satisfactorily explains the observed changes in the mechanical behavior of the clay material.
|
287 |
Energy-Efficient Tree Splitting Algorithm in Wireless Sensor Networks
Shiau, You-cheng 25 July 2007 (has links)
In this thesis, we propose a power saving strategy based on a tree splitting algorithm in wireless sensor networks with multiple packet reception. We concentrate on the case where the maximum queue size is 1. We derive both analytical and simulation results. We use Markov chain theory to analyze the evolution of the system state, and we apply renewal theory to calculate the throughput. Furthermore, we obtain the average system size, the packet blocking probability, and the average packet delay. Because the network is distributed, the full network state cannot be observed at all times. We therefore use the length of the last collision resolution cycle to predict the length of the next cycle, and determine the sleeping time from this predicted length to implement power saving. Finally, we use simulation results to show the performance of our power saving strategy.
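The system-state analysis mentioned above rests on standard Markov chain machinery, which can be sketched on a toy example. The three-state chain below is invented for illustration (it is not the thesis's model): it computes the long-run fraction of time a node spends in each state from the left eigenvector of the transition matrix, then cross-checks against a direct simulation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy system-state chain (illustrative only): a node is IDLE (0),
# TRANSMITTING (1), or SLEEPING (2), with one-step transition matrix P.
P = np.array([[0.6, 0.3, 0.1],
              [0.5, 0.2, 0.3],
              [0.4, 0.0, 0.6]])

# Stationary distribution: left eigenvector of P at eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()

# Cross-check by simulating the chain and measuring occupancy fractions.
state, counts = 0, np.zeros(3)
for _ in range(100_000):
    state = rng.choice(3, p=P[state])
    counts[state] += 1
print(np.round(pi, 3), np.round(counts / counts.sum(), 3))
```

Quantities like the fraction of time spent sleeping (and hence the energy saved) fall directly out of such a stationary distribution.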
|
288 |
A Decision Analytic Model For Early Stage Breast Cancer Patients: Lumpectomy Vs Mastectomy
Elele, Tugba 01 September 2006 (links) (PDF)
The purpose of this study was to develop a decision model for early-stage breast cancer patients. This model provides an opportunity for comparing two main treatment options, mastectomy and lumpectomy, with respect to quality of life by making use of Decision Theoretic Techniques.
A Markov chain was constructed to project the clinical history of breast carcinoma following surgery. The health states used in the model were characterized by transition probabilities and utilities for quality of life. A Multi Attribute Utility Model was developed for outcome evaluation. This study was performed on a sample population of female university students, and utilities were elicited from these healthy volunteers. The results yielded by the Multi Attribute Utility Model were validated using the Von Neumann-Morgenstern Standard Gamble technique. Finally, Monte Carlo simulation in the TreeAge Pro 2006 Suite software was used to solve the model and calculate the expected utility generated by each treatment option. The results showed that lumpectomy is more favorable for the people who participated in this study. Sensitivity analysis on the transition probabilities to the local recurrence and salvaged states was performed, and two threshold values were observed. Additionally, sensitivity analysis on the utilities showed that the model was more sensitive to the utility of the no-evidence-of-disease state; however, it was not sensitive to the utilities of the local recurrence and salvaged states.
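The structure of such a Markov decision model can be sketched as a cohort simulation. The state names echo those above, but every transition probability and utility below is invented for illustration (they are NOT the thesis's elicited values), so the printed comparison says nothing about which surgery is actually favorable.

```python
import numpy as np

# Illustrative annual transition matrices over the states
# [NED, LocalRecurrence, Salvaged, Dead] for the two surgeries.
P_lump = np.array([[0.93, 0.04, 0.00, 0.03],
                   [0.00, 0.00, 0.90, 0.10],
                   [0.00, 0.00, 0.95, 0.05],
                   [0.00, 0.00, 0.00, 1.00]])
P_mast = np.array([[0.95, 0.01, 0.00, 0.04],
                   [0.00, 0.00, 0.90, 0.10],
                   [0.00, 0.00, 0.95, 0.05],
                   [0.00, 0.00, 0.00, 1.00]])
# Quality-of-life utilities per year spent in each state (illustrative).
u_lump = np.array([0.90, 0.60, 0.75, 0.0])
u_mast = np.array([0.85, 0.60, 0.70, 0.0])

def expected_qalys(P, u, years=20):
    dist = np.array([1.0, 0.0, 0.0, 0.0])  # cohort starts disease-free
    total = 0.0
    for _ in range(years):
        total += dist @ u                  # utility accrued this cycle
        dist = dist @ P                    # advance the cohort one year
    return total

print(round(expected_qalys(P_lump, u_lump), 2),
      round(expected_qalys(P_mast, u_mast), 2))
```

A Monte Carlo version (as run in TreeAge-style software) would instead walk many individual patients through the chain and average their accumulated utilities; the cohort recursion above gives the same expectation directly.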
|
289 |
A Simulation Study On Marginalized Transition Random Effects Models For Multivariate Longitudinal Binary Data
Yalcinoz, Zerrin 01 May 2008 (links) (PDF)
In this thesis, a simulation study is conducted and a statistical model is fitted to the simulated data. The data represent the satisfaction of customers who withdraw their salaries from a particular bank. It is a longitudinal data set with a bivariate binary response, assumed to be collected from 200 individuals at four different time points. In such data sets, two types of dependence, the dependence within subject measurements and the dependence between responses, are important, and both are considered in the model. The model is the Marginalized Transition Random Effects Model, which has three levels. The first level measures the effect of covariates on the responses, the second level accounts for temporal changes, and the third level measures the differences between individuals. Markov Chain Monte Carlo methods are used for the model fit. In the simulation study, the differences between the estimated values and the true parameters are examined under two conditions: when the model is correctly specified and when it is not. Results suggest that better convergence is obtained with the full model. The third level, which captures individual changes, is more sensitive to model misspecification than the other levels of the model.
|
290 |
A Two-sided Cusum For First-order Integer-valued Autoregressive Processes Of Poisson Counts
Yontay, Petek 01 July 2011 (links) (PDF)
Count data are often encountered in manufacturing and service industries due to the ease of data collection. These counts can be useful in process monitoring to detect shifts of a process from an in-control state to various out-of-control states. It is usually assumed that the observations are independent and identically distributed. However, in practice, observations may be autocorrelated, and this may adversely affect the performance of control charts developed under the assumption of independence. In this thesis, the cumulative sum (CUSUM) control chart for monitoring autocorrelated processes of counts is investigated. To describe the autocorrelation structure of the counts, a Poisson integer-valued autoregressive model of order 1, Poisson INAR(1), is employed. Changes in the process mean in both positive and negative directions are taken into account in designing the CUSUM chart. A trivariate Markov chain approach is utilized for evaluating the performance of the chart.
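The monitored process and the chart can be sketched together. The snippet below simulates a Poisson INAR(1) series with an upward mean shift partway through and runs a two-sided CUSUM on it; all parameters (thinning coefficient, innovation rate, reference value k, decision interval h, shift size, and shift time) are invented for the example and are not the thesis's design settings.

```python
import numpy as np

rng = np.random.default_rng(5)

# Poisson INAR(1): X_t = alpha o X_{t-1} + eps_t, where "o" denotes binomial
# thinning and eps_t ~ Poisson(lam); the stationary mean is lam / (1 - alpha).
alpha, lam, n, change = 0.4, 3.0, 500, 300
mu0 = lam / (1 - alpha)                  # in-control mean = 5
x = np.empty(n, dtype=int)
prev = int(mu0)
for t in range(n):
    lam_t = lam if t < change else 1.5 * lam   # upward mean shift at t = 300
    prev = rng.binomial(prev, alpha) + rng.poisson(lam_t)
    x[t] = prev

# Two-sided CUSUM with reference value k and decision interval h.
k, h = 1.0, 10.0
cp = cm = 0.0
alarm = None
for t, xt in enumerate(x):
    cp = max(0.0, cp + (xt - mu0) - k)   # accumulates upward deviations
    cm = max(0.0, cm - (xt - mu0) - k)   # accumulates downward deviations
    if alarm is None and (cp > h or cm > h):
        alarm = t
print(alarm)  # slot index of the first CUSUM alarm
```

Note the simulated counts are autocorrelated through the thinning term, which is exactly why the chart's run-length behavior must be evaluated with a Markov chain model rather than under an i.i.d. assumption.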
|