1. Cross-validatory Model Comparison and Divergent Regions Detection using iIS and iWAIC for Disease Mapping. March 2015.
The well-documented problems associated with mapping raw rates of disease have resulted in an increased use of Bayesian hierarchical models to produce maps of "smoothed" estimates of disease rates. Two statistical problems arise in using Bayesian hierarchical models for disease mapping. The first is comparing the goodness of fit of various models, which can be used to test different hypotheses. The second is identifying outliers/divergent regions with unusually high or low residual risk of disease, or regions whose disease rates are not well fitted. The results of outlier detection may generate further hypotheses as to what additional covariates might be necessary for explaining the disease. Leave-one-out cross-validatory (LOOCV) model assessment has been used for both problems; however, actual LOOCV is time-consuming. This thesis introduces two methods, iIS and iWAIC, for approximating LOOCV using only Markov chain samples simulated from a posterior distribution based on the full data set. In iIS and iWAIC, we first integrate out the latent variables without reference to the held-out observation, then apply IS and WAIC approximations to the integrated predictive density and evaluation function. We apply iIS and iWAIC to two real data sets. Our empirical results show that iIS and iWAIC can provide significantly better estimation of LOOCV model assessment than existing methods including DIC, importance sampling, WAIC, posterior checking and Ghosting methods.
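A minimal sketch (not from the thesis) of the ordinary importance-sampling approximation to the leave-one-out predictive density that iIS refines: draws from the full-data posterior are reweighted by the reciprocal of the held-out likelihood. The Poisson likelihood and all names are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def is_loo_density(y_i, lam_draws):
    """Importance-sampling estimate of p(y_i | y_{-i}) from full-data
    posterior draws of region i's Poisson rate (assumed model)."""
    lik = stats.poisson.pmf(y_i, lam_draws)   # p(y_i | theta_s) for each posterior draw
    return 1.0 / np.mean(1.0 / lik)           # harmonic mean of the pointwise likelihoods

# toy usage: fake posterior draws of one region's rate, observed count 7
rng = np.random.default_rng(0)
lam_draws = rng.gamma(shape=8.0, scale=1.0, size=1000)
print(is_loo_density(7, lam_draws))
```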
2. On large deviations and design of efficient importance sampling algorithms. Nyquist, Pierre, January 2014.
This thesis consists of four papers, presented in Chapters 2-5, on the topics of large deviations and stochastic simulation, particularly importance sampling. The four papers make theoretical contributions to the development of a new approach for analyzing the efficiency of importance sampling algorithms by means of large deviation theory, and to the design of efficient algorithms using the subsolution approach developed by Dupuis and Wang (2007).

In the first two papers, the random output of an importance sampling algorithm is viewed as a sequence of weighted empirical measures and weighted empirical processes, respectively. The main theoretical results are a Laplace principle for the weighted empirical measures (Paper 1) and a moderate deviation result for the weighted empirical processes (Paper 2). The Laplace principle for weighted empirical measures is used to propose an alternative measure of efficiency based on the associated rate function. The moderate deviation result for weighted empirical processes is an extension of what can be seen as the empirical process version of Sanov's theorem. Together with a delta method for large deviations, established by Gao and Zhao (2011), we show moderate deviation results for importance sampling estimators of the risk measures Value-at-Risk and Expected Shortfall.

The final two papers are concerned with the design of efficient importance sampling algorithms using subsolutions of partial differential equations of Hamilton-Jacobi type (the subsolution approach). In Paper 3 we show a min-max representation of viscosity solutions of Hamilton-Jacobi equations. In particular, the representation suggests a general approach for constructing subsolutions to equations associated with terminal value problems and exit problems. Since the design of efficient importance sampling algorithms is connected to such subsolutions, the min-max representation facilitates the construction of efficient algorithms. In Paper 4 we consider the problem of constructing efficient importance sampling algorithms for a certain type of Markovian intensity model for credit risk. The min-max representation of Paper 3 is used to construct subsolutions to the associated Hamilton-Jacobi equation, and the corresponding importance sampling algorithms are investigated both theoretically and numerically. The thesis begins with an informal discussion of stochastic simulation, followed by brief mathematical introductions to large deviations and importance sampling.
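As a deliberately simple illustration of the kind of algorithm whose efficiency such large-deviation analyses quantify, the sketch below estimates a rare sample-mean probability by exponential tilting. The Gaussian model, tilting parameter, and sample sizes are illustrative assumptions, not taken from the papers.

```python
import numpy as np

def tilted_estimate(n=100, a=1.0, n_runs=10_000, seed=1):
    """Importance-sampling estimate of P(S_n / n >= a) for i.i.d. N(0, 1)
    summands, sampling under the exponentially tilted measure N(theta, 1)."""
    rng = np.random.default_rng(seed)
    theta = a                                    # for N(0, 1) the standard tilt is theta = a
    x = rng.normal(loc=theta, size=(n_runs, n))  # draws under the tilted measure
    s = x.sum(axis=1)
    # likelihood ratio dP/dQ for a whole vector: exp(-theta * S_n + n * theta^2 / 2)
    lr = np.exp(-theta * s + n * theta ** 2 / 2)
    return np.mean(lr * (s / n >= a))

print(tilted_estimate())   # crude Monte Carlo would return 0 here; the true value is ~7.6e-24
```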
3. Likelihood-based procedures for obtaining confidence intervals of disease loci with general pedigree data. Wan, Shuyan, 30 November 2006.
No description available.
4. Importance Resampling for Global Illumination. Talbot, Justin F., 16 September 2005.
This thesis develops a generalized form of Monte Carlo integration called Resampled Importance Sampling (RIS), based on the importance resampling sample-generation technique. RIS can lead to significant variance reduction over standard Monte Carlo integration for common rendering problems. We show how to select the importance resampling parameters for near-optimal variance reduction. We also combine RIS with stratification and with Multiple Importance Sampling for further variance reduction. We demonstrate the robustness of this technique on the direct lighting problem and achieve up to a 33% variance reduction over standard techniques. We also suggest using RIS as a default BRDF sampling technique.
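A hedged sketch of a one-sample resampled-importance-sampling estimator in its generic Monte Carlo form, assuming the usual construction: candidates drawn from a cheap proposal, weights from an approximate target, resampling of one candidate, and correction by the average weight. The integrand, the approximation q, and all constants are illustrative stand-ins, not the thesis's rendering code.

```python
import numpy as np

rng = np.random.default_rng(2)

f = lambda x: np.sin(np.pi * x) ** 2 * np.exp(-x)   # "expensive" integrand on [0, 1]
q = lambda x: np.sin(np.pi * x) ** 2                # cheap approximation used for resampling
p_pdf = 1.0                                         # proposal density: Uniform(0, 1)

def ris_estimate(M=32, n_samples=2000):
    est = np.empty(n_samples)
    for i in range(n_samples):
        x = rng.uniform(0.0, 1.0, size=M)           # candidates from the proposal
        w = q(x) / p_pdf                            # resampling weights q / p
        y = rng.choice(x, p=w / w.sum())            # resample one candidate in proportion to w
        est[i] = (f(y) / q(y)) * w.mean()           # one-sample RIS estimator
    return est.mean()

print(ris_estimate())   # should approach the integral of sin^2(pi x) e^{-x} over [0, 1], about 0.31
```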
5. Driving efficiency in design for rare events using metamodeling and optimization. Morrison, Paul, 8 April 2016.
Rare events have a very low probability of occurrence but can have a significant impact: earthquakes, volcanoes, and stock market crashes can devastate those affected. In industry, engineers evaluate rare events in order to design better high-reliability systems. The objective of this work is to increase the efficiency of design optimization for rare events using metamodeling and variance reduction techniques. Deterministic optimization can be made more efficient by using Design of Experiments to build an accurate metamodel of the system that is far less resource-intensive to evaluate than the real system; for computationally expensive models, running many trials impedes fast design iteration. Accurate metamodels can then be used in place of these expensive models to optimize the system probabilistically and to quantify rare-event risk efficiently. Monte Carlo simulation is traditionally used for this risk quantification, but variance reduction techniques such as importance sampling allow accurate quantification with far fewer model evaluations.

Metamodeling is the thread that ties together deterministic optimization using Design of Experiments and probabilistic optimization using Monte Carlo simulation with variance reduction. This work explores metamodeling theory and implementation and outlines a framework for efficient deterministic and probabilistic system optimization. The overall conclusion is that deterministic and probabilistic simulation can be combined through metamodeling and used to drive efficiency in design optimization.

The approach is demonstrated on a gas turbine combustion autoignition application in which user-controllable independent variables are optimized in mean and variance to maximize system performance while observing a constraint on the allowable probability of a rare autoignition event.
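A toy sketch of the workflow described above, under assumed stand-ins: a one-dimensional "expensive" model replaced by a polynomial metamodel fitted from a small Design of Experiments, followed by importance sampling with a shifted Gaussian proposal to estimate a rare failure probability. The limit-state function, threshold, and proposal are illustrative, not the thesis's combustion model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
expensive_model = lambda x: x ** 3 + x        # stand-in for a costly simulation
threshold = 40.0                              # "failure": response exceeds this value

# 1) Design of Experiments: a few expensive runs, then a cheap polynomial metamodel.
x_doe = np.linspace(-4, 4, 9)
metamodel = np.poly1d(np.polyfit(x_doe, expensive_model(x_doe), deg=3))

# 2) Importance sampling on the metamodel: input X ~ N(0, 1), proposal shifted
#    toward the (estimated) failure region, failures reweighted by the density ratio.
shift = 3.0
x_is = rng.normal(loc=shift, size=100_000)
weights = stats.norm.pdf(x_is) / stats.norm.pdf(x_is, loc=shift)
p_fail = np.mean(weights * (metamodel(x_is) > threshold))
print(p_fail)   # roughly P(X > 3.3), on the order of 4e-4 for this toy limit state
```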
6. Importance Sampling for Reinforcement Learning with Multiple Objectives. Shelton, Christian Robert, 1 August 2001.
This thesis considers three complications that arise from applying reinforcement learning to a real-world application. In the process of using reinforcement learning to build an adaptive electronic market-maker, we find that the sparsity of data, the partial observability of the domain, and the multiple objectives of the agent cause serious problems for existing reinforcement learning algorithms. We employ importance sampling (likelihood ratios) to achieve good performance in partially observable Markov decision processes with limited data. Our importance sampling estimator requires no knowledge of the environment and places few restrictions on how the data are collected. It can be used efficiently with reactive controllers, finite-state controllers, or policies with function approximation. We present theoretical analyses of the estimator and incorporate it into a reinforcement learning algorithm. Additionally, the method provides a complete return surface that can be used to balance multiple objectives dynamically. We demonstrate the need for multiple goals in a variety of applications, along with natural solutions based on our sampling method. The thesis concludes with example results from applying our algorithm to the domain of automated electronic market-making.
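A minimal sketch of the likelihood-ratio (importance sampling) idea for policy evaluation, assuming a toy bandit-style setup rather than the thesis's market-making domain: returns collected under one policy are reweighted by the product of per-step action-probability ratios to estimate the value of another policy.

```python
import numpy as np

rng = np.random.default_rng(4)

behavior = np.array([0.5, 0.5])     # action probabilities of the data-collecting policy
target   = np.array([0.2, 0.8])     # candidate policy we want to evaluate
reward_mean = np.array([0.0, 1.0])  # action 1 is better on average (toy reward model)

def off_policy_value(n_episodes=20_000, horizon=5):
    returns, weights = np.empty(n_episodes), np.empty(n_episodes)
    for i in range(n_episodes):
        a = rng.choice(2, size=horizon, p=behavior)    # actions from the behaviour policy
        r = rng.normal(reward_mean[a])                 # observed rewards
        returns[i] = r.sum()
        weights[i] = np.prod(target[a] / behavior[a])  # trajectory likelihood ratio
    return np.mean(weights * returns)                  # IS estimate of the target-policy return

print(off_policy_value())   # should be close to horizon * 0.8 = 4.0
```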
7. New Importance Sampling Densities. Hörmann, Wolfgang, January 2005.
Computing the expectation of a function with respect to a multivariate distribution by naive Monte Carlo is often not feasible. In such cases, importance sampling leads to better estimates than the rejection method. A new importance sampling distribution, the product of one-dimensional table mountain distributions with exponential tails, turns out to be flexible and useful for Bayesian integration problems. To obtain a heavy-tailed importance sampling distribution, a new radius transform for the above distribution is suggested. Together with a linear transform, the new importance sampling distributions lead to simple and fast integration algorithms with reliable error bounds. (Author's abstract. Series: Preprint Series, Department of Applied Statistics and Data Processing.)
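The table-mountain family itself is not reproduced here; the sketch below only shows the generic self-normalised importance-sampling integration that such a proposal plugs into, using an ordinary heavy-tailed Student-t proposal as a stand-in. The target, dimension, and sample size are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
d, n = 3, 50_000

# unnormalised "posterior": an off-centre multivariate normal (toy target)
log_post = lambda x: stats.multivariate_normal.logpdf(x, mean=np.ones(d))

# heavy-tailed importance density: independent Student-t with 3 degrees of freedom
x = stats.t.rvs(df=3, size=(n, d), random_state=rng)
log_q = stats.t.logpdf(x, df=3).sum(axis=1)
w = np.exp(log_post(x) - log_q)                    # unnormalised importance weights

h = x[:, 0]                                        # function whose posterior mean we want
print(np.sum(w * h) / np.sum(w))                   # self-normalised IS estimate, approx. 1.0
```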
8. Advances in Cross-Entropy Methods. Thomas Taimre, date unknown.
The cross-entropy method is an established technique for solving difficult estimation, simulation, and optimisation problems. The method has its origins in an adaptive importance sampling procedure for rare-event estimation published by R. Y. Rubinstein in 1997. In that publication, the adaptive procedure produces a parametric probability density function whose parameters minimise the variance of the associated likelihood ratio estimator. This variance minimisation can also be viewed as minimising a measure of divergence to the minimum-variance importance sampling density over all members of the parametric family in question. Soon thereafter it was realised that the same adaptive importance sampling procedure could be used to solve combinatorial optimisation problems by viewing the set of solutions to the optimisation problem as a rare event. This realisation led to the debut of the cross-entropy method in 1999, where it was introduced as a modification of the existing adaptive importance sampling procedure with a different choice of directed divergence measure, namely the Kullback-Leibler cross-entropy.

The contributions of this thesis are threefold. Firstly, in a review capacity, it provides an up-to-date consolidation of material on the cross-entropy method and its generalisations, as well as a collation of background material on importance sampling and Monte Carlo methods; the reviews are elucidated with original commentary and examples. Secondly, two new major applications of the cross-entropy methodology to optimisation problems are presented, advancing the boundary of knowledge on cross-entropy in the applied arena. Thirdly, two contributions on the methodological front are (a) an original extension of the generalised cross-entropy framework that enables one to construct state- and time-dependent importance sampling algorithms, and (b) a new algorithm for counting solutions to difficult binary-encoded problems.
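A hedged sketch of the cross-entropy method in its optimisation guise, under a toy one-dimensional objective and a Gaussian parametric family: sample from the current density, keep the elite fraction of samples, and refit the parameters to the elites (the closed-form Kullback-Leibler cross-entropy update for this family). All constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
objective = lambda x: -(x - 2.0) ** 2          # toy objective, maximised at x = 2

mu, sigma = 0.0, 5.0                           # parameters of the sampling density
for _ in range(30):
    x = rng.normal(mu, sigma, size=200)        # sample from the current Gaussian
    elite = x[np.argsort(objective(x))[-20:]]  # keep the top 10% of samples
    mu, sigma = elite.mean(), elite.std() + 1e-6   # refit parameters to the elites
print(mu)                                      # converges towards 2.0
```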
9. Study of diversity and multiplexing methods for wireless communication networks. Ντούνη, Γεωργία (Ntouni, Georgia), 21 December 2012.
The main goal of this diploma thesis is the study of diversity and multiplexing techniques employed in wireless communication systems. First, the deterministic and stochastic models of the wireless channel are reviewed. After the basic modulation techniques are presented, the Additive White Gaussian Noise (AWGN) channel is compared with the flat-fading Rayleigh channel. Space diversity, which aims to improve the performance of wireless systems in fading environments, is then studied using multiple antennas at the transmitter and/or the receiver, and basic transmission and detection techniques are analyzed. Starting from SIMO and MISO systems, with diversity only at the receiver and transmitter respectively, the thesis concludes with the study of MIMO systems, which are of the greatest interest. Finally, the theoretical results are verified through simulations that employ importance sampling as a fast simulation technique.
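A hedged illustration of importance sampling for fast bit-error-rate estimation, assuming a toy BPSK-over-AWGN setup rather than the thesis's simulator: the noise is biased toward the decision boundary and each error event is reweighted by the Gaussian likelihood ratio. The SNR, shift, and sample count are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
ebn0_db = 9.0
sigma = np.sqrt(1.0 / (2 * 10 ** (ebn0_db / 10)))   # noise std for unit-energy BPSK

n = 100_000
shift = 1.0                                          # push the noise toward the error region
noise = rng.normal(loc=-shift, scale=sigma, size=n)  # biased noise for the transmitted +1 symbol
lr = stats.norm.pdf(noise, 0, sigma) / stats.norm.pdf(noise, -shift, sigma)
ber_is = np.mean(lr * (1.0 + noise < 0))             # an error occurs when the received sample flips sign

print(ber_is, stats.norm.sf(1.0 / sigma))            # IS estimate vs. the exact value Q(1/sigma)
```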