1 |
Sampling approaches in Bayesian computational statistics with R / Sun, Wenwen 27 August 2010 (has links)
Bayesian analysis differs fundamentally from classical statistical methods. Although both make use of subjective ideas, classical methods confine them to the selection of models, whereas Bayesian models make them an explicit part of the analysis: subjective prior information is combined with the data collected, updating the prior and improving inferences. The rapid growth of Bayesian applications shows the approach becoming more and more popular, largely because the advent of computational methods (e.g., MCMC) makes sophisticated analyses feasible. The flexibility and generality of the Bayesian framework allow it to cope with very complex problems.
One big obstacle in earlier Bayesian analysis was sampling from the usually complex posterior distribution. With modern techniques and fast-growing computational capacity, we now have the tools to solve this problem.
We discuss acceptance-rejection sampling, importance sampling and then the MCMC methods. The Metropolis-Hastings algorithm, a versatile, efficient and powerful simulation technique for constructing a Markov chain, borrows the idea of the well-known acceptance-rejection sampling to generate candidates that are either accepted or rejected, but retains the current value whenever a rejection takes place (1). The Gibbs sampler is a special case of the Metropolis-Hastings algorithm. When dealing with high-dimensional problems, the Gibbs sampler does not require a carefully chosen proposal distribution: it generates the Markov chain through univariate conditional distributions, which greatly simplifies the problem. We illustrate the use of these approaches with examples (with R code) to provide a thorough review.
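The report's worked examples are in R; purely as a language-neutral illustration of the accept/reject step described above, here is a minimal random-walk Metropolis-Hastings sketch in Python. The standard-normal target, the proposal scale and all names are our own assumptions, not taken from the report.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_iter=5000, prop_sd=1.0, seed=0):
    """Random-walk Metropolis-Hastings for a one-dimensional target.

    Candidates are accepted or rejected; on rejection the chain
    retains the current value, as the abstract describes."""
    rng = np.random.default_rng(seed)
    x = x0
    chain = np.empty(n_iter)
    for i in range(n_iter):
        y = x + rng.normal(0.0, prop_sd)   # symmetric proposal
        # symmetric proposal => acceptance ratio reduces to the target ratio
        if np.log(rng.uniform()) < log_target(y) - log_target(x):
            x = y                          # accept the candidate
        chain[i] = x                       # otherwise keep the current value
    return chain

# Illustrative target: standard normal, log density up to a constant
chain = metropolis_hastings(lambda x: -0.5 * x**2, x0=0.0)
print(np.mean(chain), np.var(chain))       # should be close to 0 and 1
```

With a symmetric proposal the Hastings correction cancels, which is why only the target ratio appears in the acceptance test; an asymmetric proposal would need the full ratio.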
These basic methods have variants to deal with different situations, and they are building blocks for more advanced problems.
This report is not a tutorial for statistics or the software R. The author assumes that readers are familiar with basic statistical concepts and common R statements. If needed, detailed instructions on R programming can be found at the Comprehensive R Archive Network (CRAN): http://cran.R-project.org
|
2 |
A Bayesian approach to the job search model and its application to unemployment durations using MCMC methods / Walker, Neil Rawlinson January 1999 (has links)
No description available.
|
3 |
Parallel simulation, delayed rejection and reversible jump MCMC for object recognition / Harkness, Miles Adam January 2000 (has links)
No description available.
|
4 |
Modelling ordinal categorical data : a Gibbs sampler approach / Pang, Wan-Kai January 2000 (has links)
No description available.
|
5 |
Econometric analysis of limited dependent time series / Manrique Garcia, Aurora January 1997 (has links)
No description available.
|
6 |
Bayesian inference for non-Gaussian state space model using simulation / Pitt, Michael K. January 1997 (has links)
No description available.
|
7 |
New methods for mode jumping in Markov chain Monte Carlo algorithms / Ibrahim, Adriana Irawati Nur January 2009 (has links)
Standard Markov chain Monte Carlo (MCMC) sampling methods can have problems sampling from multi-modal distributions, and a variety of methods have been introduced to overcome this. The mode jumping method of Tjelmeland & Hegstad (2001) tries to find a mode and propose a value from that mode in each mode jumping attempt. This approach is inefficient in that the work needed to find each mode, and to model the distribution in a neighbourhood of the mode, is carried out repeatedly during the sampling process. We propose a new mode jumping approach which retains features of the Tjelmeland & Hegstad (2001) method but differs in that it finds the modes in an initial search and then uses this information to jump between modes effectively in the sampling run. Although this approach gives no second chance to find modes during the sampling run, we can show that the overall probability of missing a mode remains low. We apply our methods to distributions with continuous variables, discrete variables, a mixture of discrete and continuous variables, and variable dimension. We show that our methods work well in each case and are, in general, better than the MCMC sampling methods commonly used in these cases, and better than the Tjelmeland & Hegstad (2001) method in particular.
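The two-stage idea in the abstract — locate the modes first, then use them to move between modes during sampling — can be sketched in Python. This is a much simplified toy, not the Tjelmeland & Hegstad algorithm or the thesis's method: the bimodal target, the crude hill-climbing search and the independence sampler built on the located modes are all our own illustrative assumptions.

```python
import numpy as np

def log_mix(x):
    # Toy multi-modal target: two well-separated Gaussian modes at -5 and +5
    return np.logaddexp(-0.5 * (x + 5.0) ** 2, -0.5 * (x - 5.0) ** 2)

def find_mode(start, step=0.05, n_steps=400):
    # Crude derivative-free hill climb toward the nearest local mode
    x = start
    for _ in range(n_steps):
        if log_mix(x + step) > log_mix(x):
            x += step
        elif log_mix(x - step) > log_mix(x):
            x -= step
    return x

# Stage 1: initial mode search from scattered starting points, with dedup
modes = []
for m in (find_mode(s) for s in [-8.0, -1.0, 1.0, 8.0]):
    if not any(abs(m - n) < 1.0 for n in modes):
        modes.append(m)

# Stage 2: an independence sampler whose proposal is a Gaussian mixture
# centred on the located modes, so mode-to-mode jumps are routine moves
def log_prop(x):
    return np.logaddexp.reduce([-0.5 * (x - m) ** 2 for m in modes])

rng = np.random.default_rng(1)
x, chain = modes[0], np.empty(20000)
for i in range(chain.size):
    y = rng.choice(modes) + rng.normal()   # propose near a random mode
    # full Metropolis-Hastings ratio: target ratio times proposal ratio
    log_alpha = (log_mix(y) + log_prop(x)) - (log_mix(x) + log_prop(y))
    if np.log(rng.uniform()) < log_alpha:
        x = y
    chain[i] = x
```

Because the proposal density appears in the acceptance ratio, the sampler remains exact even though the proposal ignores the current state; the pre-search cost is paid once, echoing the thesis's criticism of repeating it on every jump attempt.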
|
8 |
A search for hep neutrinos with the Sudbury Neutrino Observatory / Howard, Christopher William 11 1900 (has links)
This thesis focuses on the search for neutrinos from the solar hep reaction using the combined three phases of Sudbury Neutrino Observatory (SNO) data. The data were taken over the years 1999–2006, totalling 1,083 days of live neutrino time.
The previously published SNO hep neutrino search was completed in 2001 and included only the first phase of data taking. That search used an event-counting approach in a single energy bin, with no energy spectral information included. This thesis uses a spectral analysis approach.
The hep neutrino search is a Bayesian analysis using Markov chain Monte Carlo (MCMC), with a Metropolis-Hastings algorithm to sample the likelihood space. The method allows us to determine the best-fit values of the parameters. This signal extraction measures the 8B flux, the atmospheric neutrino background rate in the SNO detector, and the hep flux.
This thesis describes the tests used to verify the MCMC algorithm and signal extraction. It defines the systematic uncertainties and how they were accounted for in the fit. It also shows the correlations between all of the parameters and the effect of each systematic uncertainty on the result.
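The kind of binned signal extraction the abstract describes — fitting a signal rate and a background rate to observed counts by sampling a Poisson likelihood with Metropolis-Hastings — can be illustrated with a toy model. Everything here (the five-bin spectrum, the templates, the rates and the proposal scale) is invented for illustration and has no connection to the SNO data or the thesis's actual fit.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy binned "energy spectrum": a flat background plus a peaked signal shape
signal_shape = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # normalised templates
bkg_shape = np.full(5, 0.2)
true_sig, true_bkg = 30.0, 50.0
counts = rng.poisson(true_sig * signal_shape + true_bkg * bkg_shape)

def log_post(sig, bkg):
    if sig < 0 or bkg < 0:
        return -np.inf                               # flat prior on rates >= 0
    mu = sig * signal_shape + bkg * bkg_shape        # expected counts per bin
    return float(np.sum(counts * np.log(mu) - mu))   # Poisson log-likelihood

# Metropolis-Hastings over the two rate parameters
theta = np.array([10.0, 10.0])
lp = log_post(*theta)
draws = np.empty((20000, 2))
for i in range(draws.shape[0]):
    prop = theta + rng.normal(0.0, 3.0, size=2)      # random-walk proposal
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    draws[i] = theta

sig_fit, bkg_fit = draws[5000:].mean(axis=0)         # posterior means, burn-in discarded
```

In a real analysis the systematic uncertainties the abstract mentions would enter as additional nuisance parameters sampled alongside the rates, so their effect propagates into the posterior automatically.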
The three-phase hep signal extraction was completed using only 1/3 of the full data set. With these lowered statistics, this analysis was able to place an upper limit on the hep flux of 4.2 × 10^4 cm^-2 s^-1 at the 90% confidence level. It measured a hep flux of (2.40 +1.19/-1.60) × 10^4 cm^-2 s^-1. These numbers can be compared with the previous SNO upper limit of 2.3 × 10^4 cm^-2 s^-1 (90% confidence level) and the standard solar model prediction of (7.970 ± 1.236) × 10^3 cm^-2 s^-1.
|
9 |
A search for hep neutrinos with the Sudbury Neutrino Observatory / Howard, Christopher William Unknown Date
No description available.
|
10 |
A Combined Three-Phase Signal Extraction of the Sudbury Neutrino Observatory Data Using Markov Chain Monte Carlo Technique / Habib, Shahnoor Unknown Date
No description available.
|