Because of its accuracy and ease of implementation, the Monte Carlo method is widely used in the analysis of nuclear systems. The estimated effective multiplication factor (keff) and flux distribution are statistical in nature. In eigenvalue problems, however, neutron histories are not independent but are correlated through successive generations. It is therefore necessary to ensure that only converged data are used in further analysis. Discarding a larger number of initial histories reduces the risk of contaminating the results with non-converged data, but increases the computational expense. This issue is amplified for large nuclear systems with slow convergence. One solution is to use the convergence of keff or of the flux distribution as the criterion for initiating data accumulation. Although several approaches have been developed to identify convergence, these methods are not always reliable, especially for slowly converging problems. This dissertation attacks this difficulty by developing two independent but related methodologies. One aims to provide a more reliable and robust way to assess convergence by statistically analyzing the local flux change. The other forms a basis for increasing the convergence rate and thus reducing the computational expense. Together, these two topics contribute to the ultimate goal of improving the reliability and efficiency of Monte Carlo criticality calculations.
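One widely published convergence diagnostic of the kind alluded to above (and distinct from the methodology this dissertation develops) is the Shannon entropy of the fission-source spatial distribution: the entropy of a binned source typically rises from a poorly guessed initial distribution and plateaus once the fundamental mode is reached, and generations before the plateau are discarded as inactive cycles. The following is a minimal toy sketch in Python; the 1-D slab model, the resampling rule standing in for fission-site transport, and all function names are illustrative assumptions, not the behavior of any production code.

```python
import numpy as np

def shannon_entropy(source_sites, bins=8, lo=0.0, hi=1.0):
    """Shannon entropy (bits) of a binned 1-D fission-source distribution."""
    counts, _ = np.histogram(source_sites, bins=bins, range=(lo, hi))
    p = counts / counts.sum()
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(42)

# Deliberately bad initial source guess: all sites crowded into [0, 0.2]
# of a unit slab whose fundamental mode is (for this toy) uniform.
sites = rng.uniform(0.0, 0.2, size=5000)

entropies = []
for generation in range(30):
    # Stand-in for one generation of fission-site transport: each site is
    # replaced by a draw from the fundamental mode with probability 0.1.
    mask = rng.random(sites.size) < 0.1
    sites[mask] = rng.uniform(0.0, 1.0, size=mask.sum())
    entropies.append(shannon_entropy(sites))

# The entropy climbs toward the flat-source maximum log2(8) = 3 and
# flattens; a diagnostic would start accumulating tallies at the plateau.
```

With 8 bins the entropy of a fully converged (uniform) source is log2(8) = 3 bits, so the gap between the running entropy and that ceiling gives a rough visual measure of how far the source still is from the fundamental mode in this toy model.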
Identifier | oai:union.ndltd.org:GATECH/oai:smartech.gatech.edu:1853/43738 |
Date | 12 December 2011 |
Creators | Shi, Bo |
Publisher | Georgia Institute of Technology |
Source Sets | Georgia Tech Electronic Thesis and Dissertation Archive |
Detected Language | English |
Type | Dissertation |