1. A polynomial time algorithm for prime recognition
Domingues, Riaal, 21 August 2007
Prime numbers are of the utmost importance in many applications, and in particular in cryptography. Firstly, number theory background is introduced in order to present the non-deterministic Solovay-Strassen primality test. Secondly, the deterministic primality test discovered by Agrawal, Kayal and Saxena in 2002 is presented, with the proofs following their original paper. Lastly, a remark is made about the practical application of the deterministic algorithm versus using the non-deterministic algorithms in applications.
Dissertation (MSc (Applied Mathematics)), University of Pretoria, 2007.
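As an illustration of the non-deterministic approach discussed above, here is a minimal Python sketch of the Solovay-Strassen test (a standard-textbook rendering, not code from the dissertation). Each round draws a random base a and checks Euler's criterion, a^((n-1)/2) ≡ (a/n) (mod n), where (a/n) is the Jacobi symbol; a composite n survives any single round with probability at most 1/2.

```python
import random

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0."""
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:              # factor out 2s using (2/n)
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                    # quadratic reciprocity
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0     # 0 when gcd(a, n) > 1

def solovay_strassen(n, rounds=20):
    """Return False if n is composite; True if n is probably prime."""
    if n < 2 or (n != 2 and n % 2 == 0):
        return False
    if n in (2, 3):
        return True
    for _ in range(rounds):
        a = random.randrange(2, n)
        x = jacobi(a, n)
        if x == 0 or pow(a, (n - 1) // 2, n) != x % n:
            return False               # witness found: n is composite
    return True                        # no witness in `rounds` tries

print(solovay_strassen(561))           # Carmichael number: composite
print(solovay_strassen(2**61 - 1))     # Mersenne prime
```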
2. Variability and Uncertainty in Risk Estimation for Brownfields Redevelopment
Olsen, Arne E., 31 July 2001
Various methods can be used to estimate the human health risk associated with exposure to contaminants at brownfields facilities. The deterministic method has been the standard practice, but the use of probabilistic methods is increasing. Contaminant data for non-carcinogens and carcinogens from 21 brownfields sites in Pennsylvania were collected and compiled. These were used to evaluate the performance of the deterministic method and several probabilistic methods for assessing exposure and risk in relation to variability and uncertainty in the data set and input parameters. The probabilistic methods were based (a) entirely on Monte Carlo simulated input parameter distribution functions, (b) on a combination of some of these functions and fixed parameter values, or (c) on a parameter distribution function. These methods were used to generate contaminant intake doses, defined as the 90th, 95th, or 99.9th percentile of their estimated output distribution, for the principal human exposure routes. These values were then compared with the corresponding point values estimated by the deterministic method. For all exposure routes, the probabilistic intake dose estimates, taken as the 90th and 95th percentiles of the output distribution, were not markedly different from the deterministic values or from each other. The opposite was generally the case for the estimates at the 99.9th percentile, especially for the Monte Carlo-based methods. Increasing standard deviation of the input contaminant concentration tended to produce higher intake dose estimates for all estimation methods. In pairwise comparisons with the deterministic estimates, this trend differed significantly only for the probabilistic intake doses estimated as the 99.9th percentile of the output distribution. Taken together, these results did not indicate clear and definitive advantages in using probabilistic methods over the deterministic method for exposure and risk assessment for brownfields redevelopment. They supported using the tiered system for environmental risk assessment at any particular brownfields facility.
Master of Science
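To make the comparison concrete, the Python sketch below contrasts a deterministic point estimate of a soil-ingestion intake dose, computed from fixed upper-bound inputs, with the 90th, 95th, and 99.9th percentiles of a Monte Carlo output distribution. All parameter values and distributions here are assumptions chosen for illustration; they are not the Pennsylvania site data analysed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Standard intake equation: dose = C*IR*EF*ED / (BW*AT).
# Distributions below are illustrative assumptions only.
C  = rng.lognormal(np.log(50), 0.8, N)            # soil conc., mg/kg
IR = rng.triangular(50, 100, 200, N) * 1e-6       # ingestion, kg/day
EF, ED = 250, 25                                   # days/yr, years
BW = rng.normal(70, 10, N).clip(40, 120)           # body weight, kg
AT = ED * 365                                      # averaging time, days

dose = C * IR * EF * ED / (BW * AT)                # mg/(kg*day)

# Deterministic estimate: fixed, conservative point values
det = 50 * 200e-6 * EF * ED / (70 * AT)

for p in (90, 95, 99.9):
    print(f"{p}th percentile: {np.percentile(dose, p):.2e}")
print(f"deterministic:    {det:.2e}")
```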
3. Fractal diffusion coefficients in simple dynamical systems
Knight, Georgie Samuel, January 2012
Deterministic diffusion is studied in simple, parameter-dependent dynamical systems. The diffusion coefficient is often a fractal function of the control parameter, exhibiting regions of scaling and self-similarity. Firstly, the concepts of chaos and deterministic diffusion are introduced in the context of dynamical systems, and the link between deterministic diffusion and physical diffusion is made via random walk theory. Secondly, parameter-dependent diffusion coefficients are analytically derived by solving the Taylor-Green-Kubo formula via a recursion relation for fractal 'generalised Takagi functions'. This method is applied to simple one-dimensional maps and, for the first time, worked out fully analytically. The fractal parameter dependence of the diffusion coefficient is explained via Markov partitions. Linear parameter dependence is observed, which in some cases is due to ergodicity breaking and in others to a previously unobserved phenomenon called the 'dominating-branch' effect. A numerical investigation of the two-dimensional 'sawtooth map' yields evidence for a possible fractal structure. Thirdly, different techniques for approximating the diffusion coefficient of a parameter-dependent dynamical system are compared, both for their practicability and for their capability to expose a fractal structure. Fourthly, an analytical investigation into the dependence of the diffusion coefficient on the size and position of the escape holes is undertaken. It is shown that varying the position has a strong effect on diffusion, whilst the asymptotic regime of small hole size depends on the limiting behaviour of the escape holes. Finally, a method is explored which evaluates the zeros of a system's dynamical zeta function via the weighted Milnor-Thurston kneading determinant. It is shown how to relate the diffusion coefficient to a zero of the dynamical zeta function before deriving the diffusion coefficient analytically via the kneading determinant.
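The thesis obtains D(a) exactly from the Taylor-Green-Kubo formula; as a rough numerical counterpart, the sketch below estimates the diffusion coefficient of a lifted piecewise-linear map from the ensemble mean square displacement, D = lim <(x_n - x_0)^2> / (2n). The particular map and parameter values are assumptions for illustration, and finite simulation time and floating-point chaos make this only an approximation.

```python
import numpy as np

def lifted_map(x, a):
    """One step of a piecewise-linear map of slope a on [0,1),
    lifted to the real line so that M(x + k) = M(x) + k."""
    k = np.floor(x)
    r = x - k
    return k + np.where(r < 0.5, a * r, a * r + 1.0 - a)

def diffusion_coefficient(a, n_steps=1000, n_particles=100_000, seed=1):
    """Estimate D(a) = <(x_n - x_0)^2> / (2 n) for large n."""
    rng = np.random.default_rng(seed)
    x0 = rng.random(n_particles)       # uniform ensemble in one cell
    x = x0.copy()
    for _ in range(n_steps):
        x = lifted_map(x, a)
    return np.mean((x - x0) ** 2) / (2 * n_steps)

# Scanning the slope a; D(a) varies irregularly with the parameter.
for a in (2.2, 2.5, 2.8, 3.0):
    print(f"a = {a}: D ~ {diffusion_coefficient(a):.4f}")
```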
4. A Novel Methodology for Iterative Image Reconstruction in SPECT Using Deterministic Particle Transport
Royston, Katherine, 30 April 2015
Single photon emission computed tomography (SPECT) is used in a variety of medical procedures, including myocardial perfusion, bone metabolism, and thyroid function studies. In SPECT, the emissions of a radionuclide within a patient are counted at a gamma camera to form a 2-dimensional projection of the 3-dimensional radionuclide distribution within the patient. This unknown 3-dimensional source distribution can be reconstructed from many 2-dimensional projections obtained at different angles around the patient. This reconstruction can be improved by properly modeling the physics in the patient, i.e., particle absorption and scattering. Currently, such modeling is done using statistical Monte Carlo methods, but deterministic codes have the potential to offer fast computation speeds while fully modeling particle interactions within the patient. Deterministic codes are not susceptible to statistical uncertainty, but have been overlooked for applications to nuclear medicine, most likely due to their own limitations, including discretization and large memory requirements.
A novel deterministic reconstruction methodology for SPECT (DRS) has been developed to apply the advantages of deterministic algorithms to SPECT iterative image reconstruction. Using a maximum likelihood expectation maximization (ML-EM) algorithm, a deterministic code can fully model particle transport in the patient in the forward projection step, without the need of a large system matrix. The TITAN deterministic transport code has a SPECT formulation that allows for fast simulation of SPECT projection images and has been benchmarked in this dissertation through comparison with results from the SIMIND and MCNP5 Monte Carlo codes. The TITAN SPECT formulation has been improved through a modified collimator representation and full parallelization. The DRS methodology has been implemented in the TITAN code to create TITAN with Image Reconstruction (TITAN-IR). The TITAN-IR code has been used to successfully reconstruct the source distribution from SPECT data for the Jaszczak and NCAT phantoms. Extensive studies have been conducted to examine the sensitivity of TITAN-IR image quality to deterministic parameter selection as well as collimator blur and noise in the projection data being reconstructed. The TITAN-IR reconstruction has also been compared with other reconstruction algorithms. This novel image reconstruction methodology has been shown to reconstruct images in short computation times, demonstrating its potential in a clinical setting with further development.
Ph. D.
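The generic update at the core of such reconstructions is the ML-EM iteration x <- x / (A^T 1) * A^T (y / (A x)). The Python sketch below demonstrates it on a toy random system matrix; in the DRS methodology the forward projection A x is instead computed by a TITAN deterministic transport sweep, so the explicit matrix here is purely illustrative.

```python
import numpy as np

def mlem(A, y, n_iter=100, eps=1e-12):
    """ML-EM image reconstruction with an explicit system matrix.
    A: (n_detector_bins, n_voxels), y: measured projection counts."""
    x = np.ones(A.shape[1])                 # flat initial image
    sens = A.sum(axis=0)                    # sensitivity, A^T 1
    for _ in range(n_iter):
        proj = A @ x                        # forward projection
        ratio = y / np.maximum(proj, eps)   # measured / modelled
        x *= (A.T @ ratio) / np.maximum(sens, eps)
    return x

# Toy problem: recover a known 10-voxel source from 40 noisy bins.
rng = np.random.default_rng(2)
A = rng.random((40, 10))
x_true = rng.random(10)
y = rng.poisson(100 * (A @ x_true)) / 100.0  # Poisson-noisy data
print(np.round(mlem(A, y), 2))
print(np.round(x_true, 2))
```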
5. On models for multi-user Gaussian channels with fading
El Haddad, Rony, 03 September 2009
An analytically tractable model for Gaussian multiuser channels with fading is studied, and the capacity region of this model is found to be a good approximation of the capacity region of the original Gaussian network. This work extends the existing body of work on deterministic models for Gaussian multiuser channels to include the physical phenomenon of fading. In particular, it generalizes these results to a unicast, multiple-node network setting with fading.
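To give a concrete sense of the deterministic-model idea, the sketch below compares the ergodic capacity of a single point-to-point link under assumed Rayleigh fading with the deterministic approximation max(0, log2(|h|^2 SNR)), whose pointwise gap from log2(1 + |h|^2 SNR) is at most one bit. The thesis treats multiuser networks; this single-link example is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Rayleigh fading: power gain |h|^2 is Exponential(1) distributed.
snr = 10 ** (20.0 / 10)                  # 20 dB average SNR (assumed)
gains = rng.exponential(1.0, size=100_000)

# Ergodic capacity vs. the deterministic-model approximation
# max(0, log2(g*SNR)) = log2(max(g*SNR, 1)), which drops the "+1".
c_exact = np.mean(np.log2(1.0 + gains * snr))
c_det = np.mean(np.log2(np.maximum(gains * snr, 1.0)))
print(f"ergodic capacity:    {c_exact:.3f} bits/use")
print(f"deterministic model: {c_det:.3f} bits/use (gap <= 1 bit)")
```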
6. Mathematical frameworks for the transmission dynamics of HIV on a concurrent partnership network
Parker, Christopher Gareth, January 1996
No description available.
7. Combining measurements with deterministic model outputs: predicting ground-level ozone
Liu, Zhong, 05 1900
The main topic of this thesis is how to combine model outputs from deterministic models with measurements from monitoring stations for air pollutants or other meteorological variables. We consider two different approaches to address this particular problem.
The first approach uses the Bayesian Melding (BM) model proposed by Fuentes and Raftery (2005). We implement this model and conduct several simulation studies to examine its performance in different scenarios. We also apply the melding model to the ozone data to show the importance of using the Bayesian melding model to calibrate the model outputs, that is, to adjust the model outputs for the prediction of measurements. Owing to the Bayesian framework of the melding model, we can extend it to incorporate other components such as ensemble models or reversible jump MCMC for variable selection.
However, the BM model is purely spatial, and in practice we generally have to deal with space-time datasets. This deficiency of the BM approach leads us to a second approach, an alternative to the BM model: a linear mixed model (unlike most linear mixed models, its random effects are spatially correlated) with temporally and spatially correlated residuals. We assume the spatial and temporal correlations are separable and use an AR process to model the temporal correlation. We also develop a multivariate version of this model.
Both the melding model and linear mixed model are Bayesian hierarchical models, which can better estimate the uncertainties of the estimates and predictions.
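As a sketch of the covariance structure just described (with assumed station coordinates, spatial range, and AR(1) coefficient, not the fitted model from the thesis), the following code builds a separable space-time covariance as the Kronecker product of an exponential spatial covariance and an AR(1) temporal correlation, then draws one correlated residual field.

```python
import numpy as np

def exp_cov(coords, sill=1.0, range_km=50.0):
    """Exponential spatial covariance from station coordinates."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return sill * np.exp(-d / range_km)

def ar1_corr(n_times, rho=0.6):
    """AR(1) temporal correlation matrix, rho^|i-j|."""
    idx = np.arange(n_times)
    return rho ** np.abs(idx[:, None] - idx[None, :])

rng = np.random.default_rng(4)
coords = rng.uniform(0, 100, size=(8, 2))   # 8 stations, 100 km square
Sigma_s = exp_cov(coords)
Sigma_t = ar1_corr(24)                      # 24 time points
Sigma = np.kron(Sigma_s, Sigma_t)           # separable covariance

# One draw of the space-time residual field (stations x times)
L = np.linalg.cholesky(Sigma + 1e-10 * np.eye(Sigma.shape[0]))
residuals = (L @ rng.standard_normal(Sigma.shape[0])).reshape(8, 24)
print(residuals.shape)
```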
8. Role of chance and history during evolution in Chlamydomonas reinhardtii
Lachapelle, Josianne Lyse, January 2016
The extent to which evolution is repeatable has important implications. If evolution is highly repeatable, the trajectories and outcomes of evolution in different lineages will always be the same. On the other hand, if evolution is not repeatable, then trajectories and outcomes will be diverse. Thus, the repeatability of evolution affects our understanding of the nature of biodiversity and can inform the extent to which evolutionary theory can be used to make predictions. The repeatability of evolution depends on the relative contribution of selection, chance, and history. To determine what factors affect the importance of chance and history during evolution, I propagated replicated populations of the unicellular green alga Chlamydomonas reinhardtii in controlled environments. I measured the change in fitness after a few hundred generations and determined how much variation had arisen among replicate populations and among populations with different histories. I applied a similar approach to study the importance of history in extinctions, and measured rates of extinction in populations with different histories. I found that evolution is much less repeatable in small than in large populations because history is more constraining and selection less efficient in small than in large populations. There is also a significant effect of sex and recombination on the repeatability of evolution at the fitness level, but this effect is highly dependent on the environment of selection. Sex can increase the importance of chance or history in some environments, but lower their importance in others, thereby leading to convergence or divergence depending on the environment. Thirdly, I found that the importance of history during evolution does not appear to come from the accumulation of past evolutionary selection pressures, but rather comes from only the most recent selection pressure as it determines genetic correlations for growth between different environments and the amount of genetic variance. Finally, I found that extinction risks are extremely high during continuous environmental deterioration, although a history of sexual reproduction and phenotypic plasticity play an important role in adaptation. By focusing not solely on the effect of treatments on mean trait values, but also on the variance that arises in our evolution experiments, we can gain a better understanding of the contribution that chance and history make to evolution. The repeatability of evolution can therefore inform us about the adaptive vs. stochastic nature of the diversity we see today, and about the specificity or generality of evolutionary outcomes.