1

Pricing Options with Monte Carlo and Binomial Tree Methods

Sun, Xihao 03 May 2011 (has links)
This report describes our work on pricing options with computational methods. First, I collected historical prices for assets in four economic sectors to estimate model parameters such as asset returns and covariances. I then used these parameters to model the assets as a multi-dimensional geometric Brownian motion and to simulate new price paths. Using the simulated prices, I priced call options with Monte Carlo methods and control variates. Next, I priced put options with the binomial tree model, which I was introduced to in the course Math 571: Financial Mathematics I. Using the estimated put and call option prices together with some stocks, I formed a portfolio in an Interactive Brokers paper account. This project was done as part of the master's capstone course Math 573: Computational Methods of Financial Mathematics.
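To make the two pricing approaches concrete, the following is a minimal Python sketch: a Monte Carlo estimator for a European call under geometric Brownian motion, using the discounted terminal asset price as a control variate (its mean, S0, is known exactly), and a Cox-Ross-Rubinstein binomial tree for a European put. The parameters (S0, K, r, sigma, T) are placeholder values rather than those estimated in the report, and the sketch is single-asset rather than the multi-asset model used there.

```python
import numpy as np

def mc_call_control_variate(S0, K, r, sigma, T, n_paths=100_000, seed=0):
    """European call under GBM, priced by Monte Carlo with the discounted
    terminal asset price (known mean S0) as a control variate."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)   # discounted call payoff
    control = np.exp(-r * T) * ST                       # control variate, E[control] = S0
    beta = np.cov(payoff, control)[0, 1] / np.var(control, ddof=1)
    adjusted = payoff - beta * (control - S0)           # variance-reduced estimator
    return adjusted.mean(), adjusted.std(ddof=1) / np.sqrt(n_paths)

def binomial_put(S0, K, r, sigma, T, steps=500):
    """European put priced on a Cox-Ross-Rubinstein binomial tree."""
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)                  # risk-neutral up-probability
    j = np.arange(steps + 1)
    values = np.maximum(K - S0 * u**j * d**(steps - j), 0.0)   # payoffs at maturity
    for _ in range(steps):                              # backward induction to t = 0
        values = np.exp(-r * dt) * (p * values[1:] + (1 - p) * values[:-1])
    return values[0]

price, stderr = mc_call_control_variate(S0=100.0, K=105.0, r=0.03, sigma=0.2, T=1.0)
print(f"Monte Carlo call: {price:.3f} +/- {stderr:.3f}")
print(f"Binomial put:     {binomial_put(100.0, 105.0, 0.03, 0.2, 1.0):.3f}")
```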
2

High accuracy correlated wavefunctions

Harrison, R. J. January 1984 (has links)
No description available.
3

Efficient simulation techniques for biochemical reaction networks

Lester, Christopher January 2017 (has links)
Discrete-state, continuous-time Markov models are becoming commonplace in the modelling of biochemical processes. The mathematical formulations that such models lead to are opaque, and, due to their complexity, are often considered analytically intractable. As such, a variety of Monte Carlo simulation algorithms have been developed to explore model dynamics empirically. Whilst well-known methods, such as the Gillespie Algorithm, can be implemented to investigate a given model, the computational demands of traditional simulation techniques remain a significant barrier to modern research. In order to further develop and explore biologically relevant stochastic models, new and efficient computational methods are required. In this thesis, high-performance simulation algorithms are developed to estimate summary statistics that characterise a chosen reaction network. The algorithms make use of variance reduction techniques, which exploit statistical properties of the model dynamics, so that the statistics can be computed efficiently. The multi-level method is an example of a variance reduction technique. The method estimates summary statistics of well-mixed, spatially homogeneous models by using estimates from multiple ensembles of sample paths of different accuracies. In this thesis, the multi-level method is developed in three directions: firstly, a nuanced implementation framework is described; secondly, a reformulated method is applied to stiff reaction systems; and, finally, different approaches to variance reduction are implemented and compared. The variance reduction methods that underpin the multi-level method are then re-purposed to understand how the dynamics of a spatially-extended Markov model are affected by changes in its input parameters. By exploiting the inherent dynamics of spatially-extended models, an efficient finite difference scheme is used to estimate parametric sensitivities robustly. The new simulation methods are tested for functionality and efficiency with a range of illustrative examples. The thesis concludes with a discussion of our findings, and a number of future research directions are proposed.
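As a point of reference for the simulation techniques discussed above, the following is a minimal sketch of the Gillespie direct method (stochastic simulation algorithm) applied to a toy birth-death network; the rate constants are illustrative only, and none of the multi-level or sensitivity machinery developed in the thesis is reproduced here.

```python
import numpy as np

def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=50.0, seed=0):
    """Direct-method SSA for the network  0 -> X (rate k_birth),
    X -> 0 (propensity k_death * x).  Returns jump times and copy numbers."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a_birth, a_death = k_birth, k_death * x     # reaction propensities
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        t += rng.exponential(1.0 / a_total)         # waiting time to next reaction
        if rng.random() < a_birth / a_total:        # choose which reaction fires
            x += 1
        else:
            x -= 1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

times, states = gillespie_birth_death()
# The copy number fluctuates around the stationary mean k_birth / k_death = 100.
print("final copy number:", states[-1])
```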
4

Econometric analysis of limited dependent time series

Manrique Garcia, Aurora January 1997 (has links)
No description available.
5

Design and evaluation of a Monte Carlo model of a low-cost kilovoltage x-ray arc therapy system

Breitkreutz, Dylan Yamabe 28 June 2019 (has links)
There is a growing global need for proper access to radiation therapy. This need exists predominantly in low- and middle-income countries, but also in some high-income countries. The solution to this problem is complex and requires changes in government policy, education and technology. The objective of the work contained in this dissertation is the development of a novel external beam radiation therapy system capable of treating a variety of cancers. The intent is to provide a cost-effective radiation therapy system that can primarily be utilized in low- and middle-income countries. This new system uses kilovoltage rather than megavoltage x-rays and is therefore much more cost-effective. The ultimate purpose of this kilovoltage radiation therapy system is to improve access to radiation therapy worldwide by supplementing current radiation therapy technology. As a first step, the kilovoltage x-ray arc therapy (KVAT) system was modeled using the EGSnrc BEAMnrc and DOSXYZnrc Monte Carlo software tools. For this initial study, 200 kV arc therapy was simulated on cylindrical water phantoms of two sizes, each of which contained a variety of planning target volume (PTV) sizes and locations. Additionally, prone and supine partial breast irradiation treatment plans were generated using KVAT. The objective of this work was to determine whether or not skin-sparing could be achieved using the KVAT system while also delivering a clinically relevant dose rate to the PTV. The results of the study indicated that skin-sparing is indeed achievable and that the quality of KVAT treatment plans improves for full 360-degree arcs and smaller PTV sizes. The second step of this project involved the Monte Carlo simulation of KVAT treatment plans for breast, lung and prostate cancer. Spherical PTVs of 3-cm diameter were used for the breast and lung treatment plans while a 4-cm diameter PTV was used for prostate. Additionally, inverse optimization was utilized to make full use of the non-conformal irradiation geometry of KVAT. As a means of comparison, megavoltage treatment plans that could be delivered by a clinical linear accelerator were generated for each patient as well. In order to evaluate the safety of KVAT treatment plans, dose constraints were taken from published Radiation Therapy Oncology Group (RTOG) reports. The results of this study indicated that the 200 kV breast and 225 kV lung KVAT treatment plans were within dose constraints and could be delivered in a reasonable length of time. The 225 kV prostate treatment plan, while technically within dose constraints, delivered a large dose to non-critical healthy tissues due to the limited number of beam angles that did not pass through bony anatomy. It was concluded that sites such as the prostate, where large volumes of bone are present, might not be feasible for KVAT treatment. The third step expanded upon this work by simulating more realistic KVAT treatment plans that used PTV volumes contoured by radiation oncologists. Additionally, this study used a completely redesigned KVAT geometry, which employed a stationary reflection anode and a new collimator design. The design modeled in this study was based upon the specifications of the prototype system under construction by PrecisionRT, a commercial partner. Three stereotactic ablative radiotherapy (SABR) lung patients who had received treatment at the Vancouver Island Cancer Centre were selected.
In order to fully cover each patient's PTV, spherical sub-volumes were placed within the clinically contoured PTV. Dose constraints for organs at risk were taken from an RTOG report on stereotactic body radiation therapy and were used to inversely optimize the 200 kV KVAT treatment plans. The calculated KVAT plans were compared with the clinical 6 MV SABR plans delivered to each patient. The results of this study indicated that KVAT lung plans were within dose constraints for all three patients, with the exception of the ribs in the second patient, who had a tumor directly adjacent to the rib cage. The fourth and last step of this project was the experimental validation of a simple, proof-of-principle KVAT system. Simple geometric methods were used to design a collimator consisting of two slabs of brass separated by ~6 cm, each with 5 apertures, which would create an array of 5 converging beamlets. The collimator was used with a tabletop x-ray tube system. A rectangular solid water phantom and a cylindrical TIVAR 1000 phantom were placed on a rotation stage and irradiated using 360-degree arcs. EBT3 Gafchromic film was placed in each phantom to measure two-dimensional dose distributions. Film dose distributions were analyzed and compared to Monte Carlo-generated dose distributions. Both the rectangular solid water phantom and the cylindrical TIVAR phantom showed skin-sparing effects in their dose distributions. The highest degree of skin-sparing was achieved in the larger, 20 cm diameter cylindrical phantom. Furthermore, the measured film data and calculated metrics of the rectangular phantom were within 10% of the Monte Carlo-calculated values for two out of three films. The discrepancy in the third film can be explained by errors in the experimental setup. In conclusion, the work contained in this dissertation has established, by means of Monte Carlo simulations and experimental dosimetry, the feasibility of a cost-effective kilovoltage arc-therapy system designed to treat deep-seated lesions. The studies performed so far suggest that KVAT is most suitable for smaller lesions in anatomical sites that do not involve large amounts of bone. Perhaps most importantly, an experimental study has demonstrated the skin-sparing ability of a simple KVAT prototype.
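The inverse optimisation mentioned above is not specified in the abstract; as a rough, hypothetical illustration of that kind of step, the sketch below optimises non-negative beamlet weights against a made-up dose-influence matrix using penalised non-negative least squares. The matrices, prescription and penalty weight are all invented for illustration, and this is not the optimiser used in the dissertation.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical dose-influence matrices: dose per unit beamlet weight delivered
# to PTV voxels and to organ-at-risk (OAR) voxels.  In a real plan these would
# come from Monte Carlo dose calculations, not random numbers.
rng = np.random.default_rng(1)
n_ptv, n_oar, n_beamlets = 50, 80, 30
D_ptv = rng.uniform(0.5, 1.5, size=(n_ptv, n_beamlets))
D_oar = rng.uniform(0.0, 0.4, size=(n_oar, n_beamlets))

d_presc = 1.0    # prescribed (relative) PTV dose
w_oar = 2.0      # penalty weight discouraging dose to the OAR voxels

# Penalised non-negative least squares: pull the PTV voxels toward the
# prescription while penalising any dose deposited in the OAR voxels.
A = np.vstack([D_ptv, w_oar * D_oar])
b = np.concatenate([np.full(n_ptv, d_presc), np.zeros(n_oar)])
weights, residual = nnls(A, b)

print("mean PTV dose:", (D_ptv @ weights).mean())
print("mean OAR dose:", (D_oar @ weights).mean())
```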
6

Monte Carlo Simulation of Optical Coherence Tomography of Media with Arbitrary Spatial Distributions

Malektaji, Siavash 02 September 2014 (has links)
Optical Coherence Tomography (OCT) is a sub-surface imaging modality with a growing number of applications. An accurate and practical OCT simulator could be an important tool for understanding the physics underlying OCT and for designing OCT systems with improved performance. All available OCT simulators are restricted to imaging planar or non-planar multilayered media. In this work, I developed a novel Monte Carlo based simulator of OCT imaging for turbid media with arbitrary spatial distributions. The simulator allows computation of both Class I diffusive reflectance, due to ballistic and quasi-ballistic scattered photons, and Class II diffusive reflectance, due to multiply scattered photons. A tetrahedron-based mesh is used to model any arbitrarily shaped medium to be simulated. I have also implemented a known importance sampling method that reduces the computational time of simulations by up to two orders of magnitude. The simulator is verified by comparing its results to those of previously validated OCT simulators for multilayered media. I present sample simulation results for OCT imaging of non-layered media, which would not have been possible with earlier simulators.
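For orientation, the following is a heavily simplified sketch of the photon random walk at the core of such simulators: exponential free-path sampling, weight attenuation at each interaction, and isotropic redirection in a homogeneous slab, tallying the weight that escapes back through the top surface. The optical properties are placeholders, and the tetrahedral mesh, anisotropic phase function, coherence-gated Class I/II classification and importance sampling of the actual simulator are omitted.

```python
import numpy as np

def mc_diffuse_reflectance(mu_a=0.1, mu_s=10.0, thickness=1.0,
                           n_photons=20_000, seed=0):
    """Diffuse reflectance of a homogeneous slab (mu_a, mu_s in 1/mm,
    thickness in mm), with isotropic scattering and no index mismatch."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    reflected = 0.0
    for _ in range(n_photons):
        z, uz, w = 0.0, 1.0, 1.0                    # depth, z-direction cosine, weight
        while w > 1e-4:                             # crude termination (no roulette)
            z += uz * rng.exponential(1.0 / mu_t)   # free path to the next interaction
            if z < 0.0:                             # escaped through the top surface
                reflected += w
                break
            if z > thickness:                       # transmitted through the bottom
                break
            w *= albedo                             # deposit the absorbed fraction
            uz = rng.uniform(-1.0, 1.0)             # isotropic scattering: new cos(theta)
    return reflected / n_photons

print("diffuse reflectance:", mc_diffuse_reflectance())
```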
7

On large deviations and design of efficient importance sampling algorithms

Nyquist, Pierre January 2014 (has links)
This thesis consists of four papers, presented in Chapters 2-5, on the topics of large deviations and stochastic simulation, particularly importance sampling. The four papers make theoretical contributions to the development of a new approach for analyzing the efficiency of importance sampling algorithms by means of large deviation theory, and to the design of efficient algorithms using the subsolution approach developed by Dupuis and Wang (2007). In the first two papers of the thesis, the random output of an importance sampling algorithm is viewed as a sequence of weighted empirical measures and weighted empirical processes, respectively. The main theoretical results are a Laplace principle for the weighted empirical measures (Paper 1) and a moderate deviation result for the weighted empirical processes (Paper 2). The Laplace principle for weighted empirical measures is used to propose an alternative measure of efficiency based on the associated rate function. The moderate deviation result for weighted empirical processes is an extension of what can be seen as the empirical process version of Sanov's theorem. Together with a delta method for large deviations, established by Gao and Zhao (2011), we show moderate deviation results for importance sampling estimators of the risk measures Value-at-Risk and Expected Shortfall. The final two papers of the thesis are concerned with the design of efficient importance sampling algorithms using subsolutions of partial differential equations of Hamilton-Jacobi type (the subsolution approach). In Paper 3 we show a min-max representation of viscosity solutions of Hamilton-Jacobi equations. In particular, the representation suggests a general approach for constructing subsolutions to equations associated with terminal value problems and exit problems. Since the design of efficient importance sampling algorithms is connected to such subsolutions, the min-max representation facilitates the construction of efficient algorithms. In Paper 4 we consider the problem of constructing efficient importance sampling algorithms for a certain type of Markovian intensity model for credit risk. The min-max representation of Paper 3 is used to construct subsolutions to the associated Hamilton-Jacobi equation, and the corresponding importance sampling algorithms are investigated both theoretically and numerically. The thesis begins with an informal discussion of stochastic simulation, followed by brief mathematical introductions to large deviations and importance sampling.
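As a toy illustration of the kind of algorithm analysed here, the sketch below estimates the small tail probability P(X > a) for a standard normal X by sampling from an exponentially tilted proposal N(a, 1) and reweighting with the likelihood ratio; the weighted samples are an instance of the weighted empirical measures studied in the first papers. The choice of a and of the proposal is illustrative, and no subsolution-based design is reproduced.

```python
import numpy as np
from scipy.stats import norm

def is_tail_probability(a=4.0, n=100_000, seed=0):
    """Estimate P(X > a) for X ~ N(0, 1) by importance sampling: draw from
    the exponentially tilted proposal N(a, 1) and reweight each sample by
    the likelihood ratio dN(0,1)/dN(a,1) evaluated at the sample."""
    rng = np.random.default_rng(seed)
    y = rng.normal(loc=a, scale=1.0, size=n)          # samples from the proposal
    likelihood_ratio = np.exp(-a * y + 0.5 * a**2)    # phi(y) / phi(y - a)
    w = likelihood_ratio * (y > a)                    # weighted indicator of the event
    return w.mean(), w.std(ddof=1) / np.sqrt(n)

estimate, stderr = is_tail_probability()
print(f"IS estimate: {estimate:.3e} +/- {stderr:.1e}  (exact: {norm.sf(4.0):.3e})")
```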
8

Applying MCMC methods to multi-level models

Browne, William J. January 1998 (has links)
No description available.
9

Automating inference, learning, and design using probabilistic programming

Rainforth, Thomas William Gamlen January 2017 (has links)
Imagine a world where computational simulations can be inverted as easily as running them forwards, where data can be used to refine models automatically, and where the only expertise one needs to carry out powerful statistical analysis is a basic proficiency in scientific coding. Creating such a world is the ambitious long-term aim of probabilistic programming. The bottleneck for improving the probabilistic models, or simulators, used throughout the quantitative sciences, is often not an ability to devise better models conceptually, but a lack of expertise, time, or resources to realize such innovations. Probabilistic programming systems (PPSs) help alleviate this bottleneck by providing an expressive and accessible modeling framework, then automating the required computation to draw inferences from the model, for example finding the model parameters likely to give rise to a certain output. By decoupling model specification and inference, PPSs streamline the process of developing and drawing inferences from new models, while opening up powerful statistical methods to non-experts. Many systems further provide the flexibility to write new and exciting models which would be hard, or even impossible, to convey using conventional statistical frameworks. The central goal of this thesis is to improve and extend PPSs. In particular, we will make advancements to the underlying inference engines and increase the range of problems which can be tackled. For example, we will extend PPSs to a mixed inference-optimization framework, thereby providing automation of tasks such as model learning and engineering design. Meanwhile, we make inroads into constructing systems for automating adaptive sequential design problems, providing potential applications across the sciences. Furthermore, the contributions of the work reach far beyond probabilistic programming, as achieving our goal will require us to make advancements in a number of related fields such as particle Markov chain Monte Carlo methods, Bayesian optimization, and Monte Carlo fundamentals.
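As a minimal, hand-rolled illustration of what a PPS automates, the sketch below "runs a simulator backwards" by likelihood weighting: parameters are drawn from a prior, each draw is weighted by the likelihood of the observed output, and a posterior mean is estimated from the weighted samples. The Poisson model, prior and data are invented purely for illustration and bear no relation to any system discussed in the thesis; real probabilistic programming systems provide far more powerful inference engines.

```python
import numpy as np

def simulator(theta, rng, n=20):
    """Forward model: n noisy observations of an unknown Poisson rate theta."""
    return rng.poisson(theta, size=n)

def likelihood_weighting(observed, n_samples=50_000, seed=0):
    """Invert the simulator: draw theta from its prior, weight each draw by the
    likelihood of the observed data, and return the weighted posterior mean."""
    rng = np.random.default_rng(seed)
    thetas = rng.gamma(shape=2.0, scale=5.0, size=n_samples)   # prior over theta
    # Poisson log-likelihood of the observations, up to a theta-independent constant
    log_w = np.sum(observed[None, :] * np.log(thetas[:, None]) - thetas[:, None],
                   axis=1)
    w = np.exp(log_w - log_w.max())                            # stabilised weights
    w /= w.sum()
    return float(np.sum(w * thetas))                           # posterior mean estimate

rng = np.random.default_rng(1)
data = simulator(theta=7.0, rng=rng)
print("posterior mean of theta:", likelihood_weighting(data))
```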
10

Projector Quantum Monte Carlo methods for linear and non-linear wavefunction ansatzes

Schwarz, Lauretta Rebecca January 2017 (has links)
This thesis is concerned with the development of a Projector Quantum Monte Carlo method for non-linear wavefunction ansatzes and its application to strongly correlated materials. This new approach is partially inspired by a prior application of the Full Configuration Interaction Quantum Monte Carlo (FCIQMC) method to the three-band (p-d) Hubbard model. Through repeated stochastic application of a projector, FCIQMC projects out a stochastic description of the Full Configuration Interaction (FCI) ground-state wavefunction, a linear combination of Slater determinants spanning the full Hilbert space. The study of the p-d Hubbard model demonstrates that the nature of this FCI expansion is profoundly affected by the choice of single-particle basis. In a counterintuitive manner, the effectiveness of a one-particle basis in producing a sparse, compact and rapidly converging FCI expansion is not necessarily paralleled by its ability to describe the physics of the system within a single determinant. The results suggest that, with an appropriate basis, single-reference quantum chemical approaches may be able to describe many-body wavefunctions of strongly correlated materials. Furthermore, this thesis presents a reformulation of the projected imaginary-time evolution of FCIQMC as a Lagrangian minimisation. This naturally allows for the optimisation of polynomially complex wavefunction ansatzes with a polynomial, rather than exponential, scaling with system size. The proposed approach blurs the line between traditional Variational and Projector Quantum Monte Carlo approaches, whilst incorporating developments from the field of deep-learning neural networks that can be expressed as a modification of the projector. The ability of the developed approach to sample and optimise arbitrary non-linear wavefunctions is demonstrated with several classes of Tensor Network States, all of which involve controlled approximations but retain systematic improvability towards exactness. By applying the method to strongly correlated Hubbard models, as well as ab-initio systems, including a fully periodic ab-initio graphene sheet, many-body wavefunctions and their one- and two-body static properties are obtained. The proposed approach can handle and simultaneously optimise large numbers of variational parameters, greatly exceeding those of alternative Variational Monte Carlo approaches.
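As a deterministic toy analogue of the projection that FCIQMC carries out stochastically with walker populations, the sketch below repeatedly applies (1 - dtau * H) to a coefficient vector for a small random symmetric "Hamiltonian", filtering out everything except the lowest eigenvector. The matrix, time step and iteration count are arbitrary, and the walker dynamics, shift update and non-linear ansatz optimisation developed in the thesis are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
H = rng.standard_normal((n, n))
H = (H + H.T) / 2.0                        # toy real symmetric "Hamiltonian"

dtau = 0.01                                # imaginary time step
c = rng.standard_normal(n)                 # initial coefficient vector
c /= np.linalg.norm(c)
for _ in range(10_000):
    c = c - dtau * (H @ c)                 # projector step: c <- (1 - dtau * H) c
    c /= np.linalg.norm(c)                 # renormalise (the role of the shift)

energy = c @ H @ c                         # Rayleigh-quotient estimate of E0
print("projected ground-state energy:", energy)
print("exact lowest eigenvalue:      ", np.linalg.eigvalsh(H).min())
```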
