321 |
Pricing barrier options with numerical methods / De Ponte, Candice Natasha, January 2013
Barrier options are becoming more popular, mainly because a barrier option is cheaper to hold than a standard call or put option, but exotic options are difficult to price since the payoff depends on the whole path of the underlying process rather than on its value at a single time instant.
A barrier option is path dependent: its payoff depends on the path followed by the price of the underlying asset, which also makes barrier option prices especially sensitive to volatility.
For basic exchange-traded options, analytical prices based on the Black-Scholes formula can be computed; observed market prices are further influenced by supply and demand. For an exotic option an analytical solution does not always exist, so it is advantageous to have methods that efficiently provide accurate numerical solutions. This study gives a literature overview and compares the implementation of some available numerical methods applied to barrier options.
The three numerical methods that will be adapted and compared for the pricing of barrier
options are: • Binomial Tree Methods • Monte-Carlo Methods • Finite Difference Methods / Thesis (MSc (Applied Mathematics))--North-West University, Potchefstroom Campus, 2013
|
322 |
Premiepensionens Marknadsrisk : En Monte Carlo-simulering av den allmänna pensionen [The market risk of the premium pension: a Monte Carlo simulation of the national public pension] / Sverresson, Carl-Petter; Östling, Christoffer, January 2014
A reform trend is identified in which countries are shifting from defined benefit pension systems towards defined contribution systems. The reforms have been justified by predictions that the defined benefit systems will not manage to provide adequate pensions to members in the future. The newer defined contribution pension plans often include individual financial accounts where individuals can choose how a part of their pension savings should be invested. Sweden was early to introduce such a system, which at the moment offers more than 800 funds to choose from. The aim of this thesis is to capture the market risk associated with these individual investments, which is done by using Monte Carlo simulations for six selected pension funds. The method produces forecasts of replacement ratios, i.e. pension as a percentage of pre-retirement income, for two hypothetical individuals: one who starts to work right after elementary school and one who starts a five-year education and begins to work after graduation. The results show a slightly lower replacement ratio for the educated individual, which is also associated with a higher probability of ending up with a low replacement ratio. The market risk also varies between the funds, which implies that the funds should be chosen with great care. The study ends with arguments for increased paternalism, with a carefully considered fund offering providing fewer funds to choose from than today.
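As a rough illustration of the simulation approach described above, the sketch below draws random annual fund returns, accumulates contributions to a terminal capital, and converts it to a replacement ratio. The return distribution, contribution level, annuity divisor, and salary are hypothetical placeholders, not the thesis calibration.

```python
import numpy as np

def simulate_replacement_ratio(years_working, contribution, mu, sigma,
                               annuity_divisor, final_salary, n_sims=10_000, seed=1):
    """Simulate terminal premium-pension capital and the resulting replacement ratio."""
    rng = np.random.default_rng(seed)
    # Annual gross fund returns assumed lognormal (illustrative, not the thesis calibration).
    returns = rng.lognormal(mean=mu, sigma=sigma, size=(n_sims, years_working))
    capital = np.zeros(n_sims)
    for t in range(years_working):
        capital = (capital + contribution) * returns[:, t]
    annual_pension = capital / annuity_divisor
    return annual_pension / final_salary   # replacement ratio from this pension component

ratios = simulate_replacement_ratio(years_working=45, contribution=10_000,
                                    mu=0.04, sigma=0.15,
                                    annuity_divisor=16.0, final_salary=400_000)
print(f"median replacement ratio: {np.median(ratios):.2%}, "
      f"5th percentile: {np.percentile(ratios, 5):.2%}")
```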
|
323 |
Dosimetric verification of radiation therapy, including intensity modulated treatments, using an amorphous-silicon electronic portal imaging device / Chytyk-Praznik, Krista, January 2009
Radiation therapy is continuously increasing in complexity due to technological innovation in delivery techniques, necessitating thorough dosimetric verification. Comparing accurately predicted portal dose images to measured images obtained during patient treatment can determine if a particular treatment was delivered correctly. The goal of this thesis was to create a method to predict portal dose images that was versatile and accurate enough to use in a clinical setting. All measured images in this work were obtained with an amorphous silicon electronic portal imaging device (a-Si EPID), but the technique is applicable to any planar imager. A detailed, physics-motivated fluence model was developed to characterize fluence exiting the linear accelerator head. The model was further refined using results from Monte Carlo simulations and schematics of the linear accelerator. The fluence incident on the EPID was converted to a portal dose image through a superposition of Monte Carlo-generated, monoenergetic dose kernels specific to the a-Si EPID. Predictions of clinical IMRT fields with no patient present agreed with measured portal dose images within 3% and 3 mm. The dose kernels were applied ignoring the geometrically divergent nature of incident fluence on the EPID. A computational investigation into this parallel dose kernel assumption determined its validity under clinically relevant situations. Introducing a patient or phantom into the beam required the portal image prediction algorithm to account for patient scatter and attenuation. Primary fluence was calculated by attenuating raylines cast through the patient CT dataset, while scatter fluence was determined through the superposition of pre-calculated scatter fluence kernels. Total dose in the EPID was calculated by convolving the total predicted incident fluence with the EPID-specific dose kernels. The algorithm was tested on water slabs with square fields, agreeing with measurement within 3% and 3 mm. The method was then applied to five prostate and six head-and-neck IMRT treatment courses (~1900 clinical images). Deviations between the predicted and measured images were quantified. The portal dose image prediction model developed in this thesis work has been shown to be accurate, and it was demonstrated to be able to verify patients’ delivered radiation treatments.
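The superposition of dose kernels described above is, in its simplest form, a convolution of the predicted incident fluence with an EPID-specific kernel. The sketch below illustrates that step with a synthetic field and a placeholder Gaussian kernel; the actual model uses Monte Carlo-generated, monoenergetic kernels and an energy-dependent superposition.

```python
import numpy as np
from scipy.signal import fftconvolve

# Synthetic 10x10 cm "field" of incident fluence on a 1 mm grid (placeholder values).
n = 200
fluence = np.zeros((n, n))
fluence[50:150, 50:150] = 1.0

# Placeholder dose kernel: an isotropic Gaussian standing in for the
# Monte Carlo-generated, EPID-specific kernels used in the thesis.
x = np.arange(-20, 21)
xx, yy = np.meshgrid(x, x)
kernel = np.exp(-(xx**2 + yy**2) / (2 * 3.0**2))
kernel /= kernel.sum()

# Predicted portal dose image = incident fluence convolved with the dose kernel.
portal_dose = fftconvolve(fluence, kernel, mode="same")
print(portal_dose.shape, portal_dose.max())
```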
|
324 |
Quantitative Simulation of Synaptic Vesicle Release at the Neuromuscular Junction / Ma, Jun, 01 May 2014
Nerve signals in the form of action potentials are relayed between neurons through specialized connections called synapses, via neurotransmitter released from synaptic vesicles. The release process is Ca2+ dependent and relies on fusion of neurotransmitter-filled synaptic vesicles with the presynaptic membrane. During high-frequency stimulation, the amount of vesicle release increases at some synapses (e.g., the frog neuromuscular junction (NMJ)), a process known as short-term plasticity. Due to the micron-scale size of the presynaptic active zone where vesicle fusion takes place, experimental study is often difficult. Thus, computational modeling can provide important insight into the mechanism of synaptic vesicle release at active zones. In the first part of my thesis, I used the frog NMJ as a model synapse for computer simulation studies aimed at testing various mechanistic hypotheses proposed to underlie short-term plasticity. Building off a recently reported excess-binding-site model of synaptic vesicle release at the frog NMJ (Dittrich et al., 2013), I investigated several mechanisms of short-term facilitation at the frog NMJ. My studies placed constraints on previously proposed mechanistic models and concluded that the presence on synaptic vesicles of a second calcium sensor protein, distinct from synaptotagmin, can explain known properties of facilitation observed at the frog NMJ. In addition, I was able to identify a second facilitation mechanism, which relied on the persistent binding of calcium-bound synaptotagmin molecules to lipids of the presynaptic membrane. In the second part of my thesis, I investigated the structure-function relationship at active zones, with the hypothesis that active zones are organized from the same basic synaptic building block consisting of a docked vesicle and a small number of closely associated voltage-gated calcium channels (VGCCs). To test this hypothesis, I constructed a vesicle release model of the mouse NMJ by reassembling frog NMJ model building blocks based on electron-microscopy imaging data. These two models successfully predicted the functional divergence between the frog and mouse NMJ in terms of average vesicle release and short-term plasticity. I also found that the frog NMJ loses facilitation when VGCCs are systematically removed from the active zone. By tracking Ca2+ ions from each individual VGCC, I further show how the difference in short-term plasticity between the frog and mouse NMJ may arise from their distinct release building-block assemblies. In summary, I have developed a stochastic computer model of synaptic transmission, which not only sheds light on the underlying mechanisms of short-term plasticity, but also proved powerful in understanding structural and functional relationships at synaptic active zones.
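As a toy illustration of the stochastic-binding logic such release models rest on (not the MCell-based model used in this work), the sketch below treats a vesicle as released when at least a threshold number of its calcium-sensor sites are occupied, and shows how residual occupancy from a first pulse could raise release probability for a second pulse. All numbers are hypothetical.

```python
import numpy as np

def release_probability(p_bind, n_sites=5, threshold=2, n_trials=100_000, seed=2):
    """Toy stochastic model: a vesicle releases if at least `threshold` of its
    `n_sites` calcium-sensor binding sites are occupied after a stimulus."""
    rng = np.random.default_rng(seed)
    bound = rng.binomial(n_sites, p_bind, size=n_trials)
    return np.mean(bound >= threshold)

# Facilitation caricature: residual occupancy from a first pulse raises the
# effective binding probability for a second pulse (illustrative numbers only).
p_first, p_second = 0.15, 0.25
print(release_probability(p_first), release_probability(p_second))
```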
|
325 |
Microbeam design in radiobiological research / Hollis, Kevin John, January 1995
Recent work using low doses of ionising radiations, both in vitro and in vivo, has suggested that the responses of biological systems in the region of less than 1 Gray may not be predicted by simple extrapolation from the responses at higher doses. Additional experiments, using high-LET radiations at doses of much less than one alpha particle traversal per cell nucleus, have shown responses in a greater number of cells than have received a radiation dose. These findings, and increased concern over the effects of the exposure of the general population to low levels of background radiation, for example due to radon daughters in the lungs, have stimulated the investigation of the response of mammalian cells to ionising radiations in the extreme low-dose region. In all broad-field exposures to particulate radiations at low dose levels an inherent dose uncertainty exists due to random counting statistics. This dose variation produces a range of values for the measured biological effect within the irradiated population, therefore making the elucidation of the dose-effect relationship extremely difficult. The use of the microbeam irradiation technique will allow the delivery of a controlled number of particles to specific targets within an individual cell with a high degree of accuracy. This approach will considerably reduce the level of variation of biological effect within the irradiated cell population and will allow low-dose responses of cellular systems to be determined. In addition, the proposed high spatial resolution of the microbeam developed will allow the investigation of the distribution of radiation sensitivity within the cell, to provide a better understanding of the mechanisms of radiation action. The target parameters for the microbeam at the Gray Laboratory are a spatial resolution of less than 1 µm and a detection efficiency of better than 99 %. The work of this thesis was to develop a method of collimation, in order to produce a microbeam of 3.5 MeV protons, and to develop a detector to be used in conjunction with the collimation system. In order to determine the optimum design of collimator necessary to produce a proton microbeam, a computer simulation based upon a Monte-Carlo simulation code, written by Dr S J Watts, was developed. This programme was then used to determine the optimum collimator length and the effects of misalignment and divergence of the incident proton beam upon the quality of the collimated beam produced. Designs for silicon collimators were produced, based upon the results of these simulations, and collimators were subsequently produced for us using techniques of micro-manufacturing developed in the semiconductor industry. Other collimator designs were also produced both in-house and commercially, using a range of materials. These collimators were tested to determine both the energy and spatial resolutions of the transmitted proton beam produced. The best results were obtained using 1.6 mm lengths of 1.5 µm diameter bore fused silica tubing. This system produced a collimated beam having a spatial resolution with 90 % of the transmitted beam lying within a diameter of 2.3 ± 0.9 µm and with an energy spectrum having 75 % of the transmitted protons within a Gaussian fit to the full-energy peak. Detection of the transmitted protons was achieved by the use of a scintillation transmission detector mounted over the exit aperture of the collimator.
An approximately 10 µm thick ZnS(Ag) crystal was mounted between two 30 µm diameter optical fibres and the light emitted from the crystal was transmitted along the fibres to two photomultiplier tubes. The signals from the tubes were analysed, using coincidence counting techniques, by means of electronics designed by Dr B Vojnovic. The lowest counting inefficiencies obtained using this approach were a false positive count level of 0.8 ± 0.1 % and an uncounted proton level of 0.9 ± 0.3 %. The elements of collimation and detection were then combined in a rugged microbeam assembly, using a fused silica collimator having a bore diameter of 5 µm and a scintillator crystal having a thickness of ~15 µm. The microbeam produced by this initial assembly had a spatial resolution with 90 % of the transmitted protons lying within a diameter of 5.8 ± 1.6 µm, and counting inefficiencies of 0.27 ± 0.22 % and 1.7 ± 0.4 % for the levels of false positive and missed counts respectively. The detector system in this assembly achieves the design parameter of 99 % efficiency; however, the spatial resolution of the beam is not at the desired 1 µm level. The diameter of the microbeam produced is less than the nuclear diameter of many cell lines and so the beam may be used to good effect in the low-dose irradiation of single cells. In order to investigate the variation in sensitivity within a cell the spatial resolution of the beam would require improvement. Proposed methods by which this may be achieved are described.
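A purely geometric version of the collimator simulation can be sketched as follows: protons are launched over the bore entrance with a small angular spread, and only those whose straight-line paths stay inside the bore are counted as cleanly transmitted. The bore dimensions below follow the 1.6 mm long, 1.5 µm diameter fused silica capillary described above, but the divergence and misalignment values are illustrative, and scattering in the wall material (which the full Monte Carlo code accounts for) is ignored.

```python
import numpy as np

def collimator_transmission(length_um, radius_um, divergence_mrad,
                            offset_um=0.0, n_protons=200_000, seed=3):
    """Geometric Monte Carlo of proton transmission through a cylindrical bore:
    a proton is counted as transmitted if its straight-line path stays inside
    the bore over the full collimator length.  Wall scattering is ignored."""
    rng = np.random.default_rng(seed)
    # Entry positions: uniform over a disc of the bore radius, shifted by any misalignment.
    r = radius_um * np.sqrt(rng.random(n_protons))
    phi = 2 * np.pi * rng.random(n_protons)
    x0 = r * np.cos(phi) + offset_um
    y0 = r * np.sin(phi)
    # Small random angles drawn from a Gaussian divergence (mrad -> rad).
    tx = rng.normal(0.0, divergence_mrad * 1e-3, n_protons)
    ty = rng.normal(0.0, divergence_mrad * 1e-3, n_protons)
    x1 = x0 + tx * length_um
    y1 = y0 + ty * length_um
    inside_entry = (x0**2 + y0**2) <= radius_um**2
    inside_exit = (x1**2 + y1**2) <= radius_um**2
    return np.mean(inside_entry & inside_exit)

# Aligned beam vs. a 0.5 µm lateral misalignment (illustrative values).
print(collimator_transmission(length_um=1600, radius_um=0.75, divergence_mrad=0.1))
print(collimator_transmission(length_um=1600, radius_um=0.75, divergence_mrad=0.1, offset_um=0.5))
```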
|
326 |
Metamodel-Based Probabilistic Design for Dynamic Systems with Degrading Components / Seecharan, Turuna Saraswati, January 2012
The probabilistic design of dynamic systems with degrading components is difficult. Design of dynamic systems typically involves the optimization of a time-invariant performance measure, such as energy, that is estimated using a dynamic response, such as angular speed. The mechanistic models developed to approximate this performance measure are too complicated to be used in simple design calculations and lead to lengthy simulations. When degradation of the components is assumed, estimating the failure probability over the product lifetime is required in order to determine suitable service times. Again, complex mechanistic models lead to lengthy lifetime simulations when the Monte Carlo method is used to evaluate probability.
Based on these problems, an efficient methodology is presented for the probabilistic design of dynamic systems and for estimating the cumulative distribution function of the time to failure of a performance measure when degradation of the components is assumed. The four main steps are: 1) transforming the dynamic response into a set of static responses at discrete cycle-time steps and using Singular Value Decomposition to efficiently estimate a time-invariant performance measure that is based upon a dynamic response; 2) replacing the mechanistic model with an approximating function, known as a “metamodel”; 3) searching for the best design parameters using fast integration methods such as the First Order Reliability Method; and 4) building the cumulative distribution function using the summation of the incremental failure probabilities, estimated using the set-theory method, over the planned lifetime.
The first step of the methodology uses design of experiments or sampling techniques to select a sample of training sets of the design variables. These training sets are then input to the computer-based simulation of the mechanistic model to produce a matrix of corresponding responses at discrete cycle-times. Although metamodels can be built at each time-specific column of this matrix, this method is slow especially if the number of time steps is large. An efficient alternative uses Singular Value Decomposition to split the response matrix into two matrices containing only design-variable-specific and time-specific information. The second step of the methodology fits metamodels only for the significant columns of the matrix containing the design variable-specific information. Using the time-specific matrix, a metamodel is quickly developed at any cycle-time step or for any time-invariant performance measure such as energy consumed over the cycle-lifetime. In the third step, design variables are treated as random variables and the First Order Reliability Method is used to search for the best design parameters. Finally, the components most likely to degrade are modelled using either a degradation path or a marginal distribution model and, using the First Order Reliability Method or a Monte Carlo Simulation to estimate probability, the cumulative failure probability is plotted. The speed and accuracy of the methodology using three metamodels, the Regression model, Kriging and the Radial Basis Function, is investigated.
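A minimal sketch of the first two steps is given below, with a toy mechanistic model, an SVD split of the response matrix into design-variable-specific scores and time-specific modes, and a simple quadratic regression metamodel standing in for the Regression, Kriging, and Radial Basis Function metamodels compared in the thesis. Everything here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative stand-in for the mechanistic model: 50 training designs (2 design
# variables) each producing a dynamic response sampled at 200 cycle-time steps.
X = rng.uniform(0.5, 1.5, size=(50, 2))
t = np.linspace(0.0, 1.0, 200)
Y = X[:, [0]] * np.sin(2 * np.pi * t) + X[:, [1]] ** 2 * t   # (50, 200) response matrix

# Step 1: SVD splits Y into design-variable-specific scores and time-specific modes.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
k = 2                                # keep only the significant singular values
scores = U[:, :k] * s[:k]            # design-variable-specific information
modes = Vt[:k, :]                    # time-specific information

# Step 2: fit a quadratic regression metamodel to each retained score column.
def quad_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

A = quad_features(X)
coef = np.linalg.lstsq(A, scores, rcond=None)[0]

# Prediction for a new design: metamodel scores recombined with the time modes.
x_new = np.array([[1.2, 0.8]])
y_pred = quad_features(x_new) @ coef @ modes
print(y_pred.shape)                  # full 200-step response from two small metamodels
```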
This thesis shows that the metamodel offers a significantly faster, yet accurate, alternative to using mechanistic models for both probabilistic design optimization and for estimating the cumulative distribution function. For design using the First-Order Reliability Method to estimate probability, the Regression Model is the fastest and the Radial Basis Function is the slowest. Kriging is shown to be accurate and faster than the Radial Basis Function, but its computation time is still slower than the Regression Model's. When estimating the cumulative distribution function, metamodels are more than 100 times faster than the mechanistic model, and the error is less than ten percent when compared with the mechanistic model. Kriging and the Radial Basis Function are more accurate than the Regression Model, and computation time is faster using Monte Carlo Simulation to estimate probability than using the First-Order Reliability Method.
|
327 |
Estimation Of Expected Monetary Values Of Selected Turkish Oil Fields Using Two Different Risk Assessment Methods / Kaya, Egemen Tangut, 01 January 2004
Most investments in the oil and gas industry involve considerable risk, with a wide range of potential outcomes for a particular project. However, many economic evaluations are based on the “most likely” values of variables, without sufficient consideration given to other possible outcomes, even though it is well known that the initial estimates of all these variables are uncertain. The data are usually obtained during drilling of the initial oil well, and the sources are geophysical (seismic surveys) for formation depths and the areal extent of the reservoir trap; well logs for formation tops and bottoms, formation porosity, water saturation and possible permeable strata; core analysis for porosity and saturation data; and DST (Drill-Stem Test) for possible oil production rates and for samples for PVT (Pressure-Volume-Temperature) analysis to obtain the FVF (Formation Volume Factor), among others. The question is how certain the values of these variables are and what the probability is of these values occurring in the reservoir, so that the possible risks can be evaluated. One of the most valuable applications of risk assessment is the estimation of the volumetric reserves of hydrocarbon reservoirs. The Monte Carlo and moment techniques consider the entire ranges of the variables in the Original Oil in Place (OOIP) formula rather than deterministic figures. In the present work, predictions were made about how the statistical distributions and descriptive statistics of porosity, thickness, area, water saturation, recovery factor, and oil formation volume factor affect the simulated OOIP values. The current work presents the case of two different oil fields in Turkey. It was found that both techniques produce similar results at the 95% probability level. The difference between the estimated values increases as the probability level decreases from 50% to 5%.
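A minimal Monte Carlo sketch of the volumetric OOIP calculation is shown below; it samples the inputs of OOIP = 7758 · A · h · φ · (1 − Sw) / Bo and reads off percentile estimates. The distributions and ranges are illustrative placeholders, not data from the two Turkish fields studied.

```python
import numpy as np

def ooip_mc(n=100_000, seed=5):
    """Monte Carlo sample of Original Oil In Place (stock-tank barrels) from the
    volumetric equation OOIP = 7758 * A * h * phi * (1 - Sw) / Bo.
    All distributions and ranges below are illustrative, not field data."""
    rng = np.random.default_rng(seed)
    area = rng.triangular(800, 1200, 1800, n)            # acres
    thickness = rng.triangular(10, 20, 35, n)            # ft
    porosity = rng.normal(0.18, 0.02, n).clip(0.05, 0.35)
    sw = rng.normal(0.30, 0.05, n).clip(0.05, 0.80)      # water saturation
    bo = rng.triangular(1.10, 1.20, 1.35, n)             # rb/stb
    return 7758.0 * area * thickness * porosity * (1.0 - sw) / bo

ooip = ooip_mc()
# P90 = value exceeded with 90% probability, i.e. the 10th percentile of the samples.
p90, p50, p10 = np.percentile(ooip, [10, 50, 90])
print(f"P90 {p90:.3e}  P50 {p50:.3e}  P10 {p10:.3e} STB")
```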
|
328 |
Assessment Of Low Temperature Geothermal Resources / Arkan, Serkan, 01 January 2003
One of the most applicable methods of low-temperature geothermal resource assessment is the volumetric method. When applying the volumetric method, values must be assigned to uncertain parameters. @RISK, an add-in program to Microsoft EXCEL, is used as a tool to define the uncertainties of the parameters in the volumetric equation. In this study, Monte Carlo simulation is used as the probabilistic approach for the assessment of the low-temperature Balçova-Narlidere geothermal field.
Although the Balçova-Narlidere geothermal field is being utilized for several direct heat applications, only limited data exist for resource assessment calculations. Assessment studies using triangular and uniform distribution functions for each parameter gave mean recoverable heat energy values for the field of 25.1 MWt and 27.6 MWt, respectively. At the optimistic (90%) level, the corresponding values were 43.6 MWt and 54.3 MWt. These figures assume a project life of 25 years and a load factor of 50%.
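A minimal sketch of the volumetric (stored-heat) Monte Carlo calculation is shown below, run once with triangular and once with uniform input distributions as in the study; all parameter ranges, the heat capacity, and the reference temperature are illustrative assumptions rather than Balçova-Narlidere data.

```python
import numpy as np

def recoverable_heat_mw(dist, n=100_000, seed=6, life_years=25, load_factor=0.5):
    """Volumetric (stored-heat) Monte Carlo: Q = A * h * c_v * (T_res - T_ref) * R,
    converted to a mean thermal power over the project life.  Parameter ranges
    are illustrative only."""
    rng = np.random.default_rng(seed)
    if dist == "triangular":
        area = rng.triangular(2e6, 4e6, 7e6, n)           # reservoir area, m^2
        thickness = rng.triangular(100, 200, 350, n)      # reservoir thickness, m
        t_res = rng.triangular(80, 100, 130, n)           # reservoir temperature, deg C
        rec_factor = rng.triangular(0.05, 0.10, 0.20, n)  # recovery factor
    else:                                                 # uniform
        area = rng.uniform(2e6, 7e6, n)
        thickness = rng.uniform(100, 350, n)
        t_res = rng.uniform(80, 130, n)
        rec_factor = rng.uniform(0.05, 0.20, n)
    c_v = 2.7e6                                           # volumetric heat capacity, J/(m^3 K)
    t_ref = 35.0                                          # reference (rejection) temperature, deg C
    heat_j = area * thickness * c_v * (t_res - t_ref) * rec_factor
    seconds = life_years * 365.25 * 24 * 3600 * load_factor
    return heat_j / seconds / 1e6                         # thermal power, MWt

for dist in ("triangular", "uniform"):
    q = recoverable_heat_mw(dist)
    print(dist, f"mean {q.mean():.1f} MWt, optimistic (90th percentile) {np.percentile(q, 90):.1f} MWt")
```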
|
329 |
Farm level economics of winter wheat production in the Canadian Prairies / Yang, Danyi, 11 1900
This research project estimated economic costs and benefits of winter wheat production in the Canadian Prairies at a farm level. A combination of Net Present Value analysis and Monte Carlo simulation was used to build cash flow farm models by province and soil zone. The objective of this study was to examine the economic feasibility of winter wheat production on the Prairies. Results show that Prairie farmers will benefit from growing winter wheat if crop research further improves cold tolerance, yield, or quality of winter wheat. Incorporating winter wheat into crop rotations has potential to increase farmers’ wealth in the Canadian Prairies. / Agricultural and Resource Economics
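A minimal sketch of the combined Net Present Value and Monte Carlo approach is given below: stochastic yields and prices generate annual cash flows that are discounted to an NPV distribution. All yields, prices, costs, and the discount rate are hypothetical placeholders, not the provincial or soil-zone budgets used in the thesis.

```python
import numpy as np

def npv_distribution(n=10_000, seed=7, years=10, rate=0.06, acres=1000):
    """Monte Carlo NPV of a winter-wheat enterprise: stochastic yield and price,
    deterministic per-acre cost.  All figures are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    yields = rng.normal(55, 10, size=(n, years)).clip(min=0)    # bu/acre
    prices = rng.lognormal(np.log(6.0), 0.2, size=(n, years))   # $/bu
    cost_per_acre = 250.0                                       # $/acre
    cash_flows = acres * (yields * prices - cost_per_acre)      # (n, years)
    discount = (1 + rate) ** -np.arange(1, years + 1)
    return cash_flows @ discount                                # NPV per simulation

npv = npv_distribution()
print(f"mean NPV ${npv.mean():,.0f}, probability of loss {np.mean(npv < 0):.1%}")
```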
|
330 |
Radiation Dosimetry of Irregularly Shaped Objects / Griffin, Jonathan Alexander, January 2006
Electron beam therapy planning and custom electron bolus design were identified as areas in which improvements in equipment and techniques could lead to significant improvements in treatment delivery and patient outcomes. The electron pencil beam algorithms used in conventional Treatment Planning Systems do not accurately model the dose distribution in irregularly shaped objects, near oblique surfaces or in inhomogeneous media. For this reason, at Christchurch Oncology Centre the TPS is not relied on for planning electron beam treatments. This project is an initial study of ways to improve the design of custom electron bolus, the planning of electron beam therapy, and other radiation therapy simulation tasks, by developing a system for the accurate assessment of dose distributions under irregular contours in clinically relevant situations. A shaped water phantom system and a diode array have been developed and tested. The design and construction of this water phantom dosimetry system are described, and its capabilities and limitations discussed. An EGS/BEAM Monte Carlo simulation system has been installed, and models of the Christchurch Oncology Centre linacs in 6MeV and 9MeV electron beam modes have been built and commissioned. A test was run comparing the EGS/BEAM Monte Carlo system and the CMS Xio conventional treatment planning system with the experimental measurement technique using the water phantom and the diode array. This test was successful as a proof of the concept of the experimental technique. At the conclusion of this project, the main limitation of the diode array system was the lack of data processing software. The array produces a large volume of raw data, but not enough processed data was produced during this project to match the spatial resolution of the computer models. An automated data processing system will be needed for clinical use of the array. It has been confirmed that Monte Carlo and pencil-beam algorithms predict significantly different dose distributions for an irregularly shaped object irradiated with megavoltage electron beams. The results from the diode array were consistent with the theoretical models. This project was an initial investigation. At the time of writing, the diode array and the water phantom systems were still at an early stage of development. The work reported here was performed to build, test and commission the equipment. Additional work will be needed to produce an instrument for clinical use. Research into electron beam therapy could be continued, or the equipment used to expand research into new areas.
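As an indication of the kind of automated data processing identified as missing, the sketch below converts raw diode readings to dose with per-diode calibration factors and compares the result point by point with a predicted profile; all arrays are synthetic and the calibration scheme is an assumption, not the system built in this project.

```python
import numpy as np

# Synthetic stand-ins for the diode array readout and a model prediction.
rng = np.random.default_rng(8)
n_diodes = 32
positions_mm = np.linspace(-40, 40, n_diodes)                 # diode positions across the profile
calibration = rng.normal(1.0, 0.02, n_diodes)                 # per-diode sensitivity factors (assumed)
true_dose = np.exp(-(positions_mm / 30.0) ** 4)               # synthetic relative-dose profile
raw_counts = true_dose / calibration + rng.normal(0, 0.01, n_diodes)

measured = raw_counts * calibration                           # calibrated diode dose
predicted = np.exp(-(positions_mm / 29.0) ** 4)               # e.g. a Monte Carlo or pencil-beam profile

pct_diff = 100.0 * (measured - predicted) / predicted.max()   # difference as % of maximum dose
print(f"max |difference|: {np.abs(pct_diff).max():.1f}% of Dmax")
```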
|