411 |
Smoothed Transformed Density Rejection / Leydold, Josef; Hörmann, Wolfgang. January 2003 (has links) (PDF)
There are situations in the framework of quasi-Monte Carlo integration where nonuniform low-discrepancy sequences are required. Using the inversion method for this task usually results in the best performance in terms of the integration errors. However, this method requires a fast algorithm for evaluating the inverse of the cumulative distribution function which is often not available. Then a smoothed version of transformed density rejection is a good alternative as it is a fast method and its speed hardly depends on the distribution. It can easily be adjusted such that it is almost as good as the inversion method. For importance sampling it is even better to use the hat distribution as importance distribution directly. Then the resulting algorithm is as good as using the inversion method for the original importance distribution but its generation time is much shorter. / Series: Preprint Series / Department of Applied Statistics and Data Processing
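For context, a minimal sketch of the inversion method the abstract benchmarks against: a low-discrepancy sequence is mapped through a closed-form inverse CDF. The exponential target here is an illustrative choice; the smoothed rejection method is motivated precisely by distributions whose inverse CDF has no fast closed form.

```python
import math

def van_der_corput(n, base=2):
    """First n points of the radical-inverse (van der Corput) low-discrepancy sequence."""
    seq = []
    for i in range(1, n + 1):
        q, denom, x = i, 1.0, 0.0
        while q > 0:
            q, r = divmod(q, base)
            denom *= base
            x += r / denom
        seq.append(x)
    return seq

def exp_inverse_cdf(u, rate=1.0):
    """Inverse CDF of Exp(rate): F^-1(u) = -ln(1 - u) / rate."""
    return -math.log1p(-u) / rate

# Quasi-Monte Carlo estimate of E[X] for X ~ Exp(1); the exact value is 1.
points = [exp_inverse_cdf(u) for u in van_der_corput(4096)]
estimate = sum(points) / len(points)
```

Because inversion maps each low-discrepancy point to exactly one sample, the nonuniform sequence inherits the input sequence's uniformity, which is why the abstract treats inversion as the quality benchmark.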
|
412 |
A study of turbulent flame propagation / McNutt, Dinah Georgianna. January 1982 (has links)
Thesis: M.S., Massachusetts Institute of Technology, Department of Mechanical Engineering, 1982 / Includes bibliographical references. / by Dinah Georgianna McNutt. / M.S. / M.S. Massachusetts Institute of Technology, Department of Mechanical Engineering
|
413 |
Applied inverse scattering / Mabuza, Boy Raymond. 11 1900 (links)
We are concerned with the quantum inverse scattering problem. The corresponding
Marchenko integral equation is solved by using the collocation method together with
piece-wise polynomials, namely, Hermite splines. The scarcity of experimental data
and the lack of phase information necessitate the generation of the input reflection coefficient by choosing a specific profile and then applying our method to reconstruct it.
Various aspects of the single- and coupled-channel inverse problems, and details of
the numerical techniques employed, are discussed.
We proceed to apply our approach to synthetic seismic reflection data. The transformation
of the classical one-dimensional wave equation for elastic displacement into a
Schrödinger-like equation is presented. As an application of our method, we consider
the synthetic reflection travel-time data for a layered substrate from which we recover
the seismic impedance of the medium. We also apply our approach to experimental
seismic reflection data collected from a deep-water location in the North Sea. The
reflectivity sequence and the relevant seismic wavelet are extracted from the seismic
reflection data by applying the statistical estimation procedure known as Markov Chain
Monte Carlo method to the problem of blind deconvolution. In order to implement the
Marchenko inversion method, the pure spike trains have been replaced by amplitudes
having a narrow bell-shaped form to facilitate the numerical solution of the Marchenko
integral equation from which the underlying seismic impedance profile of the medium
is obtained. / Physics / D.Phil.(Physics)
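The collocation idea can be sketched as follows, assuming the simplest piecewise basis (point collocation with trapezoidal quadrature rather than the Hermite splines of the thesis) and a toy one-term reflection kernel rather than reconstructed data. With F(t) = c·exp(-t), the kernel has the known separable form K(x, y) = A·exp(-y) at x = 0, which the numerical solution should reproduce.

```python
import numpy as np

def solve_marchenko_row(F, x, L=10.0, n=200):
    """Collocation solution, at fixed x, of the Marchenko equation
        K(x, y) + F(x + y) + int_x^L K(x, z) F(z + y) dz = 0,  y in [x, L],
    using point collocation on a uniform grid with trapezoidal quadrature."""
    y = np.linspace(x, L, n)
    w = np.full(n, y[1] - y[0])
    w[0] = w[-1] = w[0] / 2                      # trapezoidal end weights
    B = F(y[:, None] + y[None, :]) * w[None, :]  # B[j, i] = w_i * F(y_i + y_j)
    k = np.linalg.solve(np.eye(n) + B, -F(x + y))
    return y, k

# Toy reflection kernel F(t) = c * exp(-t): a one-term separable kernel of the
# kind produced by a single bound state (an illustrative assumption, not the
# thesis data). Analytically, K(0, y) = A * exp(-y) with A = -c / (1 + c/2).
F = lambda t: 0.5 * np.exp(-t)
y, k = solve_marchenko_row(F, x=0.0)
```

For c = 0.5 the analytic coefficient is A = -0.4, so k[0] should be close to -0.4 and the kernel should decay like exp(-y) toward the far end of the grid.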
|
414 |
Harmonization of internal quality tasks in analytical laboratories, case studies: water analysis methods using polarographic and voltammetric techniques / Gumede, Njabulo Joyfull. January 2008 (has links)
Dissertation submitted in partial compliance with the requirements of the Masters Degree in Technology: Chemistry, in the Faculty of Applied Sciences at the Durban University of Technology, 2008. / In this work, a holistic approach to validating analytical methods was assessed by means of Monte Carlo simulations. This approach involves a statement of the method's scope (i.e. analytes, matrices and concentration levels) and requisites (internal or external); selection of the method's fit-for-purpose features; pre-validation and validation of the intermediate accuracy and its assessment by means of Monte Carlo simulations; validation of the other method features; and a validity statement in terms of 'fit-for-purpose' decision making, harmonized validation-control-uncertainty statistics and short-term routine work, with the aim of proposing virtually 'ready-to-use' methods. The protocol could be transferred to other methods. The main aim is to harmonize the work done by research teams and routine laboratories, assuming that different aims, strategies and practical viewpoints exist. As a result, the recommended protocol should be seen as a starting point; definitive (harmonized) protocols must still be established by international normalisation/accreditation entities. The Quality Assurance (method verification and Internal Quality Control, IQC) limits, as well as the sample uncertainty, were estimated consistently with the validated accuracy statistics, i.e. E ± U(E) and RSDi + U(RSDi). Two case studies were used to assess Monte Carlo simulation as a tool for method validation in analytical laboratories: the first involves an indirect polarographic method for determining nitrate in waste water, and the second a direct determination of heavy metals in sea water by differential pulse anodic stripping voltammetry, as examples of the application of the protocol.
In this sense the uncertainty obtained could be used for decision-making purposes. It is tempting to use uncertainty as a commercial argument, and in this work it has been shown that the smaller the uncertainty, the better the measurement of the instrument or the laboratory's reputation.
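The kind of Monte Carlo assessment described above can be sketched roughly as follows; the bias, precision and limit multipliers are illustrative assumptions, not the thesis values:

```python
import random
import statistics

def simulate_iqc_limits(true_conc=10.0, bias=0.0, rsd_intermediate=0.03,
                        n_runs=20000, seed=1):
    """Monte Carlo assessment of intermediate precision: simulate repeated
    measurements of a control sample under an assumed bias and relative
    standard deviation, then derive warning (2s) and action (3s) limits
    for an IQC control chart. All parameter values are illustrative."""
    rng = random.Random(seed)
    results = [true_conc * (1 + bias) * (1 + rng.gauss(0.0, rsd_intermediate))
               for _ in range(n_runs)]
    mean = statistics.fmean(results)
    sd = statistics.stdev(results)
    return {"mean": mean,
            "warning": (mean - 2 * sd, mean + 2 * sd),
            "action": (mean - 3 * sd, mean + 3 * sd)}

limits = simulate_iqc_limits()
```

Running many simulated batches in this way lets the limits be checked against the validated accuracy statistics before the method reaches routine use.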
|
415 |
Model risk for barrier options when priced under different Lévy dynamics / Mbakwe, Chidinma. 12 1900 (has links)
Thesis (MSc)--Stellenbosch University, 2011. / ENGLISH ABSTRACT: Barrier options are options whose payoff depends on whether or not the underlying asset
price hits a certain level - the barrier - during the life of the option. Closed-form solutions
for the prices of these path-dependent options are available in the Black-Scholes
framework. It is well-known, however, that the Black-Scholes model does not price even
the so-called vanilla options correctly. There are a number of popular asset price models
based on exponential Lévy dynamics which are all able to capture the volatility smile, i.e.
reproduce market-observed prices of vanilla options.
This thesis investigates the potential model risk associated with the pricing of barrier
options in several exponential Lévy models. First, the Variance Gamma, Normal Inverse
Gaussian and CGMY models are calibrated to market-observed vanilla option prices. Barrier
option prices are then evaluated in these models using Monte Carlo methods. The
prices obtained are then compared to each other, as well as the Black-Scholes prices. It
is observed that the different exponential Lévy models yield barrier option prices which
are quite close to each other, though quite different from the Black-Scholes prices. This
suggests that the associated model risk is low. / AFRIKAANSE OPSOMMING: Versperring opsies is opsies met 'n afbetaling wat afhanklik is daarvan of die onderliggende
bateprys 'n bepaalde vlak - die versperring - bereik gedurende die lewe van die opsie,
of nie. Formules vir die pryse van sulke opsies is beskikbaar binne die Black-Scholes
raamwerk. Dit is egter welbekend dat die Black-Scholes model nie in staat is om selfs die
sogenaamde vanilla opsies se pryse korrek te bepaal nie. Daar bestaan 'n aantal populêre
bateprysmodelle gebaseer op eksponensiële Lévy-dinamika, wat almal in staat is om die
mark-waarneembare vanilla opsie pryse te herproduseer.
Hierdie tesis ondersoek die potensiële modelrisiko geassosieer met die prysbepaling van
versperring opsies in verskeie eksponseniële Lévy-modelle. Eers word die Variance
Gamma-, Normal Inverse Gaussian- en CGMY-modelle gekalibreer op mark-waarneembare
vanilla opsiepryse. Die pryse van versperring opsies in hierdie modelle word dan bepaal
deur middel van Monte Carlo metodes. Hierdie pryse word dan met mekaar vergelyk,
asook met die Black-Scholespryse. Dit word waargeneem dat die versperring opsiepryse in
die verskillende eksponensiële Lévymodelle redelik na aan mekaar is, maar redelik verskil
van die Black-Scholespryse. Dit suggereer dat die geassosieerde modelrisiko laag is.
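The pricing step described in the English abstract can be sketched as follows, assuming illustrative (uncalibrated) Variance Gamma parameters and a simple discretely monitored up-and-out call:

```python
import numpy as np

def barrier_price_vg(s0=100.0, k=100.0, barrier=130.0, r=0.05, t=1.0,
                     sigma=0.2, theta=-0.14, nu=0.2,
                     n_paths=10000, n_steps=252, seed=7):
    """Monte Carlo price of a discretely monitored up-and-out call under
    exponential Variance Gamma dynamics. Parameter values are illustrative,
    not calibrated to market quotes as in the thesis."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    # Martingale correction so the discounted asset is a martingale.
    omega = np.log(1 - theta * nu - 0.5 * sigma**2 * nu) / nu
    g = rng.gamma(dt / nu, nu, size=(n_paths, n_steps))   # Gamma subordinator increments
    dx = theta * g + sigma * np.sqrt(g) * rng.standard_normal((n_paths, n_steps))
    log_s = np.log(s0) + np.cumsum((r + omega) * dt + dx, axis=1)
    s = np.exp(log_s)
    alive = s.max(axis=1) < barrier        # up-and-out: knocked out if barrier touched
    payoff = np.where(alive, np.maximum(s[:, -1] - k, 0.0), 0.0)
    return float(np.exp(-r * t) * payoff.mean())

price = barrier_price_vg()
```

Repeating the same valuation with the other calibrated models (NIG, CGMY) and comparing the resulting prices is the model-risk comparison the thesis performs; pushing the barrier far away recovers the vanilla price as a sanity check.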
|
416 |
A coarse mesh transport method for photons and electrons in 3-DHayward, Robert M. 09 April 2013 (has links)
A hybrid stochastic-deterministic method, COMET-PE, is developed for dose calculation in radiotherapy. Fast, accurate dose calculation is a key component of successful radiotherapy treatment. To calculate dose, COMET-PE solves the coupled Boltzmann Transport Equations for photons and electrons. The method uses a deterministic iteration to compose response functions that are pre-computed using Monte Carlo. Thus, COMET-PE takes advantage of Monte Carlo physics without incurring the computational costs typically required for statistical convergence. This work extends the method to 3-D problems with realistic source distributions. Additionally, the performance of the deterministic solver is improved, taking advantage of both shared-memory and distributed-memory parallelism to enhance efficiency. To verify the method’s accuracy, it is compared with the DOSXYZnrc (Monte Carlo) method using three different benchmark problems: a heterogeneous slab phantom, a water phantom, and a CT-based lung phantom. For the slab phantom, all errors are less than 1.5% of the maximum dose or less than 3% of local dose. For both the water phantom and the lung phantom, over 97% of voxels receiving greater than 10% of the maximum dose pass a 2% (relative error) / 2 mm (distance-to-agreement) test. Timing comparisons show that COMET-PE is roughly 10-30 times faster than DOSXYZnrc. Thus, the new method provides a fast, accurate alternative to Monte Carlo for dose calculation in radiotherapy treatment planning.
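The 2% / 2 mm voxel test quoted above can be illustrated with a simplified one-dimensional gamma analysis (global dose criterion, brute-force distance search; the criteria and 10% cutoff match the abstract, while the dose profiles are synthetic):

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm=2.0, dose_crit=0.02, dist_mm=2.0,
                    low_dose_cutoff=0.10):
    """1-D gamma analysis: for each reference voxel above the low-dose
    cutoff, take the minimum combined dose-difference / distance metric
    over all evaluated positions; a voxel passes if that minimum <= 1."""
    ref = np.asarray(ref, float)
    evl = np.asarray(evl, float)
    d_max = ref.max()
    x = np.arange(ref.size) * spacing_mm
    gammas = []
    for i, d_ref in enumerate(ref):
        if d_ref < low_dose_cutoff * d_max:
            continue  # score only voxels above 10% of the maximum dose
        dose_term = ((evl - d_ref) / (dose_crit * d_max)) ** 2
        dist_term = ((x - x[i]) / dist_mm) ** 2
        gammas.append(np.sqrt(dose_term + dist_term).min())
    return float((np.array(gammas) <= 1.0).mean())

# Synthetic Gaussian depth-dose profile and a copy with a 1% dose offset.
ref = np.exp(-(((np.arange(50) - 25) / 10.0) ** 2))
evl = ref * 1.01
rate = gamma_pass_rate(ref, evl)
```

A uniform 1% dose offset stays well inside the 2% criterion, so every scored voxel passes; a gross 50% error fails voxels near the peak.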
|
417 |
Decision forests for computer Go feature learningVan Niekerk, Francois 04 1900 (has links)
Thesis (MSc)--Stellenbosch University, 2014. / ENGLISH ABSTRACT: In computer Go, moves are typically selected with the aid of a tree search
algorithm. Monte-Carlo tree search (MCTS) is currently the dominant algorithm
in computer Go. It has been shown that the inclusion of domain
knowledge in MCTS is able to vastly improve the strength of MCTS engines.
A successful approach to representing domain knowledge in computer Go
is the use of appropriately weighted tactical features and pattern features,
which are comprised of a number of hand-crafted heuristics and a collection
of patterns respectively. However, tactical features are hand-crafted specifically
for Go, and pattern features are Go-specific, making it unclear how
they can be easily transferred to other domains.
As such, this work proposes a new approach to representing domain
knowledge, decision tree features. These features evaluate a state-action
pair by descending a decision tree, with queries recursively partitioning the
state-action pair input space, and returning a weight corresponding to the
partition element represented by the resultant leaf node. In this work, decision
tree features are applied to computer Go, in order to determine their
feasibility in comparison to state-of-the-art use of tactical and pattern features.
In this application of decision tree features, each query in the decision
tree descent path refines information about the board position surrounding
a candidate move.
The results of this work showed that a feature instance with decision tree
features is a feasible alternative to the state-of-the-art use of tactical and
pattern features in computer Go, in terms of move prediction and playing
strength, even though computer Go is a relatively well-developed research
area. A move prediction rate of 35.9% was achieved with tactical and decision
tree features, and they showed comparable performance to the state of the
art when integrated into an MCTS engine with progressive widening.
We conclude that the decision tree feature approach shows potential as
a method for automatically extracting domain knowledge in new domains.
These features can be used to evaluate state-action pairs for guiding search-based
techniques, such as MCTS, or for action-prediction tasks. / AFRIKAANSE OPSOMMING: In rekenaar Go, word skuiwe gewoonlik geselekteer met behulp van ’n boomsoektogalgoritme.
Monte-Carlo boomsoektog (MCTS) is tans die dominante
algoritme in rekenaar Go. Dit is bekend dat die insluiting van gebiedskennis
in MCTS in staat is om die krag van MCTS enjins aansienlik te verbeter.
’n Suksesvolle benadering tot die voorstelling van gebiedskennis in rekenaar
Go is taktiek- en patroonkenmerke met geskikte gewigte. Hierdie behels ’n
aantal handgemaakte heuristieke en ’n versameling van patrone onderskeidelik.
Omdat taktiekkenmerke spesifiek vir Go met die hand gemaak is, en dat
patroonkenmerke Go-spesifiek is, is dit nie duidelik hoe hulle maklik oorgedra
kan word na ander velde toe nie.
Hierdie werk stel dus ’n nuwe verteenwoordiging van gebiedskennis voor,
naamlik besluitboomkenmerke. Hierdie kenmerke evalueer ’n toestand-aksie
paar deur rekursief die toevoerruimte van toestand-aksie pare te verdeel deur
middel van die keuses in die besluitboom, en dan die gewig terug te keer
wat ooreenstem met die verdelingselement wat die ooreenstemmende blaarnodus
verteenwoordig. In hierdie werk, is besluitboomkenmerke geëvalueer
op rekenaar Go, om hul lewensvatbaarheid in vergelyking met veldleidende
gebruik van taktiek- en patroonkenmerke te bepaal. In hierdie toepassing
van besluitboomkenmerke, verfyn elke navraag in die pad na onder van die
besluitboom inligting oor die posisie rondom ’n kandidaatskuif.
Die resultate van hierdie werk het getoon dat ’n kenmerkentiteit met
besluitboomkenmerke ’n haalbare alternatief is vir die veldleidende gebruik
van taktiek- en patroonkenmerke in rekenaar Go in terme van skuifvoorspelling
as ook speelkrag, ondanks die feit dat rekenaar Go ’n relatief goedontwikkelde
navorsingsgebied is. ’n Skuifvoorspellingskoers van 35.9% is
behaal met taktiek- en besluitboomkenmerke, en hulle het vergelykbaar met
veldleidende tegnieke presteer wanneer hulle in ’n MCTS enjin met progressiewe
uitbreiding geïntegreer is.
Ons lei af dat ons voorgestelde besluitboomkenmerke potensiaal toon as ’n
metode vir die outomaties onttrek van gebiedskennis in nuwe velde. Hierdie
eienskappe kan gebruik word om toestand-aksie pare te evalueer vir die leiding
van soektog-gebaseerde tegnieke, soos MCTS, of vir aksie-voorspelling.
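A rough sketch of how a decision tree feature evaluates a state-action pair, per the English abstract. The queries and weights below are invented for illustration; the thesis grows the tree and learns the weights from game records.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Node:
    """A decision tree feature node: internal nodes hold a query on the
    (state, action) pair; leaves hold a learned weight."""
    query: Optional[Callable] = None    # (state, action) -> bool
    yes: Optional["Node"] = None
    no: Optional["Node"] = None
    weight: float = 1.0                 # meaningful only at leaves

def evaluate(tree: Node, state, action) -> float:
    """Descend the tree: each query refines the partition of the
    state-action input space; the resulting leaf's weight is the score."""
    node = tree
    while node.query is not None:
        node = node.yes if node.query(state, action) else node.no
    return node.weight

# Hypothetical two-query tree for Go-like move scoring.
tree = Node(
    query=lambda s, a: a in s["atari_moves"],      # does the move answer an atari?
    yes=Node(weight=4.0),
    no=Node(
        query=lambda s, a: a in s["recent_area"],  # is the move near the last move?
        yes=Node(weight=2.0),
        no=Node(weight=0.5),
    ),
)

state = {"atari_moves": {(3, 4)}, "recent_area": {(5, 5), (5, 6)}}
```

Because the queries are opaque predicates on the state-action pair, nothing in the descent is Go-specific, which is the transferability argument the thesis makes.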
|
418 |
Monte Carlo simulation of direction sensitive antineutrino detectionBlanckenberg, J. P (Jacobus Petrus) 03 1900 (has links)
Thesis (MSc (Physics))--University of Stellenbosch, 2010. / ENGLISH ABSTRACT:
Neutrino and antineutrino detection is a fairly new field of experimental physics,
mostly due to the small interaction cross section of these particles. Most of
the detectors in use today are huge, consisting of kilotons of scintillator
material and large arrays of photomultiplier tubes. Direction-sensitive
antineutrino detection has, however, not been done (at the time of writing of
this thesis). In order to establish the feasibility of direction-sensitive antineutrino
detection, a Monte Carlo code, DSANDS, was written to simulate the
detection process. This code focuses on the transport of the neutron and the positron
(the reaction products of antineutrino capture on a proton) through scintillator media. The
results are then used to determine the original direction of the antineutrino,
in the same way that data from real detectors would be used, and are compared
with the known direction. Further investigation is carried out into the
amount of statistics required for accurate results in an experimental field where
detection events are rare. Results show very good directional sensitivity of the
detection method. / AFRIKAANSE OPSOMMING:
Neutrino en antineutrino meting is 'n relatief nuwe veld in eksperimentele fisika,
hoofsaaklik as gevolg van die klein interaksie deursnee van hierdie deeltjies. Die
meeste hedendaagse detektors is massiewe detektors met kilotonne sintilator
materiaal en groot aantalle fotovermenigvuldiger buise. Tans is rigting sensitiewe
antineutrino metings egter nog nie uit gevoer nie. 'n Monte Carlo kode,
DSANDS, is geskryf om die meet proses te simuleer en sodoende die uitvoerbaarheid
van rigting sensitiewe antineutrino metings vas te stel. Hierdie kode
fokus op die beweging van neutrone en positrone (die reaksie produkte) deur
die sintilator medium. Die resultate word dan gebruik om die oorspronklike
rigting van die antineutrino te bepaal, soos met data van regte detektors gedoen
sou word, en te vergelyk met die bekende oorspronklike rigting van die
antineutrino. Verder word daar ook gekyk na die hoeveelheid statistiek wat
nodig sal wees om akkurate resultate te kry in 'n veld waar metings baie skaars
is. Die resultate wys baie goeie rigting sensitiwiteit van die meet metode.
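The statistics question raised in the abstract (how many rare events are needed for an accurate direction) can be illustrated with a toy model. The forward-shift and smearing magnitudes are assumptions for illustration, not DSANDS output:

```python
import math
import random

def reconstruct_direction(n_events, true_dir=(1.0, 0.0, 0.0),
                          forward_shift_cm=1.7, smear_cm=5.0, seed=3):
    """Toy direction reconstruction for inverse beta decay: each event's
    neutron-capture vertex lies slightly forward (along the antineutrino
    direction) of the positron vertex, buried in much larger position
    smearing. Averaging many displacement vectors recovers the direction."""
    rng = random.Random(seed)
    sx = sy = sz = 0.0
    for _ in range(n_events):
        sx += true_dir[0] * forward_shift_cm + rng.gauss(0.0, smear_cm)
        sy += true_dir[1] * forward_shift_cm + rng.gauss(0.0, smear_cm)
        sz += true_dir[2] * forward_shift_cm + rng.gauss(0.0, smear_cm)
    norm = math.sqrt(sx * sx + sy * sy + sz * sz)
    return (sx / norm, sy / norm, sz / norm)

direction = reconstruct_direction(20000)
```

With only a handful of events the per-event smearing dominates and the estimate is essentially random; the angular error shrinks roughly as one over the square root of the event count, which is why accumulated statistics matter so much when detections are rare.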
|
419 |
The probability of occurrence and the intensity of tropical cyclones along the Southern African East coast / Rossouw, Cobus. 12 1900 (has links)
Thesis (MEng (Civil Engineering))--University of Stellenbosch, 1999. / 100 leaves, single printed pages, preliminary pages and numbered pages 1.1-9.1. Includes bibliography. List of figures, tables, symbols and acronyms. Scanned with an HP Scanjet 8250 Scanner to PDF format (OCR). / ENGLISH ABSTRACT:
A tropical cyclone is a non-frontal, synoptic scale, low-pressure system over tropical or subtropical
waters with organised convection and a definite cyclonic surface wind circulation. The system varies in size between a hundred and a few thousand kilometres in diameter with high
winds circulating around a central low pressure. The process of bringing the lower atmospheric
layers into thermodynamic equilibrium with the warm tropical waters adds energy to the
atmosphere and lowers the surface pressure. If favourable climatic conditions exist, this leads to
the formation of a warm core vortex, which can develop into a tropical cyclone. The occurrence
of tropical cyclones follows seasonal variations, the tropical cyclone season for the Southwest
Indian Ocean being between November and March. The occurrences peak along the Southern
African East Coast between Mid-January and Mid-February.
The data on the location and intensity of tropical cyclones along the Southeast Africa coastline
were obtained from the Joint Typhoon Warning Centre and span the period between 1848 and
1999. The available data before 1945 consist of tropical cyclone tracks that influenced
populated areas or were encountered by ships. It was assumed that a number of tropical
cyclones before 1945 were not recorded and therefore data collected before 1945 were
disregarded in the analysis. The development of radar in 1945 significantly improved the
detection of tropical cyclones. Some of the tropical cyclone tracks recorded between 1945 and
1956 contain information about the intensity of the tropical cyclone. Since the dawn of the
satellite age in the mid-1980s, the detection of tropical cyclones and intensity measurements
have improved vastly.
Monte Carlo simulation techniques were used to create long term data series based on the
available measured data. Statistical distributions were fitted for each characteristic describing
the tropical cyclone at its nearest position to the site under investigation.
Tropical cyclones frequently occur along the Southern African East Coast. The region where
more than one tropical cyclone per 100 years can be expected is bordered by latitudes 2.5°S to
32.5°S. The design parameters for structures in these regions should provide for the influence
that a tropical cyclone will have on the site. The occurrence rate and expected maximum
intensity of tropical cyclones with a 100-year return period vary with latitude along the Southern
African East Coast. The maximum number of tropical cyclones in a 100-year period occurs at
latitude 15°S with an expected number of tropical cyclones of 157.2 per 100 years. The
maximum expected tropical cyclone intensity in a 100-year period is 143.5 knots at latitude 17.5°S. / AFRIKAANSE OPSOMMING: Tropiese siklone is nie-frontale laagdrukstelsels wat hulle ontstaan het oor tropiese en subtropiese
oseane. 'n Stelsel bestaan uit 'n sentrale laagdrukstelsel met sirkulerende winde daar
om. 'n Sikloon se deursnee kan wissel van 'n honderd tot 'n paar duisend kilometer. 'n
Laagdrukstelsel ontstaan as gevolg van 'n termodinamiese wanbalans tussen die atmosfeer en
die warm oseaanwater in die trope. Indien die benodigde atmosferiese toestande heers kan die
laagdrukstelsel in 'n tropiese sikloon ontwikkel. In die Suidwestelike Indiese Oseaan vorm
tropiese siklone tussen November en Maart. Die meeste siklone kom hier voor vanaf middel
Januarie tot middel Februarie.
Data is verkry vanaf die "Joint Typhoon Warning Centre" vir die Suidwestelike Indiese Oseaan
en strek vanaf 1848 tot 1999. Die data voor 1945 verteenwoordig slegs die tropiese siklone wat
bewoonde areas of skeepsvaart beinvloed het. Daar is aangeneem dat 'n betekenisvolle getal
van die tropiese siklone voor 1945 nie gedokumenteer is nie en derhalwe is slegs data van
sikloon voorkomste na 1945 gebruik in die studie. Vanaf 1945 het die ontwikkeling van radar die
opsporing van siklone in onbewoonde areas moontlik gemaak. Die gebruik van weersatelliete
vanaf die middel 1980's het die kwaliteit van die data nog verder verbeter.
Monte Carlo simulasie tegnieke is gebruik om langtermyn data vir spesifieke posisies langs die
kus te genereer. Statistiese verdelings is gepas op die eienskappe wat die sikloon beskryf
wanneer dit die naaste posisie aan die terrein bereik. Die passing van die verdelings is gedoen
op die beskikbare historiese data. Die verdelings is dan gebruik om langtermyn data stelle te
skep vir die terrein.
Tropiese siklone kom gereeld in die Suidwestelike Indiese Oseaan voor en beinvloed die Suid-Afrikaanse
Ooskus. Meer as een tropiese sikloon kan elke 100 jaar verwag word in kusgebiede
tussen breedtegrade 2.5° S en 32.5° S. Die ontwerpe vir strukture in die gebied moet dus
voorsiening maak vir die invloed van tropiese siklone. Die voorkoms en intensiteit van tropiese
siklone varieer met breedtegraad langs die Suid-Afrikaanse Ooskus. Die meeste siklone word
verwag by breedtegraad 15°S met 'n gemiddelde van 157.2 siklone per 100 jaar. Die mees
intensiewe siklone kom voor by breedtegraad 17.5°S met 'n verwagte 1:100 jaar intensiteit van
143.5 knope.
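The simulation approach described in the English abstract can be sketched as follows. The Gumbel intensity model and its parameters are illustrative assumptions (the thesis fits distributions to the JTWC record); the occurrence rate is taken from the abstract's 157.2 cyclones per 100 years at latitude 15°S:

```python
import math
import random

def simulate_century_max_intensity(rate_per_year=1.572, gumbel_mu=60.0,
                                   gumbel_beta=15.0, n_sims=2000, seed=11):
    """Expected maximum cyclone intensity (knots) over a 100-year window:
    occurrences as a homogeneous Poisson process, per-cyclone intensities
    drawn from an assumed Gumbel distribution."""
    rng = random.Random(seed)
    maxima = []
    for _ in range(n_sims):
        # Count events in 100 years via exponential inter-arrival times.
        t, count = 0.0, 0
        while True:
            t += rng.expovariate(rate_per_year)
            if t > 100.0:
                break
            count += 1
        # Maximum of `count` Gumbel-distributed intensities (inverse CDF).
        best = 0.0
        for _ in range(count):
            u = rng.random()
            best = max(best, gumbel_mu - gumbel_beta * math.log(-math.log(u)))
        maxima.append(best)
    return sum(maxima) / len(maxima)

century_max = simulate_century_max_intensity()
```

Repeating this at each latitude, with distributions fitted to the local historical record, yields the occurrence and 100-year intensity profiles reported in the abstract.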
|
420 |
Application of the Monte Carlo code PENELOPE to the simulation of clinical and physical problems in radiotherapy oncology / Μακρής, Δημήτρης. 10 December 2013 (has links)
PENELOPE is today a widely used code in the field of the physics of photon and
electron interactions with matter. Its characteristics make it a Monte Carlo code
that is highly adaptable to radiation physics problems, and in particular to
problems of radiotherapy dosimetry and radiotherapy oncology. The aim of this
diploma thesis was first to get the code running, then to investigate its
characteristics and capabilities, and finally to use it to carry out a series
of simple simulations related to radiotherapy oncology. / Simulations using the Monte Carlo code PENELOPE in Radiation Oncology.
|