341

Gonihedric 3D Ising models

Malmini, Ranasinghe P. K. C. January 1997 (has links)
No description available.
342

Applying MCMC methods to multi-level models

Browne, William J. January 1998 (has links)
No description available.
343

A feasibility study of exploration for deep seated ore bodies in the Skellefte field

Malmqvist, Kerstin January 1979 (has links)
Simulation techniques have been used for a feasibility study of a deep exploration project for massive sulphides in an old mining district, the Skellefte field, and the outcome under very different conditions has been studied. Under the specific conditions of the well-known Skellefte field it even proved possible to calibrate the mathematical model. It is found that when the geology is not known in detail, an outcome of the order of 50 tons per metre of drillhole is to be expected under a simple drilling strategy. When some knowledge of the general structures down to around 1 000 m is established, the outcome can be improved by a factor of 2 through optimization of the depth of investigation; the optimal depth of investigation is of the order of 500 m. On the other hand, when a minimum ore value is introduced as a function of depth, the expected outcome again decreases by a factor of about 3. It must be underlined that these results are average values in a mathematical model and say nothing about the outcome of a single exploration case. However, an analysis of exploration campaigns of the order of 40 drillholes to a depth of 1 000 m shows that at least one deep-seated large body was found in 25% of the campaigns. Faced with the decision of whether or not to enter a deep exploration phase, this technique can highlight the problem and give an estimate of the order of costs and benefits. / digitalisering@umu
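The campaign-level statistic quoted above (at least one deep-seated body in roughly 25% of 40-hole campaigns) can be reproduced with a very simple Monte Carlo sketch. The function name and the per-hole hit probability below are hypothetical; the probability is chosen only so that about a quarter of 40-hole campaigns succeed, and none of it is taken from the thesis's model:

```python
import random

def simulate_campaigns(n_campaigns=10_000, holes=40, p_hit=0.0072, seed=1):
    """Estimate the fraction of exploration campaigns (each drilling
    `holes` independent drillholes) that find at least one deep-seated
    body, given a hypothetical per-hole hit probability `p_hit`.
    p_hit = 0.0072 makes 1 - (1 - p_hit)**40 come out near 0.25."""
    rng = random.Random(seed)
    successes = sum(
        any(rng.random() < p_hit for _ in range(holes))
        for _ in range(n_campaigns)
    )
    return successes / n_campaigns
```

With these illustrative numbers the simulated success rate lands near the 25% figure cited in the abstract, which is the kind of cost/benefit headline such a feasibility model is meant to produce.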
344

Monte Carlo integration in discrete undirected probabilistic models

Hamze, Firas 05 1900 (has links)
This thesis contains the author’s work in and contributions to the field of Monte Carlo sampling for undirected graphical models, a class of statistical model commonly used in machine learning, computer vision, and spatial statistics; the aim is to be able to use the methodology and resultant samples to estimate integrals of functions of the variables in the model. Over the course of the study, three different but related methods were proposed and have appeared as research papers. The thesis consists of an introductory chapter discussing the models considered, the problems involved, and a general outline of Monte Carlo methods. The three subsequent chapters contain versions of the published work. The second chapter, which has appeared in (Hamze and de Freitas 2004), is a presentation of new MCMC algorithms for computing the posterior distributions and expectations of the unknown variables in undirected graphical models with regular structure. For demonstration purposes, we focus on Markov Random Fields (MRFs). By partitioning the MRFs into non-overlapping trees, it is possible to compute the posterior distribution of a particular tree exactly by conditioning on the remaining tree. These exact solutions allow us to construct efficient blocked and Rao-Blackwellised MCMC algorithms. We show empirically that tree sampling is considerably more efficient than other partitioned sampling schemes and the naive Gibbs sampler, even in cases where loopy belief propagation fails to converge. We prove that tree sampling exhibits lower variance than the naive Gibbs sampler and other naive partitioning schemes using the theoretical measure of maximal correlation. We also construct new information theory tools for comparing different MCMC schemes and show that, under these, tree sampling is more efficient. Although the work discussed in Chapter 2 exhibited promise on the class of graphs to which it was suited, there are many cases where limiting the topology is quite a handicap. 
The work in Chapter 3 was an exploration of an alternative methodology for approximating functions of variables representable as undirected graphical models of arbitrary connectivity with pairwise potentials, as well as for estimating the notoriously difficult partition function of the graph. The algorithm, published in (Hamze and de Freitas 2005), fits into the framework of sequential Monte Carlo methods rather than the more widely used MCMC, and relies on constructing a sequence of intermediate distributions which get closer to the desired one. While the idea of using “tempered” proposals is known, we construct a novel sequence of target distributions where, rather than dropping a global temperature parameter, we sequentially couple individual pairs of variables that are, initially, sampled exactly from a spanning tree of the variables. We present experimental results on inference and estimation of the partition function for sparse and densely-connected graphs. The final contribution of this thesis, presented in Chapter 4 and also in (Hamze and de Freitas 2007), emerged from some empirical observations that were made while trying to optimize the sequence of edges to add to a graph so as to guide the population of samples to the high-probability regions of the model. Most important among these observations was that while several heuristic approaches, discussed in Chapter 1, certainly yielded improvements over edge sequences consisting of random choices, strategies based on forcing the particles to take large, biased random walks in the state-space resulted in a more efficient exploration, particularly at low temperatures. This motivated a new Monte Carlo approach to treating complex discrete distributions. The algorithm is motivated by the N-Fold Way, which is an ingenious event-driven MCMC sampler that avoids rejection moves at any specific state. The N-Fold Way can however get “trapped” in cycles.
We surmount this problem by modifying the sampling process to result in biased state-space paths of randomly chosen length. This alteration does introduce bias, but the bias is subsequently corrected with a carefully engineered importance sampler.
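As a rough illustration of partitioned sampling on a regular undirected model, here is a two-colour blocked Gibbs sampler for a small Ising MRF. Sites of one checkerboard colour are conditionally independent given the other, so each half-sweep updates a whole block from its exact conditional; this is a simpler analogue of the tree-partitioned samplers described in the thesis, not the authors' algorithm. The lattice size, coupling `beta`, and sweep count are illustrative choices:

```python
import math
import random

def blocked_gibbs_ising(n=8, beta=0.4, sweeps=200, seed=0):
    """Two-colour blocked Gibbs sampler for an n x n Ising MRF with
    periodic boundaries. Returns the final configuration of +/-1 spins.
    Given one checkerboard colour class, the other class is
    conditionally independent site-by-site, so each half-sweep is an
    exact block update."""
    rng = random.Random(seed)
    s = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps):
        for colour in (0, 1):
            for i in range(n):
                for j in range(n):
                    if (i + j) % 2 != colour:
                        continue
                    # Local field from the four (fixed) neighbours.
                    field = sum(s[(i + di) % n][(j + dj) % n]
                                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                    # Exact conditional: P(s_ij = +1 | neighbours).
                    p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
                    s[i][j] = 1 if rng.random() < p_up else -1
    return s
```

The appeal of block updates, as in the tree-sampling work above, is that conditioning makes part of the model exactly solvable, reducing the correlation between successive samples compared with a naive single-site Gibbs sweep.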
345

Simulation of the transmitted dose in an EPID using a Monte Carlo method.

Pham, Thuc M. January 2009 (has links)
The BEAMnrc and DOSXYZnrc codes from the EGSnrc Monte Carlo (MC) system are considered the gold standard for simulating radiotherapy linear accelerators and the resulting dose depositions (Rogers, Faddegon et al. 1995). The aim of this project was to set up the EGSnrc system for simulation of the linear accelerator (linac) head and a Scanning Liquid Ionisation Chamber (SLIC) Electronic Portal Imaging Device (EPID) for calculation of the transmitted dose in the EPID. The project was divided into two parts. The head of a 6 MV Varian 600C/D photon linac was first simulated by BEAMnrc. The modelling parameters, such as the electron beam energy and the Full Width at Half Maximum (FWHM) of the electron spatial distribution, were adjusted until the absorbed dose profiles and Percentage Depth Dose (PDD) curves generally agreed with the measured profiles and PDDs to better than 2%. The X-ray beam obtained from the modelled linac head was used for the simulation of the transmitted dose in the EPID in the second part of the project. The EPID was simulated by DOSXYZnrc based on information from Spezi and Lewis (2002), who also modelled the Varian SLIC EPID (MK2 Portal Vision system, Varian Inc., Palo Alto, CA, USA). Comparisons between the measured and simulated transmitted doses were carried out for three phantom setups: an open field, a homogeneous water-equivalent phantom, and a humanoid phantom (RANDO). These setups were designed so that the accuracy of the MC method for simulating absorbed dose in air and in homogeneous and inhomogeneous phantoms could be assessed. In addition, the simulated transmitted dose in the EPID was compared with values obtained from the Pinnacle treatment planning system (v6.2b, Phillips Medical Systems). In the process of selecting the electron beam energy and FWHM, it was confirmed (Sheikh-Bagheri and Rogers 2002; Keall, Siebers et al. 2003) that variation of the electron beam FWHM and energy strongly influenced the beam profiles, while the PDD was influenced less strongly by the electron beam energy. An increase in energy led to an increase in the depth of maximum dose, although the effect could not be observed until an energy change of 0.2 MeV was made. Based on the analysis of the results, it was found that a FWHM of 1.3 mm and an energy of 5.7 MeV provided the best match between the measured and MC-simulated beam profiles and PDDs. It can be concluded that an accuracy of 1.5% can be achieved in the simulation of the linac head using the Monte Carlo method. In the comparison between the Monte Carlo and measured transmitted dose maps, agreement within 2% was found for both the open field and the homogeneous water-equivalent phantom setups. The same agreement was found between the Monte Carlo and Pinnacle transmitted dose maps for these setups. In the setup where the humanoid phantom RANDO was introduced between the radiation field and the EPID, a general agreement of about 5% was found between the Monte Carlo and measured transmitted dose maps. The Pinnacle and measured transmitted dose maps were also compared for this setup, and the same agreement was found. / http://proxy.library.adelaide.edu.au/login?url= http://library.adelaide.edu.au/cgi-bin/Pwebrecon.cgi?BBID=1352973 / Thesis (M.Sc.) - University of Adelaide, School of Chemistry and Physics, 2009
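The idea behind a transmitted-dose calculation can be illustrated in heavily simplified form by tracking mono-energetic photons through a homogeneous slab with exponentially distributed free paths and no scatter. The attenuation coefficient and slab thickness below are illustrative values, not the EGSnrc beam or EPID model used in the thesis:

```python
import math
import random

def transmitted_fraction(mu=0.07, thickness=20.0, n=50_000, seed=2):
    """Toy attenuation-only Monte Carlo: each photon is absorbed after an
    exponentially distributed free path with mean 1/mu (cm); a photon is
    transmitted if its free path exceeds the slab thickness (cm).
    Returns the transmitted fraction."""
    rng = random.Random(seed)
    transmitted = sum(
        1 for _ in range(n)
        # Inverse-CDF sampling of an Exp(mu) free path.
        if -math.log(1.0 - rng.random()) / mu > thickness
    )
    return transmitted / n
```

For this attenuation-only model the transmitted fraction should approach exp(-mu * thickness), which gives a quick analytic sanity check on the sampler; a full simulation like the one in the thesis must additionally model the beam spectrum, scatter, and the detector itself.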
346

Free energy techniques for the computer simulation of surface tension with applications to curved surfaces /

Moody, Michael. Unknown Date (has links)
Free energy techniques provide the basis for an analysis of aspects of the liquid-vapour interface undertaken in this study. The main focus of this work is an extensive theoretical investigation into properties of the surface tension, including curvature dependence and supersaturation effects, using Monte Carlo computer simulation techniques. / Thesis (PhD)--University of South Australia, 2002.
347

Monte Carlo calculated organ doses from computed tomography examinations using a newly constructed paediatric voxel tomographic computational model /

Caon, Martin. Unknown Date (has links)
Thesis (PhD)--University of South Australia, 1999
348

New developments in the construction of lattice rules: applications of lattice rules to high-dimensional integration problems from mathematical finance.

Waterhouse, Benjamin James, School of Mathematics, UNSW January 2007 (has links)
There are many problems in mathematical finance which require the evaluation of a multivariate integral. Since these problems typically involve the discretisation of a continuous random variable, the dimension of the integrand can be in the thousands, tens of thousands or even more. For such problems the Monte Carlo method has been a powerful and popular technique. This is largely related to the fact that the performance of the method is independent of the number of dimensions. Traditional quasi-Monte Carlo techniques are typically not independent of the dimension and as such have not been suitable for high-dimensional problems. However, recent work has developed new types of quasi-Monte Carlo point sets which can be used in practically limitless dimension. Among these types of point sets are Sobol' sequences, Faure sequences, Niederreiter-Xing sequences, digital nets and lattice rules. In this thesis, we will concentrate on results concerning lattice rules. The typical setting for analysis of these new quasi-Monte Carlo point sets is the worst-case error in a weighted function space. There has been much work on constructing point sets with small worst-case errors in the weighted Korobov and Sobolev spaces. However, many of the integrands which arise in the area of mathematical finance do not lie in either of these spaces. One common problem is that the integrands are unbounded on the boundaries of the unit cube. In this thesis we construct function spaces which admit such integrands and present algorithms to construct lattice rules where the worst-case error in this new function space is small. Lattice rules differ from other quasi-Monte Carlo techniques in that the points cannot be used sequentially. That is, the entire lattice is needed to keep the worst-case error small. It has been shown that there exist generating vectors for lattice rules which are good for many different numbers of points.
This is a desirable property for a practitioner, as it allows them to keep increasing the number of points until some error criterion is met. In this thesis, we will develop fast algorithms to construct such generating vectors. Finally, we apply a similar technique to show how a particular type of generating vector known as the Korobov form can be made extensible in dimension.
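An equal-weight rank-1 lattice rule is short enough to sketch directly. The generating vector below is the classical two-dimensional Fibonacci choice, used purely as an illustration; it is not one of the component-by-component or extensible constructions developed in the thesis:

```python
import math

def lattice_rule(f, z, n):
    """Approximate the integral of f over [0,1]^d with the rank-1
    lattice rule: the equal-weight average of f at the n points
    {(i * z / n) mod 1 : i = 0, ..., n-1}, where z is the generating
    vector."""
    return sum(
        f([((i * zj) % n) / n for zj in z]) for i in range(n)
    ) / n

# Fibonacci lattice in d = 2: n = 89 points, z = (1, 55).
# The integrand is a trigonometric polynomial with integral exactly 1.
f = lambda x: (1 + math.sin(2 * math.pi * x[0])) * (1 + math.sin(2 * math.pi * x[1]))
approx = lattice_rule(f, (1, 55), 89)
```

For this integrand the rule is exact up to rounding: the only nonzero Fourier frequencies involved are (1, 0), (0, 1), and (1, ±55), and none of them lies in the dual lattice since neither 54 nor 56 is a multiple of 89. Note also why extensibility matters here: unlike a Sobol' sequence, the n points above form one indivisible set, so being able to grow n without discarding the vector z is exactly the practitioner-friendly property discussed in the abstract.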
349

Radial distribution functions for an hydrogenous plasma in equilibrium / by A. A. Barker.

Barker, A. A. (Anthony Alfred) January 1968 (has links)
Includes 4 reprints by the author / [159] leaves : ill. ; 30 cm. / Title page, contents and abstract only. The complete thesis in print form is available from the University Library. / Radial distribution functions gab(r) for a dense hydrogenous plasma in equilibrium near the ionization temperature are obtained by two methods / Thesis (Ph.D.)--University of Adelaide, Dept. of Mathematical Physics, 1968
350

Monte Carlo Investigation into Superficial Cancer Treatments of the Head and Neck

Currie, Bryn Edward January 2007 (has links)
This thesis presents the findings of an investigation into the Monte Carlo simulation of superficial cancer treatments of the head and neck region. The EGSnrc system of codes is used for Monte Carlo simulation of the transport of electrons and photons through a phantom representing either a water phantom or a treatment site in a patient. Two clinical treatment units are simulated using the BEAMnrc system of codes: the Varian Medical Systems Clinac® 2100C accelerator for 6 MeV electron fields and the Pantak Therapax SXT 150 X-ray unit for 80 kV and 100 kV photon fields. Depth dose, profile and isodose curves are compared against those measured with a PTW MP3 water phantom, with good agreement being achieved. Quantitative dose distributions are determined for both MeV electron and kV photon fields with treatment sites containing high atomic number materials, rapidly sloping surfaces and interfaces between different densities. This highlights the relatively high level of dose deposition at tissue-bone and tissue-cartilage interfaces in the kV photon fields. From these dose distributions, DVHs and dose comparators are used to assess the simulated treatment fields.
