401

A second order Runge–Kutta method for the Gatheral model

Auffredic, Jérémy January 2020 (has links)
In this thesis, our research focuses on a weak second-order stochastic Runge–Kutta method applied to a system of stochastic differential equations known as the Gatheral model. We approximate numerical solutions to this system and investigate the rate of convergence of our method. Both call and put options are priced using Monte Carlo simulation to investigate the order of convergence. The numerical results show that our method is consistent with the theoretical order of convergence of the Monte Carlo simulation. However, in terms of the Runge–Kutta method itself, we cannot confirm consistency with the theoretical order of convergence without further research.
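As a simplified illustration of the Monte Carlo pricing step described in this abstract, the sketch below prices a European call under plain geometric Brownian motion with an Euler–Maruyama scheme. It is a stand-in under stated assumptions, not the thesis's weak second-order Runge–Kutta method or the Gatheral model, and all parameter values are illustrative.

```python
import numpy as np

def mc_call_price(s0, strike, r, sigma, T, n_steps, n_paths, seed=0):
    """Monte Carlo price of a European call under GBM via Euler-Maruyama."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, s0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        s += r * s * dt + sigma * s * dw          # Euler-Maruyama step
    payoff = np.maximum(s - strike, 0.0)
    return np.exp(-r * T) * payoff.mean()         # discounted mean payoff

price = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0, 100, 200_000)
```

Halving the time step and re-running exposes the weak convergence rate the thesis studies for its higher-order scheme.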
402

Product Deletion and Supply Chain Management

Zhu, Qingyun 19 April 2019 (has links)
One of the most significant changes in the evolution of modern business management is that organizations no longer compete as individual entities in the market, but as interlocking supply chains. Markets are no longer simply trading desks but dynamic ecosystems where people, organizations, and the environment interact. Products and associated materials and resources are links that bridge supply chains from upstream (sourcing and manufacturing) to downstream (delivering and consuming). The lifecycle of a product plays a critical role in supply chains. Supply chains may be composed of, designed around, and modified for products. Product-related issues greatly impact supply chains. Existing studies have advanced the product management and product lifecycle management literature through the dimensions of product innovation, product growth, product line extensions, product efficiencies, and product acquisition. Research on product deletion, rationalization, or reduction is limited but critical for many reasons. Sustainability is an important reason for this managerial decision. This study, grounded in multiple literature streams in both the marketing and supply chain fields, identified relations and propositions to form a firm-level analysis of the role of supply chains in organizational product deletion decisions. Interviews, observational data, and archival data from international companies (in Australia, China, India, and Iran) contributed empirical support as case studies through a grounded theory approach. Bayesian analysis, an underused empirical analysis tool, was utilized to provide insights into this underdeveloped research stream; its relationship to qualitative research enhances broader methodological understanding. A Gibbs sampler and reversible-jump Markov chain Monte Carlo (MCMC) simulation were used for the Bayesian analysis of the collected data. The integrative findings are exploratory but provide insights for a number of research propositions.
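To make the MCMC machinery mentioned above concrete, here is a toy Gibbs sampler for a bivariate normal distribution with correlation rho, alternating draws from the two full conditionals. This is a minimal illustration of Gibbs sampling only, not the dissertation's product-deletion model; the target distribution and all values are assumptions.

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=20_000, burn=2_000, seed=0):
    """Gibbs sampling for a standard bivariate normal with correlation rho."""
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    sd = np.sqrt(1.0 - rho**2)        # std dev of each full conditional
    samples = []
    for i in range(n_iter):
        x = rng.normal(rho * y, sd)   # draw x | y ~ N(rho*y, 1 - rho^2)
        y = rng.normal(rho * x, sd)   # draw y | x ~ N(rho*x, 1 - rho^2)
        if i >= burn:                 # discard burn-in iterations
            samples.append((x, y))
    return np.array(samples)

draws = gibbs_bivariate_normal(rho=0.8)
emp_rho = np.corrcoef(draws.T)[0, 1]  # should recover rho
```

The empirical correlation of the retained draws converging to the target's correlation is the basic diagnostic that the sampler explores the right distribution.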
403

Option Pricing Under the Markov-switching Framework Defined by Three States

Castoe, Minna, Raspudic, Teo January 2020 (has links)
An exact solution for the valuation of European-style options can be obtained using the Black-Scholes model. However, some of the model's assumptions, such as the constant volatility of the stock price, are inconsistent with real markets. In this thesis, the Black-Scholes model is extended to a model where the volatility is fully stochastic and changes over time, modelled by a Markov chain with three states: high, medium, and low. Under this model, we price options of both European and American types using Monte Carlo simulation.
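The sketch below shows the kind of regime-switching simulation this abstract describes: each path's volatility follows a three-state Markov chain (low/medium/high), and a European call is priced by Monte Carlo. The volatility levels and transition matrix are illustrative assumptions, not the calibrated values from the thesis.

```python
import numpy as np

def mc_regime_switching_call(s0, strike, r, T, n_steps, n_paths, seed=0):
    """European call price when volatility follows a 3-state Markov chain."""
    rng = np.random.default_rng(seed)
    vols = np.array([0.1, 0.2, 0.4])          # low, medium, high volatility
    P = np.array([[0.98, 0.01, 0.01],         # per-step transition matrix
                  [0.01, 0.98, 0.01],
                  [0.01, 0.01, 0.98]])
    dt = T / n_steps
    s = np.full(n_paths, s0, dtype=float)
    state = rng.integers(0, 3, n_paths)       # random initial regime
    for _ in range(n_steps):
        sigma = vols[state]
        z = rng.normal(0.0, 1.0, n_paths)
        # exact GBM step within the current regime
        s *= np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
        # advance each path's regime by one Markov-chain step
        u = rng.random(n_paths)
        cum = P[state].cumsum(axis=1)
        state = (u[:, None] > cum).sum(axis=1)
    payoff = np.maximum(s - strike, 0.0)
    return np.exp(-r * T) * payoff.mean()

price = mc_regime_switching_call(100.0, 100.0, 0.05, 1.0, 100, 100_000)
```

With slow switching, the price lands between the Black-Scholes values at the individual volatility levels, which is a useful sanity check.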
404

3D Scintillation Positioning Method in a Breast-specific Gamma Camera

Wang, Beien January 2015 (has links)
In modern clinical practice, the gamma camera is one of the most important imaging modalities for tumour diagnosis. The standard technique uses scintillator-based gamma cameras equipped with a parallel-hole collimator to detect the planar position of γ-photon interaction (scintillation). However, the positioning is of insufficient resolution and linearity for breast imaging. With the aim of improving spatial resolution and positioning linearity, a new gamma camera configuration was described specifically for breast imaging. This breast-specific gamma camera was designed to have the following technical features: a variable-angle slant-hole collimator; double SiPM array readout at the front and back sides of the scintillator; and diffusive reflectors at the edges around the scintillator. Because a slant-hole collimator was used, a new 3D scintillation positioning method was introduced and tested. The setup of the gamma detector was created in a Monte Carlo simulation toolkit, and a library of light distributions from known positions was acquired through optical simulation. Two library-based positioning algorithms, similarity comparison and maximum likelihood, were developed to estimate the 3D scintillation position by comparing the responses from simulated gamma interactions with the responses from the library. Results indicated that the planar spatial resolution and positioning linearity estimated with this gamma detector setup and positioning algorithm were higher than those of conventional gamma detectors. The depth-of-interaction estimation was also of high linearity and resolution. Based on the results presented, the gamma detector setup and positioning method are promising for future breast cancer diagnosis.
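As a hedged sketch of the library-based positioning idea above, the snippet below builds a library of simulated 1-D light distributions over a small sensor array and estimates a scintillation position by similarity comparison (smallest sum-of-squares difference against the library). The Gaussian light-spread model, sensor geometry, and position grid are illustrative assumptions, not the detector model from the thesis.

```python
import numpy as np

def make_library(grid, sensor_x, width=2.0):
    """One normalized light distribution per known scintillation position."""
    lib = np.exp(-((sensor_x[None, :] - grid[:, None]) ** 2)
                 / (2.0 * width**2))
    return lib / lib.sum(axis=1, keepdims=True)

def estimate_position(response, grid, library):
    """Similarity comparison: library entry with smallest SSE wins."""
    response = response / response.sum()
    scores = ((library - response[None, :]) ** 2).sum(axis=1)
    return grid[np.argmin(scores)]

sensor_x = np.arange(8, dtype=float)      # 8 sensor pixels along one axis
grid = np.linspace(0.0, 7.0, 71)          # candidate positions, 0.1 spacing
library = make_library(grid, sensor_x)

true_pos = 3.5
measured = make_library(np.array([true_pos]), sensor_x)[0]
est = estimate_position(measured, grid, library)
```

A maximum-likelihood variant would replace the sum-of-squares score with a Poisson log-likelihood over the sensor counts; the library lookup structure stays the same.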
405

Evaluating Model Fit for Longitudinal Measurement Invariance with Ordered Categorical Indicators

Clark, Jonathan Caleb 08 December 2020 (has links)
Current recommended cutoffs for determining measurement invariance have typically been derived from simulation studies focused on multigroup confirmatory factor analysis, often using continuous data. These cutoffs may be inappropriate for ordered categorical data in a longitudinal setting. This study conducts two Monte Carlo studies that evaluate the performance of four popular model fit indices used to determine measurement invariance. The comparative fit index (CFI), Tucker-Lewis index (TLI), and root mean square error of approximation (RMSEA) were all found to be inconsistent across the simulation conditions as well as the invariance tests, and thus are not recommended for longitudinal measurement invariance testing. The standardized root mean square residual (SRMR) was the most consistent and robust fit index across simulation conditions, and we therefore recommend using ≥ 0.01 as a cutoff for determining longitudinal measurement invariance with ordered categorical indicators.
406

Risk management of groundwater drawdown in settlement-sensitive areas

Tisell, Victoria January 2013 (has links)
A new method for estimating risks in settlement calculations is presented. By quantifying the uncertainties of settlement calculations, it is possible to perform risk analysis and to compare the costs of risk-reducing efforts with the benefits these efforts would bring. The settlement estimations are done by combining uncertainties about soil data, groundwater drawdown, and settlement calculations. This master's thesis describes how the groundwater drawdown is estimated using a numerical model. The model reflects the groundwater decrease around a drainage well with respect to estimated groundwater recharge, which depends on the geology and precipitation. There are four parameters in the model connected to soil properties and precipitation: hydraulic conductivity for clay, hydraulic conductivity for till, hydraulic conductivity for sand, and mean annual net precipitation. Drawdown is estimated with a deterministic and a probabilistic model, where the probabilistic model uses stochastic parameter values in a Monte Carlo simulation. The risks concerning settlements are found when the groundwater model is integrated with a soil model and a settlement model. When integrated, the new model estimates risks related to all three separate models. Results of groundwater drawdown and ground settlement estimations are presented spatially in a sensitivity and risk analysis. By identifying and comparing the most influential parameters of the settlement, project decision makers will have an easier task deciding what further measures should be focused on.
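The probabilistic step described above can be sketched as follows: sample an uncertain soil parameter (here hydraulic conductivity, lognormally distributed) and propagate each sample through a drawdown formula, then read off a risk quantile. The steady-state Thiem well equation and every parameter value below are illustrative assumptions standing in for the thesis's calibrated numerical model.

```python
import numpy as np

def thiem_drawdown(Q, K, b, r_well, R):
    """Steady-state drawdown at a well (Thiem equation), confined aquifer."""
    T = K * b                                   # transmissivity, m^2/s
    return Q / (2.0 * np.pi * T) * np.log(R / r_well)

rng = np.random.default_rng(0)
Q = 0.01                                        # pumping rate, m^3/s
b = 20.0                                        # aquifer thickness, m
# stochastic hydraulic conductivity: lognormal around 1e-4 m/s
K = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=50_000)

dd = thiem_drawdown(Q, K, b, r_well=0.2, R=300.0)
p95 = np.quantile(dd, 0.95)   # 95th-percentile drawdown for risk analysis
```

The 95th-percentile drawdown, rather than the deterministic median, is the quantity a risk analysis would compare against settlement thresholds.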
407

Multilevel Monte Carlo Simulation for American Option Pricing

Colakovic, Sabina, Ågren, Viktor January 2021 (has links)
In this thesis, we center our research on the approximation of American put option prices with the multilevel Monte Carlo simulation approach. The focus lies on reducing the computational complexity of estimating an expected value arising from a stochastic differential equation. Numerical results show that the simulations are consistent with the theoretical order of convergence of Monte Carlo simulations. The approximations are accurate and considerably more computationally efficient than the standard Monte Carlo simulation method.
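A minimal multilevel Monte Carlo sketch, assuming a European call under geometric Brownian motion for simplicity (the thesis treats American puts): coarse and fine Euler paths are coupled by sharing Brownian increments, and the estimator telescopes the level corrections E[P_l − P_{l−1}]. All numerical parameters are illustrative.

```python
import numpy as np

def euler_pair(rng, s0, r, sigma, T, n_fine, n_paths):
    """Coupled fine (n_fine steps) and coarse (n_fine/2 steps) Euler paths."""
    dt = T / n_fine
    sf = np.full(n_paths, s0, dtype=float)
    sc = np.full(n_paths, s0, dtype=float)
    for _ in range(0, n_fine, 2):
        dw1 = rng.normal(0.0, np.sqrt(dt), n_paths)
        dw2 = rng.normal(0.0, np.sqrt(dt), n_paths)
        sf += r * sf * dt + sigma * sf * dw1          # two fine steps
        sf += r * sf * dt + sigma * sf * dw2
        sc += r * sc * (2 * dt) + sigma * sc * (dw1 + dw2)  # one coarse step
    return sf, sc

def mlmc_call(s0, K, r, sigma, T, levels=4, n_paths=40_000, seed=0):
    """Telescoping MLMC estimate of a discounted European call payoff."""
    rng = np.random.default_rng(seed)
    payoff = lambda s: np.maximum(s - K, 0.0)
    sf, _ = euler_pair(rng, s0, r, sigma, T, 2, n_paths)
    est = payoff(sf).mean()                    # level 0: 2-step estimator
    for l in range(1, levels + 1):             # corrections E[P_l - P_{l-1}]
        sf, sc = euler_pair(rng, s0, r, sigma, T, 2 ** (l + 1), n_paths)
        est += (payoff(sf) - payoff(sc)).mean()
    return np.exp(-r * T) * est

price = mlmc_call(100.0, 100.0, 0.05, 0.2, 1.0)
```

Because the coupled correction terms have rapidly shrinking variance, the finer levels need far fewer paths than a plain fine-grid Monte Carlo run, which is the source of the efficiency gain the abstract reports.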
408

Simulating Univariate and Multivariate Burr Type III and Type XII Distributions Through the Method of L-Moments

Pant, Mohan Dev 01 August 2011 (has links)
The Burr families (Type III and Type XII) of distributions are traditionally used in the context of statistical modeling and for simulating non-normal distributions with moment-based parameters (e.g., skew and kurtosis). In educational and psychological studies, the Burr families of distributions can be used to simulate extremely asymmetrical and heavy-tailed non-normal distributions. Conventional moment-based estimators (i.e., the mean, variance, skew, and kurtosis) are traditionally used to characterize the distribution of a random variable or in the context of fitting data. However, conventional moment-based estimators can (a) be substantially biased, (b) have high variance, or (c) be influenced by outliers. In view of these concerns, a characterization of the Burr Type III and Type XII distributions through the method of L-moments is introduced. Specifically, systems of equations are derived for determining the shape parameters associated with user-specified L-moment ratios (e.g., L-skew and L-kurtosis). A procedure is also developed for the purpose of generating non-normal Burr Type III and Type XII distributions with arbitrary L-correlation matrices. Numerical examples are provided to demonstrate that L-moment-based Burr distributions are superior to their conventional moment-based counterparts in the context of estimation, distribution fitting, and robustness to outliers. Monte Carlo simulation results are provided to demonstrate that L-moment-based estimators are nearly unbiased, have relatively small variance, and are robust in the presence of outliers for any sample size. Simulation results are also provided to show that the methodology used for generating correlated non-normal Burr Type III and Type XII distributions is valid and efficient. Specifically, Monte Carlo simulation results are provided to show that the empirical values of L-correlations among simulated Burr Type III (and Type XII) distributions are in close agreement with the specified L-correlation matrices.
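To ground the L-moment machinery the dissertation builds on, the sketch below computes sample L-moments through the standard unbiased probability-weighted-moment estimators b_r, then forms the L-scale and the L-skew/L-kurtosis ratios. It illustrates L-moment estimation in general, not the dissertation's Burr-specific systems of equations; the uniform test sample is an assumption chosen because its L-moments are known (mean 1/2, L-scale 1/6, zero L-skew).

```python
import numpy as np

def l_moments(x):
    """First four sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = ((i - 1) / (n - 1) * x).mean()
    b2 = ((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x).mean()
    b3 = ((i - 1) * (i - 2) * (i - 3)
          / ((n - 1) * (n - 2) * (n - 3)) * x).mean()
    l1 = b0                            # mean
    l2 = 2 * b1 - b0                   # L-scale
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2    # mean, L-scale, L-skew, L-kurtosis

rng = np.random.default_rng(0)
l1, l2, t3, t4 = l_moments(rng.uniform(0.0, 1.0, 100_000))
```

Solving the dissertation's systems of equations amounts to inverting this mapping: given target L-skew and L-kurtosis, find the Burr shape parameters that reproduce them.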
409

Uncertainty Quantification of Photothermal Radiometry Measurements Using Monte Carlo Simulation and Experimental Repeatability

Fleming, Austin 01 May 2014 (has links)
Photothermal radiometry is a common thermal property measurement technique used to measure the properties of layered materials. It uses a modulated laser to heat a sample, and the thermal response can be used to determine the thermal properties of the sample's layers. The motivation for this work is to provide a better understanding of the accuracy and repeatability of the photothermal radiometry measurement technique. Through this work, the sensitivity of results to input uncertainties is determined. Additionally, using numerical simulations, the overall uncertainty of a theoretical measurement is determined. The repeatability of photothermal radiometry measurements is tested with the use of a proton-irradiated zirconium carbide sample. Due to the proton irradiation, this sample contains two layers with a thermal resistance between them. This sample has been independently measured by three different researchers in three different countries, and the results are compared to determine the repeatability of photothermal radiometry measurements. Finally, based on the sensitivity and uncertainty analysis, experimental procedures and suggestions are provided to reduce the uncertainty in experimentally measured results.
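The Monte Carlo uncertainty analysis described above can be sketched as follows: perturb each input by its stated uncertainty, push every sample through the measurement model, and read the spread of the output. The one-line thermal-diffusivity model and all uncertainty percentages below are placeholder assumptions, not the thesis's photothermal radiometry model or its measured values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Inputs sampled within assumed measurement uncertainties (illustrative):
k   = rng.normal(20.0,   0.5,  n)   # thermal conductivity, W/(m K), ~2.5%
rho = rng.normal(6500.0, 65.0, n)   # density, kg/m^3, ~1%
cp  = rng.normal(360.0,  7.2,  n)   # specific heat, J/(kg K), ~2%

# Propagate through the model: thermal diffusivity alpha = k / (rho * cp)
alpha = k / (rho * cp)

# Combined relative uncertainty of the derived quantity
rel_unc = alpha.std() / alpha.mean()
```

For this near-linear model the Monte Carlo result matches the root-sum-square of the input relative uncertainties, which is a quick check on the propagation.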
410

Input-output analysis on the economic impact of medical care in Japan / [Japanese title: Estimation of the economic ripple effects of medical care using input-output analysis]

Yamada, Go 23 March 2016 (has links)
Kyoto University / 0048 / New degree system, thesis doctorate / Doctor of Public Health (Social Health Medicine) / Otsu Degree No. 13005 / Thesis Doctorate (Social Health Medicine) No. 9 / Call number: 新制||社医||9 (University Library) / 32933 / Kyoto University Graduate School of Medicine, Department of Social Health Medicine / Examiners: Professor Koji Kawakami, Professor Keiji Tamaki, Professor Yasuhiko Konishi / Qualified under Article 4, Paragraph 2 of the Degree Regulations / Doctor of Public Health / Kyoto University / DFAM
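The core identity behind input-output (economic ripple-effect) estimates of the kind this thesis title describes is the Leontief model: total sectoral output x satisfying final demand d is x = (I − A)⁻¹ d, where A is the matrix of technical coefficients. The 3-sector matrix and demand vector below are illustrative assumptions, not data from the thesis.

```python
import numpy as np

# Technical coefficients: A[i, j] = input from sector i per unit
# of output of sector j (illustrative 3-sector economy).
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.08]])
d = np.array([100.0, 50.0, 80.0])        # final demand by sector

# Leontief solution: total output x with x = A x + d
x = np.linalg.solve(np.eye(3) - A, d)

# Aggregate output multiplier: total output per unit of final demand
multiplier = x.sum() / d.sum()
```

The multiplier exceeding one captures the ripple effect: each unit of final demand for medical care induces additional output in upstream sectors.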
