About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Stochastic models for compliance analysis and applications

Sun, Junfeng, January 2005 (has links)
Thesis (Ph. D.)--Ohio State University, 2005. / Available online via OhioLINK's ETD Center; full-text release delayed at the author's request until May 26, 2006.

A discrete, stochastic model and correction method for bacterial source tracking

Leach, Mark Daniel, January 2007 (has links) (PDF)
Thesis (M.S. in electrical engineering)--Washington State University, May 2007. / Includes bibliographical references (p. 17-18).

A framework for stochastic modelling and optimisation of chemical engineering processes

Abubakar, Usman January 2014 (has links)
Uncertainties in chemical process performance behaviour continue to cause considerable concern to engineers and other stakeholders. The traditional deterministic uncertainty modelling methods lead to excessive overdesign, which is expensive, and have also been shown to give limited insight into the behaviour of complex chemical engineering systems. The present work develops a new framework, termed the “Stochastic Process Performance Modelling Framework (SPPMF)”, which combines traditional deterministic process simulation, response surface modelling techniques and advanced structural reliability analysis methods to facilitate efficient performance modelling and optimisation of chemical process systems under uncertainty. Cross-application of structural reliability principles to chemical processes presents some challenges; however, means of addressing such issues are proposed and discussed in this thesis. For instance, to facilitate Process Reliability Analysis (PRA), stochastic constraints have been added to the conventional process optimisation formulation. Both the first-order reliability method and Monte Carlo simulation are then applied to obtain a wide range of performance measures. In addition, to allow for automated response surface generation, an interface linking process simulators and a new stochastic module has been developed, making it possible to obtain samples on the order of thousands, typically in minutes. A number of Structural Reliability Analysis (SRA) concepts have been re-defined to reflect the unique characteristics of chemical processes. For example, while SRA is mainly concerned with the effects of random forces and mechanical properties on structural performance, PRA focuses on random process conditions (e.g. changes in pH, reaction rates, etc.) and their effects on both product quantity and quality. Finally, SPPMF has been successfully applied to model stochastic properties of a range of typical process systems.
The results show that the new framework can be efficiently implemented in process engineering with significant benefits over the traditional methods. Limitations of SPPMF and directions for future work are also highlighted. This thesis contains commercially confidential information which should not be divulged to any third party without the written consent of the author.
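The Monte Carlo side of the reliability analysis described in this abstract can be illustrated with a small sketch. This is not the SPPMF implementation; the temperature constraint, its Gaussian distribution, and all numbers are hypothetical, chosen only to show how a process-reliability failure probability is estimated by sampling a limit-state function.

```python
import random

def failure_probability(limit_state, sample, n=100_000, seed=0):
    """Crude Monte Carlo estimate of P(limit_state(x) < 0).

    In reliability analysis the limit-state function g(x) is negative
    when the system fails; `sample` draws one random input.
    """
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) < 0)
    return failures / n

# Hypothetical process constraint: reactor temperature must stay below 400 K,
# with temperature modelled as Gaussian about its design point.
def sample_temperature(rng):
    return rng.gauss(380.0, 10.0)

def g(temp):
    return 400.0 - temp  # negative => constraint violated

pf = failure_probability(g, sample_temperature, n=200_000)
# For N(380, 10^2), the exact tail probability P(T > 400) is about 0.023.
```

In practice a first-order reliability method would approximate this tail probability analytically at the design point, with Monte Carlo used to validate it, which matches the pairing of methods the abstract describes.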

Application of stochastic models to radiation chemistry

Pimblott, Simon M. January 1988 (has links)
This thesis addresses one area of major interest in reaction kinetics, the theoretical description of recombination in nonhomogeneous systems. The reaction of the highly reactive particles formed by the passage of ionising radiation through a medium is an important example of this type of system. Stochastic effects are apparent in every stage of the development of a radiolysis system: in the interaction between the radiation and the medium and in the diffusion and reaction processes that follow. Conventional models for nonhomogeneous kinetics in radiation chemistry are based upon a deterministic analysis. These models are appraised together with an alternative stochastic approach. Three stochastic methods are discussed: a full Monte Carlo simulation of the diffusion and reaction and two approximate models based upon an independent pairs approximation. It is important that any kinetic model accurately describes the system it purports to represent, and this can only be assured by extensive validation. The stochastic models are developed to encompass the diffusion-controlled reactions of ions and radicals and to include the effects of a bulk electrolyte upon the reactions between ions. To model a realistic radiation chemistry reaction scheme, it is necessary to consider reactions that are not fully diffusion-controlled. The radiation boundary condition is introduced and used to extend stochastic modelling to partially diffusion-controlled reactions. Recombination in an anisotropic environment is also considered. Although a complete analysis of the chemistry of a radiolysis system requires a complex reaction scheme, the scheme can be simplified, in acid and in alkali, by the use of an instantaneous scavenging approximation. In acid, this approximation produces a simple three-reaction mechanism based upon five species: H, OH, H<sub>2</sub>, H<sub>2</sub>O and H<sub>2</sub>O<sub>2</sub>.
The acid system is used to demonstrate the stochastic treatment of realistic kinetics. The instantaneous scavenging approximation is examined in detail and techniques are developed for the explicit modelling of reactions with a homogeneously distributed scavenger. A stochastic treatment of nonhomogeneous reaction kinetics requires a description of the initial spatial distribution of the reacting particles. A rudimentary Monte Carlo simulation is used to determine a simple distribution of clusters of reactive particles similar to that found along the path of a high energy electron in water. This distribution provides a suitable basis for kinetic simulation. The kinetics of a more detailed idealised track structure are also considered and the stochastic and deterministic kinetics of extended particle distributions are discussed.
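The full Monte Carlo treatment of diffusion and reaction described above can be caricatured, for a single geminate pair, by a Gaussian random walk with an absorbing reaction radius. This is an illustrative sketch, not the thesis's simulation: the reaction radius, initial separation, step size, and escape cutoff are arbitrary choices, and a real simulation would track many particles and a full reaction scheme.

```python
import random

def recombination_probability(a=1.0, r0=2.0, r_escape=6.0, sigma=0.15,
                              trials=1000, seed=0):
    """Monte Carlo estimate of a geminate recombination probability.

    The pair is represented by its relative position, which performs a
    Gaussian random walk in 3-D (per-axis step std `sigma`). The pair
    reacts if the separation falls below the reaction radius `a`
    (a fully diffusion-controlled encounter) and is counted as escaped
    once the separation exceeds `r_escape`.
    """
    rng = random.Random(seed)
    recombined = 0
    for _ in range(trials):
        x, y, z = r0, 0.0, 0.0  # start at separation r0
        while True:
            x += rng.gauss(0.0, sigma)
            y += rng.gauss(0.0, sigma)
            z += rng.gauss(0.0, sigma)
            r2 = x * x + y * y + z * z
            if r2 < a * a:
                recombined += 1
                break
            if r2 > r_escape * r_escape:
                break
    return recombined / trials

p = recombination_probability()
# In the continuum (small-step) limit the reaction probability is
# (1/r0 - 1/R) / (1/a - 1/R) = 0.4 for these radii; the discrete
# walk overshoots the boundary slightly and so sits a little above it.
```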

Approximation to random process by wavelet basis.

Li, Zheng. January 2008 (has links)
Thesis (Ph.D.)--Brown University, 2008. / Vita. Advisor: Donald McClure. Includes bibliographical references (leaves 87-90).

The use of stochastic models of infectious disease transmission for public health: schistosomiasis japonica

Ning, Yao (宁耀). January 2010 (has links)
Published or final version / Community Medicine / Master of Philosophy

Stochastic programming in revenue management

Chen, Lijian, January 2006 (has links)
Thesis (Ph. D.)--Ohio State University, 2006. / Title from first page of PDF file. Includes bibliographical references (p. 97-101).

An exploration of stochastic models

Gross, Joshua January 1900 (has links)
Master of Science / Department of Mathematics / Nathan Albin / The term stochastic is defined as having a random probability distribution or pattern that may be analyzed statistically but may not be predicted precisely. A stochastic model attempts to estimate outcomes while allowing a random variation in one or more inputs over time. These models are used across a number of fields, from gene expression in biology to stock, asset, and insurance analysis in finance. In this thesis, we will build up the basic probability theory required to make an "optimal estimate", as well as construct the stochastic integral. This information will then allow us to introduce stochastic differential equations, along with our overall model. We will conclude with the "optimal estimator", the Kalman filter, along with an example of its application.
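A minimal sketch of the scalar (1-D) Kalman filter the abstract concludes with, assuming a constant true state observed through Gaussian noise; the noise variances and measurements here are invented for illustration, not taken from the thesis.

```python
import random

def kalman_1d(measurements, r, q=0.0, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a constant (or slowly drifting) state.

    r: measurement noise variance, q: process noise variance,
    x0/p0: prior mean and variance. Returns the filtered estimates.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                # predict: state is (nearly) constant
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)      # correct with the measurement residual
        p *= (1.0 - k)        # updated estimate variance
        estimates.append(x)
    return estimates

rng = random.Random(1)
true_value = 5.0
zs = [true_value + rng.gauss(0.0, 1.0) for _ in range(500)]
est = kalman_1d(zs, r=1.0)
# With q = 0 the filter reduces to a running mean (shrunk toward the
# prior x0), so the final estimate converges on the true value 5.0.
```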

Stochastic models for asset and liability modelling in South Africa or elsewhere

Maitland, Alexander James 16 September 2011 (has links)
Ph.D., Faculty of Science, University of the Witwatersrand, 2011. / Research in the area of stochastic models for actuarial use in South Africa is limited to relatively few publications. Until recently, there has been little focus on actuarial stochastic models that describe the empirical stochastic behaviour of South African financial and economic variables. A notable exception is Thomson’s (1996) proposed methodology and model. This thesis presents a collection of five papers that were presented at conferences or submitted for peer review in the South African Actuarial Journal between 1996 and 2006. References to subsequent publications in the field are also provided. Such research has implications for medium- and long-term financial simulations, capital adequacy, resilience reserving and asset allocation benchmarks, as well as for the immunization of short-term interest rate risk, for investment policy determination and the general quantification and management of risk pertaining to those assets and liabilities. This thesis reviews Thomson’s model and methodology from both a statistical and economic perspective, and identifies various problems and limitations in that approach. New stochastic models for actuarial use in South Africa are proposed that improve the asset and liability modelling process and risk quantification. In particular, a new Multiple Markov-Switching (MMS) model framework is presented for modelling South African assets and liabilities, together with an optimal immunization framework for nominal liability cash flows. The MMS model is a descriptive model with structural features and parameter estimates based on historical data. However, it also incorporates theoretical aspects in its design, thereby providing a balance between purely theoretical models and those based only on empirical considerations.

Distributionally Robust Performance Analysis: Data, Dependence and Extremes

He, Fei January 2018 (has links)
This dissertation focuses on distributionally robust performance analysis, which is an area of applied probability whose aim is to quantify the impact of model errors. Stochastic models are built to describe phenomena of interest with the intent of gaining insights or making informed decisions. Typically, however, the fidelity of these models (i.e. how closely they describe the underlying reality) may be compromised due to either the lack of information available or tractability considerations. The goal of distributionally robust performance analysis is then to quantify, and potentially mitigate, the impact of errors or model misspecifications. As such, distributionally robust performance analysis affects virtually any area in which stochastic modelling is used for analysis or decision making. This dissertation studies various aspects of distributionally robust performance analysis. For example, we are concerned with quantifying the impact of model error in tail estimation using extreme value theory. We are also concerned with the impact of the dependence structure in risk analysis when marginal distributions of risk factors are known. In addition, we are also interested in connections recently found to machine learning and other statistical estimators which are based on distributionally robust optimization. The first problem that we consider consists of studying the impact of model specification in the context of extreme quantiles and tail probabilities. There is a rich statistical theory that allows one to extrapolate tail behavior based on limited information. This body of theory is known as extreme value theory and it has been successfully applied to a wide range of settings, including building physical infrastructure to withstand extreme environmental events and also guiding the capital requirements of insurance companies to ensure their financial solvency.
Not surprisingly, attempting to extrapolate out into the tail of a distribution from limited observations requires imposing assumptions which are impossible to verify. The assumptions imposed in extreme value theory imply that a parametric family of models (known as generalized extreme value distributions) can be used to perform tail estimation. Because such assumptions are so difficult (or impossible) to verify, we use distributionally robust optimization to enhance extreme value statistical analysis. Our approach results in a procedure which can be easily applied in conjunction with standard extreme value analysis, and we show that our estimators enjoy correct coverage even in settings in which the assumptions imposed by extreme value theory fail to hold. In addition to extreme value estimation, which is associated with risk analysis via extreme events, another feature which often plays a role in risk analysis is the impact of the dependence structure among risk factors. In the second chapter we study the question of evaluating the worst-case expected cost involving two sources of uncertainty, each of them with a specific marginal probability distribution. The worst-case expectation is optimized over all joint probability distributions which are consistent with the marginal distributions specified for each source of uncertainty. Our formulation thus allows us to capture the impact of the dependence structure of the risk factors. This formulation is equivalent to the so-called Monge-Kantorovich problem studied in optimal transport theory, whose theoretical properties have been studied substantially in the literature. However, rates of convergence of computational algorithms for this problem have been studied only recently.
We show that if one of the random variables takes finitely many values, a direct Monte Carlo approach allows one to evaluate such a worst-case expectation at an $O(n^{-1/2})$ convergence rate as the number of Monte Carlo samples, $n$, increases to infinity. Next, we continue our investigation of worst-case expectations in the context of multiple risk factors, not only two of them, assuming that their marginal probability distributions are fixed. This problem does not fit the mold of standard optimal transport (or Monge-Kantorovich) problems. We consider, however, cost functions which are separable in the sense of being a sum of functions which depend on adjacent pairs of risk factors (think of the factors indexed by time). In this setting, we are able to reduce the problem to the study of several separate Monge-Kantorovich problems. Moreover, we explain how we can even include martingale constraints, which are often natural to consider in settings such as financial applications. While in the previous chapters we focused on the impact of tail modeling or dependence, in the later parts of the dissertation we take a broader view by studying decisions which are made based on empirical observations. So, we focus on so-called distributionally robust optimization formulations. We use optimal transport theory to model the degree of distributional uncertainty or model misspecification. Distributionally robust optimization based on optimal transport has been a very active research topic in recent years; our contribution is to study how to specify the optimal transport metric in a data-driven way. We explain our procedure in the context of classification, which is of substantial importance in machine learning applications.
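The $O(n^{-1/2})$ Monte Carlo rate invoked above is easy to see empirically on a toy integrand. This sketch is illustrative only and unrelated to the dissertation's worst-case formulation: it estimates a plain expectation, and checks that quadrupling the sample size roughly halves the root-mean-square error.

```python
import random

def mc_mean(n, rng):
    """Plain Monte Carlo estimate of E[U^2], U ~ Uniform(0,1); true value 1/3."""
    return sum(rng.random() ** 2 for _ in range(n)) / n

def rms_error(n, reps=400, seed=0):
    """RMS error of the n-sample estimator over `reps` independent replications."""
    rng = random.Random(seed)
    sq_errs = [(mc_mean(n, rng) - 1.0 / 3.0) ** 2 for _ in range(reps)]
    return (sum(sq_errs) / reps) ** 0.5

e1 = rms_error(1_000)
e2 = rms_error(4_000)
ratio = e1 / e2
# The O(n^{-1/2}) rate predicts a ratio near sqrt(4) = 2.
```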
