About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

New and hybrid methods for simulating biochemical systems

Greenfield, Daniel Leo, Computer Science & Engineering, Faculty of Engineering, UNSW January 2006 (has links)
It is a dream of systems biology to efficiently simulate an entire cell on a computer. The potential medical and biological applications of such a tool are immense, and so are the challenges to accomplish it. At the level of a cell, the number of reacting molecules is so low that stochastic effects can be crucial in deciding the system-level behaviour of the cell. Despite the recent development of many new and hybrid stochastic approaches, exact stochastic simulation algorithms are still needed, and are widely employed in most current biochemical simulation packages. Unfortunately, the performance of these algorithms scales badly with the number of reactions. It is shown that this is especially the case for hubs and scale-free networks. This is worrying because hubs are an important component of biochemical systems, and it is widely suspected that biochemical networks are scale-free. It is shown that the scalability issue in these algorithms is due to the high interdependency between reactions. A general method for arbitrarily reducing this interdependency is introduced, and it is shown how it can be used for many classes of simulation processes. This is applied to one of the fastest current algorithms, the Next Reaction Method. The resulting algorithm, the Reactant-Margin Method, is tested on a wide range of hub sizes and shown to be asymptotically faster than the current best algorithms. Hybrid versions of the Reactant-Margin Method and the Next Reaction Method are also compared on a real biological model, the Lambda-Phage virus, and the new algorithm is again shown to perform better. The problems inherent in the hybridization are also shown to be handled more exactly and efficiently in the Reactant-Margin framework than in the Next Reaction Method framework. Finally, a software tool called GeNIV is introduced. This GUI-based biochemical modelling and simulation tool embodies a mechanistic-representation philosophy. It implements the Reactant-Margin and Next Reaction hybrid algorithms, and has a simple representation system for gene-state occupancy and the subsequent biochemical reactions. It is also novel in that it translates the graphical model into Java code, which is compiled and executed for simulation.
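The scalability point above can be made concrete with a dependency graph, the structure the Next Reaction Method uses to decide which propensities to update after a reaction fires. The sketch below uses a hypothetical toy network (names and reactions invented, not taken from the thesis) to show how a hub species shared by many reactions forces almost every propensity to be recomputed whenever any one of them fires.

```python
# Minimal sketch of a reaction dependency graph: reaction j depends on reaction i
# if firing i changes a species that appears among j's reactants. A hub species
# ("E" below) consumed by many reactions makes every firing invalidate many
# propensities, which is the interdependency problem described above.
reactions = {
    # name: (reactants, products) -- hypothetical toy network with hub species "E"
    "r1": ({"E", "A"}, {"EA"}),
    "r2": ({"E", "B"}, {"EB"}),
    "r3": ({"E", "C"}, {"EC"}),
    "r4": ({"EA"}, {"E", "P"}),
}

def affected_species(reactants, products):
    """Species whose counts change when the reaction fires (net change != 0)."""
    return (reactants | products) - (reactants & products)

depends_on = {
    i: {j for j, (rj, _) in reactions.items()
        if affected_species(*reactions[i]) & rj}
    for i in reactions
}

for i, deps in depends_on.items():
    print(f"firing {i} invalidates propensities of: {sorted(deps)}")
```

Running this shows that firing any reaction touching the hub invalidates nearly all other propensities, so update cost grows with hub degree.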
2

The effect of calving season on economic risk and return in cow-calf operations in western Canada

Sirski, Tanis 24 August 2012 (has links)
Cow-calf producers in western Canada are faced with many decisions throughout the production cycle. The choice of calving time affects production rate, marketability of calves, income and expenses, and net revenue. The purpose of this study was to determine whether June calving could increase net revenues and be preferred over March calving across different risk-aversion levels in western Canada. Data for this study were taken from a study carried out by Iwaasa et al. (2009), who collected information from three sites: Brandon, MB; Lanigan, SK; and Swift Current, SK. Stochastic budgets and a simulation model were used to study the economic impact of calving time. In Brandon and Lanigan, June calving increased net income and was the dominant alternative across all levels of risk aversion; in Swift Current, June dominated at high risk-aversion levels.
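The kind of comparison described above can be sketched in a few lines: simulate net revenue per cow for each calving season and rank the alternatives by certainty equivalent across risk-aversion levels. All distributions and numbers below are invented for illustration and are not the study's data or its exact ranking procedure.

```python
import numpy as np

# Hypothetical stochastic-budget comparison: net revenue per cow for two
# calving seasons, ranked by certainty equivalent under CARA utility.
rng = np.random.default_rng(0)
n = 10_000
march = rng.normal(loc=120.0, scale=60.0, size=n)   # net revenue, $/cow (invented)
june  = rng.normal(loc=140.0, scale=45.0, size=n)   # net revenue, $/cow (invented)

def certainty_equivalent(x, r):
    """CE under CARA utility u(x) = -exp(-r x); r = 0 means risk neutrality."""
    if r == 0:
        return x.mean()
    return -np.log(np.mean(np.exp(-r * x))) / r

for r in [0.0, 0.005, 0.01, 0.02]:                  # low to high risk aversion
    ce_m, ce_j = certainty_equivalent(march, r), certainty_equivalent(june, r)
    print(f"r={r:.3f}: CE(March)={ce_m:7.2f}  CE(June)={ce_j:7.2f}  "
          f"preferred={'June' if ce_j > ce_m else 'March'}")
```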
3

A Gillespie-Type Algorithm for Particle Based Stochastic Model on Lattice

Liu, Weigang January 2019 (has links)
In this thesis, I propose a general stochastic simulation algorithm for particle-based lattice models using the concepts of Gillespie's stochastic simulation algorithm, which was originally designed for well-stirred systems. I describe the details of this method and analyze its complexity compared with the StochSim algorithm, another simulation algorithm originally proposed for stochastic lattice models. I compare the performance of both algorithms with application to two different examples: the May-Leonard model and the Ziff-Gulari-Barshad model. Comparison between the simulation results from both algorithms validates our claim that the new algorithm is comparable to StochSim in simulation accuracy. I also compare the efficiency of both algorithms using the CPU cost of each code and conclude that the new algorithm is as efficient as StochSim in most test cases, while performing even better in certain specific cases. / Computer simulation has been developing for almost a century. A stochastic lattice model, which follows the physics concept of a lattice, is a system in which individual entities live on a grid and exhibit random behaviors according to specific rules. It is mainly studied using computer simulations. The most widely used simulation method for stochastic lattice systems is the StochSim algorithm, which randomly picks an entity and then determines its behavior based on a set of specific random rules. Our goal is to develop new simulation methods that make it more convenient to simulate and analyze stochastic lattice systems. In this thesis I propose another type of simulation method for stochastic lattice models using entirely different concepts and procedures. I developed a simulation package, applied it to two different examples using both methods, and then conducted a series of numerical experiments to compare their performance. I conclude that the two methods are roughly equivalent and that the new method performs better than the old one in certain special cases.
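For context, the sketch below shows the classic well-stirred Gillespie direct method that such lattice algorithms generalize. It is a minimal illustration with an invented two-reaction system and invented rate constants, not the thesis's lattice code.

```python
import numpy as np

# Gillespie's direct method for a well-stirred system (the starting point the
# thesis adapts to lattices). Toy system: A + B -> C and C -> A + B.
rng = np.random.default_rng(1)
x = np.array([100, 80, 0])                  # counts of A, B, C (invented)
stoich = np.array([[-1, -1, +1],            # A + B -> C
                   [+1, +1, -1]])           # C -> A + B
c = np.array([0.005, 0.1])                  # stochastic rate constants (invented)

t, t_end = 0.0, 10.0
while t < t_end:
    a = np.array([c[0] * x[0] * x[1],       # propensity of A + B -> C
                  c[1] * x[2]])             # propensity of C -> A + B
    a0 = a.sum()
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)          # exponential time to next reaction
    j = rng.choice(len(a), p=a / a0)        # which reaction fires
    x += stoich[j]

print(f"t={t:.3f}, state A,B,C = {x}")
```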
4

Stochastic Modeling and Simulation of Reaction-Diffusion Biochemical Systems

Li, Fei 10 March 2016 (has links)
The Reaction-Diffusion Master Equation (RDME) framework, characterized by discretization of the spatial domain, is one of the most widely used methods for the stochastic simulation of reaction-diffusion systems. Discretization sizes for the RDME have to be chosen so that each discrete compartment is "well-stirred" while the computational cost remains affordable. An efficient discretization size based on the reaction-diffusion dynamics of each species is derived in this dissertation; usually, a species with a larger diffusion rate yields a larger discretization size. Partitioning with an efficient discretization size for each species, a multiple-grid discretization (MGD) method is proposed. MGD avoids unnecessary molecular jumping and achieves a great improvement in simulation efficiency. Moreover, reaction-diffusion systems whose reaction dynamics are modeled by highly nonlinear functions show large simulation errors when discretization sizes are too small in the RDME framework: the switch-like Hill function reduces to a simple bimolecular mass-action reaction when the discretization size falls below a critical value. Convergent Hill function dynamics in the RDME framework, which maintain the switch behavior of Hill functions under fine discretization, are proposed. Furthermore, the application of stochastic modeling and simulation techniques to the spatiotemporal regulatory network in Caulobacter crescentus is included. A stochastic model based on a Turing pattern is exploited to demonstrate the bipolarization of a scaffold protein, PopZ, during the Caulobacter cell cycle. In addition, stochastic simulation of the spatiotemporal histidine kinase switch model captures the increased variability of cycle time in cells depleted of the divJ gene. / Ph. D.
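A small sketch of the standard RDME ingredient behind species-dependent discretization: on a grid with compartment width h, diffusion with coefficient D becomes a first-order jump to each neighboring compartment at rate D/h². The diffusion coefficients below are invented for illustration; this is the textbook relation, not the dissertation's derivation of the efficient size.

```python
# In the RDME, diffusion turns into jump "reactions" between neighboring
# compartments: a molecule with diffusion coefficient D in a compartment of
# width h jumps to one adjacent compartment at rate d = D / h**2.
species = {"fast": 10.0, "slow": 0.1}       # hypothetical diffusion coefficients D

def jump_rate(D, h):
    """Per-molecule rate of jumping to one adjacent compartment."""
    return D / h**2

for name, D in species.items():
    for h in [0.1, 0.5, 1.0]:
        print(f"{name:4s} D={D:5.1f}  h={h:3.1f}  jump rate={jump_rate(D, h):8.1f}")

# A faster-diffusing species reaches the same jump rate at a larger h, which is
# the intuition behind giving each species its own discretization size.
```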
5

Modeling Spatial Variability of Field-Scale Solute Transport in the Vadose Zone

Zacharias, Sebastian 08 October 1999 (has links)
Spatial heterogeneity in the soil system has a profound influence on the flow of water and chemicals in the unsaturated zone. Incorporating intrinsic and extrinsic soil variability into root zone leaching models provides a better representation of pollutant distribution under natural field conditions. In this study, a stochastic framework (SF) was developed to represent spatial variability of soil properties in one-dimensional solute transport models, and implemented with two existing root zone leaching models, Opus and GLEAMS. The accuracy of soil water, bromide, and pesticide transport predictions from Opus-SF and GLEAMS-SF was evaluated using field-measured soil water content, bromide, and pesticide mass data from a 3.9-ha agricultural field in the Dougherty Plain of Georgia and a 0.05-ha field plot in the Nomini Creek watershed in Virginia. Results from the rate-based Opus-SF and the capacity-based GLEAMS-SF were compared to determine whether there were significant differences in their predictions. In the stochastic approach, the heterogeneous field is conceptualized as a collection of vertical, non-interacting soil columns differing in soil properties. The horizontal variations of soil hydraulic and retention properties in each horizon are treated as random functions of zero transverse spatial correlation length, after accounting for any spatial trends. The spatially variable parameters were generated using the Latin hypercube sampling method, and the stochastic simulation of the model was performed using Monte-Carlo techniques. Statistical tests indicated that Opus-SF and GLEAMS-SF did not predict the central tendency and distribution of depth-averaged soil water content and total pesticide mass observed in the field on most sampling dates, but their predictions were sufficiently accurate for most management-type applications. Soil hydraulic and retention properties derived from texture data at the Nomini Creek site substantially reduced the variability in soil water content predictions from both models, but had less impact on bromide and pesticide mass predictions. The mean values predicted by Opus-SF and GLEAMS-SF were similar, but not equal, to those predicted by the deterministic versions of the models. Soil water and solute transport predictions from Opus-SF and GLEAMS-SF were not substantially different from corresponding results from the traditional Monte-Carlo approach, although soil water predictions from the two modeling approaches were significantly different for the first 150 days of simulation. Comparison between results from Opus-SF and GLEAMS-SF showed that the distributions and medians of soil water content predicted by the two models were significantly different on most sampling dates. The distributions and medians of pesticide mass predicted by the two models were closer than those of soil water content, but were significantly different on more than half of the field sampling dates. The more functional GLEAMS-SF model simulated depth-averaged soil water content in the root zone better than the more physically based Opus-SF, although it did not simulate the depth distribution of soil water as accurately as Opus-SF. GLEAMS-SF also predicted solute movement at least as well as Opus-SF, and simulated spatial variations of depth-averaged soil water content and pesticide mass in the field with reasonable accuracy while employing fewer parameters that exhibit relatively less spatial variability. / Ph. D.
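The parameter-generation step described above can be illustrated with a short sketch: Latin hypercube sampling of one spatially variable soil property, followed by a Monte-Carlo loop over the sampled columns. The distribution, the parameter values, and the stand-in leaching_model function are all hypothetical, not taken from Opus, GLEAMS, or the study's data.

```python
import numpy as np
from scipy.stats import qmc, lognorm

# Latin hypercube sample of a spatially variable soil property (here a
# hypothetical lognormal saturated hydraulic conductivity Ks), one value per
# non-interacting soil column, followed by a Monte-Carlo loop.
n_columns = 100
sampler = qmc.LatinHypercube(d=1, seed=42)
u = sampler.random(n=n_columns)               # stratified uniforms in (0, 1)
ks = lognorm.ppf(u[:, 0], s=0.8, scale=10.0)  # map to lognormal Ks (cm/day, invented)

def leaching_model(ks_value):
    """Hypothetical stand-in for a 1-D transport model such as Opus or GLEAMS."""
    return 1.0 / ks_value                     # placeholder response, not real physics

results = np.array([leaching_model(k) for k in ks])
print(f"Ks: mean={ks.mean():.2f}, cv={ks.std()/ks.mean():.2f}; "
      f"output mean={results.mean():.3f}")
```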
6

Random Vector Generation on Large Discrete Spaces

Shin, Kaeyoung 17 December 2010 (has links)
This dissertation addresses three important open questions in the context of generating random vectors having discrete support. The first question relates to the "NORmal To Anything" (NORTA) procedure, easily the most widely used among methods for general random vector generation. While NORTA enjoys such popularity, issues remain surrounding its efficient and correct implementation, particularly when generating random vectors having denumerable support. These complications stem primarily from having to safely compute (on a digital computer) certain infinite summations that are inherent to the NORTA procedure. This dissertation addresses the summation issue within NORTA through the construction of easily computable truncation rules that can be applied across a range of discrete random vector generation contexts. The second question tackled in this dissertation relates to developing a customized algorithm for generating multivariate Poisson random vectors. The algorithm developed (TREx) is uniformly fast (about a hundred to a thousand times faster than NORTA) and presents opportunities for straightforward extensions to the case of negative binomial marginal distributions. The third and arguably most important question addressed in the dissertation is that of exact nonparametric random vector generation on finite spaces. Specifically, it is well-known that NORTA does not guarantee exact generation in dimensions higher than two. This represents an important gap in the random vector generation literature, especially in view of contexts that stipulate strict adherence to the dependency structure of the requested random vectors. This dissertation fully addresses this gap through the development of Maximum Entropy methods. The methods are exact, very efficient, and work on any finite discrete space with stipulated nonparametric marginal distributions. All code developed as part of the dissertation was written in MATLAB, and is publicly accessible through the Web site https://filebox.vt.edu/users/pasupath/pasupath.htm. / Ph. D.
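For context, the basic NORTA transformation at the heart of the first question can be sketched as follows: draw correlated normals, push them through the normal CDF, and invert the target marginals. The marginals and correlation below are illustrative; the sketch does not implement the dissertation's truncation rules or the TREx algorithm.

```python
import numpy as np
from scipy.stats import norm, poisson

# NORTA sketch for a bivariate vector with Poisson marginals (invented
# parameters): correlated normals -> uniforms -> target marginals.
rng = np.random.default_rng(7)
rho_z = 0.7                                  # correlation of the latent normals
cov = np.array([[1.0, rho_z], [rho_z, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=100_000)
u = norm.cdf(z)                              # dependent uniform marginals
x = np.column_stack([poisson.ppf(u[:, 0], mu=3.0),
                     poisson.ppf(u[:, 1], mu=5.0)])

print("sample correlation of the Poisson vector:", np.corrcoef(x.T)[0, 1])

# Note: rho_z is the latent-normal correlation, not the achieved correlation of
# the Poisson pair; matching a target correlation requires solving for rho_z,
# and the infinite sums involved there are the truncation issue treated above.
```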
7

Bridging the Gap between Deterministic and Stochastic Modeling with Automatic Scaling and Conversion

Wang, Pengyuan 17 June 2008 (has links)
During the past decade, many successful deterministic models of macromolecular regulatory networks have been built. Deterministic simulations of these models can show only the average dynamics of the systems. However, stochastic simulations of macromolecular regulatory models can account for behaviors that are introduced by the noisy nature of the systems but not revealed by deterministic simulations. Thus, converting a valuable existing model from the most common deterministic formulation to one suitable for stochastic simulation enables further investigation of the regulatory network. Although many different stochastic models can be developed and evolved from deterministic models, a direct conversion is the first step in practice. This conversion process is tedious and error-prone, especially for complex models, so we seek to automate as much of it as possible. However, deterministic models often omit key information necessary for a stochastic formulation. Specifically, values in the model have to be scaled before a complete conversion, and the scaling factors are typically not given in the deterministic model. Several features that assist with model scaling and conversion are introduced and implemented in the JigCell modeling environment. Our tool makes it easier for the modeler to include complete details as well as to convert the model. Stochastic simulations are known to be computationally intensive, and thus require high-performance computing facilities to be practical. With parallel computation on Virginia Tech's System X supercomputer, we are able to obtain the first stochastic simulation results for realistic cell cycle models. Stochastic simulation results for several mutants, which are thought to be biologically significant, are presented. Successful deployment of the enhanced modeling environment demonstrates the power of our techniques. / Master of Science
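The volume-scaling step that deterministic models typically omit can be illustrated with the standard mass-action conversion rules. These are the textbook rules, not JigCell's specific implementation, and the volume and rate constants below are invented:

```python
# Standard scaling of deterministic mass-action rate constants (concentration
# units) to stochastic propensity constants (molecule counts), using the
# system volume. Zeroth-order rates scale up by omega, first-order rates are
# unchanged, and bimolecular rates scale down by omega.
N_A = 6.02214076e23   # Avogadro's number (1/mol)
V = 1.0e-15           # cell volume in litres (hypothetical)
omega = N_A * V       # molecules per unit molar concentration

k_syn = 1.0e-9        # zeroth order, M/s     (invented)
k_deg = 0.1           # first order, 1/s      (invented)
k_bi  = 1.0e6         # bimolecular, 1/(M s)  (invented)

c_syn = k_syn * omega # -> molecules/s
c_deg = k_deg         # -> 1/s, unchanged
c_bi  = k_bi / omega  # -> per molecule pair per s

print(f"c_syn={c_syn:.3g} molecules/s, c_deg={c_deg:.3g} 1/s, "
      f"c_bi={c_bi:.3g} per molecule pair per s")
```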
8

Stochastic Simulation of Reaction-Diffusion Processes

Hellander, Stefan January 2013 (has links)
Numerical simulation methods have become an important tool in the study of chemical reaction networks in living cells. Many systems can, with high accuracy, be modeled by deterministic ordinary differential equations, but other systems require a more detailed level of modeling. Stochastic models at either the mesoscopic or the microscopic level can be used when molecules are present in low copy numbers. In this thesis we develop efficient and flexible algorithms for simulating systems at the microscopic level. We propose an improvement to the Green's function reaction dynamics algorithm, an efficient microscale method. Furthermore, we describe how to simulate interactions with complex internal structures such as membranes and dynamic fibers. The mesoscopic level is related to the microscopic level through the reaction rates at the respective scales. We derive that relation in both two and three dimensions and show that the mesoscopic model breaks down if the discretization of space becomes too fine; for a simple model problem we can show exactly when this breakdown occurs. We show how to couple the microscopic scale with the mesoscopic scale in a hybrid method. Using the fact that some systems display microscale behaviour only in parts of the domain, we can save computational time by restricting the fine-grained microscopic simulations to that part of the system. Finally, we have developed a mesoscopic method that couples simulations in three dimensions with simulations on general embedded lines. The accuracy of the method has been verified by comparing the results with purely microscopic simulations as well as with theoretical predictions. / eSSENCE
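As a baseline for what "microscopic level" means here, the sketch below runs naive Brownian dynamics in which two molecules react when they come within a reaction radius; GFRD-type methods replace these many small time steps with exact sampling of the next event. All parameters are invented for illustration, and this is not the thesis's algorithm.

```python
import numpy as np

# Naive microscopic baseline: two molecules, A and B, diffuse in 3-D by
# Brownian steps and "react" when their separation drops below sigma.
rng = np.random.default_rng(3)
D_A, D_B, sigma, dt = 1.0, 1.0, 0.01, 1e-5   # hypothetical parameters
a = np.array([0.0, 0.0, 0.0])                # initial position of A
b = np.array([0.1, 0.0, 0.0])                # initial position of B

t = 0.0
while np.linalg.norm(a - b) > sigma and t < 1.0:
    # Each coordinate takes a Gaussian step with variance 2*D*dt.
    a += rng.normal(0.0, np.sqrt(2 * D_A * dt), size=3)
    b += rng.normal(0.0, np.sqrt(2 * D_B * dt), size=3)
    t += dt

print(f"reacted at t={t:.5f}" if np.linalg.norm(a - b) <= sigma
      else "no reaction before t=1")
```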
9

rstream: Streams of Random Numbers for Stochastic Simulation

L'Ecuyer, Pierre, Leydold, Josef January 2005 (has links) (PDF)
The package rstream provides a unified interface to streams of random numbers for the R statistical computing language. Its features include independent streams of random numbers, substreams, easy handling of streams (initialize, reset), and antithetic random variates. The paper describes this package and demonstrates with a simple example the usefulness of this approach. / Series: Preprint Series / Department of Applied Statistics and Data Processing
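rstream itself is an R package. As an illustration of the same idea in another language, the sketch below uses NumPy's SeedSequence spawning, which plays an analogous role of providing independent, reproducible streams for separate simulation components (this is an analogy, not the rstream API).

```python
import numpy as np

# Independent, reproducible random-number streams: one root seed is spawned
# into child streams that can be handed to separate simulation components.
root = np.random.SeedSequence(20050101)
streams = [np.random.default_rng(s) for s in root.spawn(3)]

for i, rng in enumerate(streams):
    print(f"stream {i}: {rng.uniform(size=3)}")

# Antithetic variates, one of rstream's features, can be mimicked by pairing
# each uniform u with 1 - u to reduce variance in a simulation estimator.
u = streams[0].uniform(size=5)
print("antithetic pair means:", (u + (1 - u)) / 2)   # identically 0.5 by construction
```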
