651. Visualization of the Budding Yeast Cell Cycle - Cui, Jing (31 July 2017)
The cell cycle of budding yeast is controlled by a complex chemically reacting network involving a large number of species, including mRNAs and proteins. Many mathematical models have been proposed to unravel its molecular mechanism, but it is hard for people with little modeling training to interpret the dynamics in the simulation results of these models visually. In this thesis, we use the visualization toolkit D3 and jQuery to design a web-based interface that helps users visualize cell cycle simulation results. It is essentially a website where the proliferation of wild-type and mutant cells is visualized as a dynamic animation. With this visualization tool, we can easily and intuitively follow many key steps in the budding yeast cell cycle, such as bud emergence, DNA synthesis, mitosis, and cell division, along with the current populations of species. / Master of Science / The cell cycle of budding yeast is controlled by a complex chemically reacting network. Many mathematical models have been proposed to unravel its molecular mechanism, but it is hard to interpret the dynamics in the simulation results of these models visually. In this thesis, we use the visualization toolkit D3 and jQuery to design a web-based interface that helps users visualize cell cycle simulation results. It is essentially a webpage where the proliferation of wild-type and mutant cells is visualized as a dynamic animation.
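As an illustration of the data-shaping step such a tool performs (the thesis itself uses D3 and jQuery in the browser; this Python sketch is only an analogue, and the toy model and attribute names are assumptions):

```python
import math

# Hypothetical analogue (not the thesis's D3/jQuery code): map a simulated
# species time series to per-frame drawing attributes for an animation.

def simulate_clb2(t_end=100.0, dt=1.0):
    """Toy stand-in for a cell cycle model output: a periodic cyclin level."""
    times = [i * dt for i in range(int(t_end / dt) + 1)]
    return [(t, 0.5 + 0.5 * math.sin(2 * math.pi * t / 90.0)) for t in times]

def to_frames(series, max_bud_radius=12.0):
    """Each (time, level) sample becomes one animation frame: here the bud
    radius grows with the cyclin proxy (an assumed mapping)."""
    return [{"t": t, "bud_radius": round(max_bud_radius * level, 2)}
            for t, level in series]

frames = to_frames(simulate_clb2())
print(frames[0], frames[45], sep="\n")
```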

652. Evaluation of GLEAMS considering parameter uncertainty - Clouse, Randy Wayne (04 September 2008)
A probabilistic procedure was applied to the evaluation of predictions from the GLEAMS nonpoint source pollution model. Both the procedure and the model were assessed by comparing absolute and relative predictions made with probabilistic and deterministic procedures. Field data came from a study of pesticide fate and transport in no-till and conventional-tillage plots in a Coastal Plain soil. The variables examined were runoff, sediment yield, surface losses, mass in the root zone, and depth of the center of mass for two pesticides and a tracer. In the probabilistic procedure, random inputs were characterized with probability distributions, and input values were sampled from these distributions for 5,000 model executions to create output distributions. Central-tendency values from the probabilistic input distributions were used as inputs for the deterministic runs.
Model predictions generally followed expected trends and fell within observed variability. Two exceptions were systematic under-prediction of runoff and pesticide losses, and under-prediction of the depth of bromide in the root zone later in the observation period. These exceptions may indicate errors in the runoff and plant-uptake components of the model. Neither procedure made relative predictions correctly all the time; however, subjective assessment of the model results led to consistent decisions between the two procedures. The probabilistic procedure reduced parameter uncertainty by using the complete range of available data instead of an arbitrary parameter selection; however, it did not eliminate uncertainty in the data itself. / Master of Science
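The procedure described above lends itself to a short sketch. The following Python fragment is not GLEAMS; it uses a toy curve-number runoff model and assumed input distributions purely to illustrate the probabilistic-versus-deterministic comparison:

```python
import numpy as np

rng = np.random.default_rng(42)
N_RUNS = 5000

def toy_runoff_model(curve_number, rainfall):
    """Placeholder for a GLEAMS execution: SCS curve-number runoff (mm)."""
    s = 25400.0 / curve_number - 254.0                 # potential retention
    return np.maximum(rainfall - 0.2 * s, 0.0) ** 2 / (rainfall + 0.8 * s)

# Probabilistic procedure: characterize random inputs with distributions
# (both distributions here are assumptions for illustration).
cn = rng.normal(loc=85.0, scale=3.0, size=N_RUNS)               # curve number
rain = rng.lognormal(mean=np.log(40.0), sigma=0.4, size=N_RUNS)  # rainfall, mm

runoff_dist = toy_runoff_model(cn, rain)

# Deterministic procedure: a single run at the central-tendency inputs.
runoff_det = toy_runoff_model(85.0, 40.0)

print(f"deterministic runoff: {runoff_det:.2f} mm")
print(f"probabilistic median: {np.median(runoff_dist):.2f} mm, "
      f"90% interval: [{np.percentile(runoff_dist, 5):.2f}, "
      f"{np.percentile(runoff_dist, 95):.2f}] mm")
```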

653. Development and Validation of a Tool for In-Plane Antilock Braking System (ABS) Simulations - Khanse, Karan Rajiv (08 September 2015)
Automotive and tire companies spend extensive amounts of time and money tuning their products through prototype testing at dedicated test facilities, mainly because of the limitations of today's simulation capabilities. Greater competence in simulation gives more control over designs in the initial stages, which in turn lowers the demand on the expensive tuning stage. The work presented here aims to take today's simulation capabilities a step forward by integrating models that are best developed in different software environments. An in-plane rigid ring model is used to capture the transient response of tires to high-frequency events such as anti-lock braking and short-wavelength road disturbances, and a rule-based ABS model performs the high-frequency braking operation. The tire and ABS models were created in the Matlab-Simulink environment, while the vehicle model was developed in CarSim. The Simulink models have been integrated with the CarSim vehicle model in the form of a design tool that tire and vehicle designers can use for further tuning of vehicle functional performance in in-line braking scenarios. To complement this design tool, outdoor validation tests were performed on a vehicle that had been measured on a suspension parameter measuring machine (SPMM). The results of the objective tests are discussed, and the correlations and variations with respect to the simulation results are analyzed. / Master of Science
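As an illustration of the kind of rule-based ABS logic mentioned above (not the thesis's Simulink implementation; the thresholds, step sizes, and units are assumptions), a minimal slip-band controller might look like this:

```python
# Hedged sketch of a rule-based ABS tick: hold, dump, or build brake
# pressure so the wheel slip ratio stays inside an assumed target band.

def slip_ratio(vehicle_speed, wheel_speed):
    """Longitudinal slip during braking: 0 = free rolling, 1 = locked."""
    if vehicle_speed <= 0.1:              # avoid division by zero near rest
        return 0.0
    return (vehicle_speed - wheel_speed) / vehicle_speed

def abs_pressure_command(pressure, vehicle_speed, wheel_speed,
                         target_low=0.10, target_high=0.25,
                         step=0.5e5, p_max=150e5):
    """One controller tick; pressures in Pa, speeds in m/s."""
    s = slip_ratio(vehicle_speed, wheel_speed)
    if s > target_high:                    # wheel tending to lock
        return max(pressure - step, 0.0)   # dump pressure
    if s < target_low:                     # wheel under-braked
        return min(pressure + step, p_max) # build pressure
    return pressure                        # hold

# Example tick: 30 m/s vehicle, 21 m/s wheel -> slip 0.30, so dump pressure.
print(abs_pressure_command(80e5, 30.0, 21.0))
```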

654. SIR-models and uncertainty quantification - Jakobsson, Per Henrik; Wärnberg, Anton (January 2024)
This thesis applies the theory of uncertainty quantification and sensitivity analysis to the SIR and SEIR models of disease spread. We attempt to determine whether this theory can estimate the model parameters to an acceptable degree of accuracy. Using sensitivity analysis, we determine which model parameters are most significant for a given quantity of interest. We apply forward uncertainty quantification to determine how the uncertainty in the model parameters propagates to the quantities of interest. Lastly, we apply uncertainty quantification based on the maximum likelihood method to estimate the model parameters, using synthetic data so that the results are easy to verify. After applying these methods, we see that the importance of the model parameters depends heavily on the choice of quantity of interest. We also note that the estimation reduces the uncertainty in the quantities of interest, although many sources of error still need to be considered.
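A minimal sketch of the forward uncertainty propagation step, with assumed parameter distributions and the peak infected fraction as the quantity of interest (the thesis's actual choices may differ):

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir_rhs(t, y, beta, gamma):
    """Classic SIR dynamics in fractions of the population."""
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

def peak_infected(beta, gamma, y0=(0.99, 0.01, 0.0), t_end=160.0):
    """Quantity of interest: the maximum infected fraction over time."""
    sol = solve_ivp(sir_rhs, (0.0, t_end), y0, args=(beta, gamma), max_step=1.0)
    return sol.y[1].max()

rng = np.random.default_rng(0)
betas = rng.normal(0.30, 0.03, size=1000)   # assumed uncertain contact rate
gammas = rng.normal(0.10, 0.01, size=1000)  # assumed uncertain recovery rate

qoi = np.array([peak_infected(b, g) for b, g in zip(betas, gammas)])
print(f"peak infected fraction: mean {qoi.mean():.3f}, std {qoi.std():.3f}")
```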

655. MATLODE: A MATLAB ODE Solver and Sensitivity Analysis Toolbox - D'Augustine, Anthony Frank (04 May 2018)
Sensitivity analysis quantifies the effect that perturbations of the model inputs have on the model's outputs. Some of the key insights gained from sensitivity analysis are an understanding of the robustness of the model with respect to perturbations and a ranking of the most important model parameters. MATLODE is a tool for sensitivity analysis of models described by ordinary differential equations (ODEs). MATLODE implements two distinct approaches to sensitivity analysis: direct (via the tangent linear model) and adjoint. Within each approach, four families of numerical methods are implemented, namely explicit Runge-Kutta, implicit Runge-Kutta, Rosenbrock, and singly diagonally implicit Runge-Kutta. Each approach and family has its own strengths and weaknesses when applied to real-world problems, and MATLODE's multitude of options lets users find the best approach for a wide range of initial value problems. Despite the great importance of sensitivity analysis for models governed by differential equations, until this work there was no publicly available MATLAB ordinary differential equation sensitivity analysis toolbox. The two most popular sensitivity analysis packages, CVODES [8] and FATODE [10], are geared toward the high-performance modeling space; no native MATLAB toolbox was available. MATLODE fills this need and offers sensitivity analysis capabilities in MATLAB, one of the most popular programming languages in scientific communities such as chemistry, biology, ecology, and oceanography. We expect MATLODE to prove a useful tool that helps these communities facilitate their research and fill the gap between theory and practice. / Master of Science / Sensitivity analysis is the study of how small changes in a model's inputs affect the model's output. It provides tools to quantify the impact that small, discrete changes in input values have on the output. The objective of this research is to develop a MATLAB sensitivity analysis toolbox called MATLODE. This research is critical to a wide range of communities who need to optimize system behavior or predict outcomes based on a variety of initial conditions. For example, an analyst could build a model that reflects the performance of an automobile engine, where each part in the engine has a set of initial characteristics. The analyst can use sensitivity analysis to determine which part affects the engine's overall performance the most (or the least), without physically building the engine and running a series of empirical tests. By employing sensitivity analysis, the analyst saves time and money, and since multiple tests can usually be run through the model in the time needed to run just one empirical test, the analyst is likely to gain deeper insight and design a better product. Prior to MATLODE, employing sensitivity analysis without significant knowledge of computational science was too cumbersome and essentially impractical for many of the communities that could benefit from it. MATLODE bridges the gap between computational science and the variety of communities faced with understanding how small changes in a system's input values affect the system's output; by bridging that gap, MATLODE enables more large-scale research initiatives than ever before.
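MATLODE itself is a MATLAB toolbox; as a language-neutral illustration of the direct (tangent linear) approach it implements, the following Python sketch augments a scalar ODE with its sensitivity equation (the model and parameter values are assumptions chosen so the answer can be checked analytically):

```python
import numpy as np
from scipy.integrate import solve_ivp

P = 0.7  # decay rate parameter

def augmented_rhs(t, z, p):
    """z = [y, s] with s = dy/dp. For y' = -p*y, the tangent linear
    model is s' = (df/dy)*s + df/dp = -p*s - y."""
    y, s = z
    return [-p * y, -p * s - y]

sol = solve_ivp(augmented_rhs, (0.0, 5.0), [1.0, 0.0], args=(P,),
                rtol=1e-8, atol=1e-10)

s_T = sol.y[1, -1]
exact = -5.0 * np.exp(-P * 5.0)   # analytic dy/dp at t = 5 for y(0) = 1
print(f"sensitivity dy/dp at t=5: computed {s_T:.6f}, exact {exact:.6f}")
```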

656. Application of Naturalistic Truck Driving Data to Analyze and Improve Car Following Models - Higgs, Bryan James (03 January 2012)
This research effort compares car-following models calibrated to individual drivers using naturalistic data. The models used are the GHR, Gipps, Intelligent Driver, Velocity Difference, Wiedemann, and Fritzsche models. The effort also analyzes the Wiedemann car-following model using car-following periods that occur at different speeds. The Wiedemann model uses thresholds to define the different regimes in car following; some of these thresholds use a speed parameter, while others rely solely on the difference in speed between the subject vehicle and the lead vehicle. In addition, this work reconstructs the Wiedemann model for truck driver behavior using data from the Naturalistic Truck Driving Study (NTDS) conducted by the Virginia Tech Transportation Institute, which collected naturalistic data by equipping nine trucks with various sensors and a data acquisition system. Finally, the Wiedemann car-following model is combined with the GHR car-following model for trucks using the NTDS data. / Master of Science
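For concreteness, here is a minimal Python sketch of one of the models named above, the Intelligent Driver Model; the parameter values are typical textbook choices, not the per-driver calibrations from this work:

```python
import math

def idm_acceleration(v, v_lead, gap,
                     v0=33.0,    # desired speed (m/s)
                     T=1.6,      # desired time headway (s)
                     a=0.73,     # maximum acceleration (m/s^2)
                     b=1.67,     # comfortable deceleration (m/s^2)
                     s0=2.0,     # minimum jam gap (m)
                     delta=4.0):
    """IDM: a * [1 - (v/v0)^delta - (s*/gap)^2], where the desired gap is
    s* = s0 + v*T + v*(v - v_lead) / (2*sqrt(a*b))."""
    dv = v - v_lead
    s_star = s0 + max(0.0, v * T + v * dv / (2.0 * math.sqrt(a * b)))
    return a * (1.0 - (v / v0) ** delta - (s_star / gap) ** 2)

# Example: closing on a slower lead truck at a 30 m gap -> strong braking.
print(f"{idm_acceleration(v=25.0, v_lead=20.0, gap=30.0):.3f} m/s^2")
```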

657. Scope and limitations of the irreversible thermodynamics and the solution diffusion models for the separation of binary and multi-component systems in reverse osmosis process - Al-Obaidi, Mudhar A.A.R.; Kara-Zaitri, Chakib; Mujtaba, Iqbal (05 February 2017)
The reverse osmosis process is used in many industrial applications, ranging from solute-solvent to solvent-solvent and gaseous separations. A number of theoretical models have been developed to describe the separation and the fluxes of solvent and solute in such processes. This paper examines the scope and limitations of two main models (the irreversible thermodynamics model and the solution-diffusion model) used in the past by several researchers for solute-solvent feed separation. Although more complex models have been investigated, the simplicity of these two models has accelerated the adoption of reverse osmosis for different types of systems and a variety of industries. This study conducts an extensive review of these mathematical models, collecting more than 70 examples from the literature. In addition, the review covers improvements that make these models compatible with multi-component systems, with consideration of concentration polarization and solvent-solute-membrane interaction.
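For concreteness, the core of the solution-diffusion model reduces to two flux relations, Jw = A(dP - d_pi) for water and Js = B(c_wall - c_perm) for solute; the Python sketch below evaluates them at an assumed, merely indicative operating point:

```python
def solution_diffusion_fluxes(A_w, B_s, dP, d_pi, c_wall, c_perm):
    """Return (water flux Jw in m/s, solute flux Js in kg/m^2/s)."""
    j_w = A_w * (dP - d_pi)          # driven by net transmembrane pressure
    j_s = B_s * (c_wall - c_perm)    # driven by concentration difference
    return j_w, j_s

# Seawater-like example (all values are assumptions for illustration).
j_w, j_s = solution_diffusion_fluxes(
    A_w=3.0e-12,   # water permeability (m / (s Pa))
    B_s=8.0e-8,    # solute permeability (m/s)
    dP=60e5,       # applied transmembrane pressure (Pa)
    d_pi=28e5,     # osmotic pressure difference (Pa)
    c_wall=38.0,   # solute concentration at the membrane wall (kg/m^3)
    c_perm=0.3,    # permeate concentration (kg/m^3)
)
print(f"Jw = {j_w:.2e} m/s, Js = {j_s:.2e} kg/m^2/s")
```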

658. Sequential Memory Generation For Cognitive Models - Sherwood, Eben Miles (01 June 2024)
Understanding the process of memory formation in neural systems is of great interest in neuroscience. Valiant's Neuroidal Model offers a plausible theory of how memories are created within a computational context. Previously, the algorithm JOIN has been used to show how the brain could perform conjunctive and disjunctive coding to store memories. A limitation of JOIN is that it does not code temporal information in a meaningful manner. We propose SeqMem, a similar algorithmic primitive designed to encode a series of items within a random graph model. We investigate the feasibility of SeqMem empirically by observing its stability and its effects on capacity in our model. We intend for SeqMem and similar procedures to further develop a neurobiologically plausible theory of mind, and our goal is to inspire further work on scaling these methods to a human-level magnitude of computation.
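SeqMem itself is the thesis's contribution, so the following Python fragment is only a loose illustration of the setting: a toy JOIN-style chaining in a sparse random graph, with all sizes, densities, and thresholds chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P_EDGE, K, THRESH = 2000, 0.01, 50, 2   # nodes, edge prob, set size, threshold

adj = rng.random((N, N)) < P_EDGE          # sparse random directed graph

def project(active):
    """Fire the K nodes with the most inputs from the active set, among
    those meeting the threshold (a crude JOIN-like step)."""
    inputs = adj[active].sum(axis=0)           # in-degree from active nodes
    eligible = np.where(inputs >= THRESH)[0]
    order = eligible[np.argsort(inputs[eligible])[::-1]]
    return order[:K]

# Encode a 3-item sequence by chaining projections from a seed assembly.
item = rng.choice(N, size=K, replace=False)
sequence = [item]
for _ in range(2):
    item = project(item)
    sequence.append(item)

print([len(s) for s in sequence])   # each item's assembly size
```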

659. Unsupervised learning of disease subtypes from continuous-time Hidden Markov Models of disease progression - Gupta, Amrita (07 January 2016)
The detection of subtypes of complex diseases has important implications for diagnosis and treatment. Numerous prior studies have used data-driven approaches to identify clusters of similar patients, but it is not yet clear how best to specify what constitutes a clinically meaningful phenotype. This study explored disease subtyping on the basis of temporal development patterns. In particular, we attempted to differentiate infants with autism spectrum disorder (ASD) into more fine-grained classes with distinctive patterns of early skill development. We modeled the progression of autism explicitly using a continuous-time hidden Markov model and then compared subjects on the basis of their trajectories through the model's state space. Two approaches to subtyping were used: one based on time-series clustering with a custom distance function, and one based on tensor factorization. A web application was also developed to facilitate visual exploration of the results. The results suggested the presence of three developmental subgroups in the ASD outcome group. The two subtyping approaches are contrasted, and possible future directions for research are discussed.
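A minimal Python sketch of the first subtyping approach, clustering subjects by a custom distance between decoded state trajectories; the toy sequences, the Hamming-style distance, and the cluster count are illustrative assumptions, not the study's pipeline:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def trajectory_distance(a, b):
    """Fraction of time points at which two equal-length state
    sequences occupy different states (a simple Hamming distance)."""
    return np.mean(np.asarray(a) != np.asarray(b))

# Toy decoded state sequences (states 0..2) for six subjects.
trajs = np.array([
    [0, 0, 1, 1, 2, 2], [0, 0, 1, 2, 2, 2], [0, 1, 1, 1, 2, 2],
    [0, 0, 0, 0, 1, 1], [0, 0, 0, 1, 1, 1], [0, 0, 0, 0, 0, 1],
])

n = len(trajs)
dmat = np.array([[trajectory_distance(trajs[i], trajs[j])
                  for j in range(n)] for i in range(n)])

# Hierarchical clustering on the custom distance matrix, cut at 2 groups.
labels = fcluster(linkage(squareform(dmat), method="average"),
                  t=2, criterion="maxclust")
print(labels)   # e.g. fast-progressing vs slow-progressing subgroups
```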

660. Jump-diffusion-based simulated expected shortfall (SES) method of correcting value-at-risk (VaR) under-prediction tendencies in a stressed economic climate - Magagula, Sibusiso Vusi
The Value-at-Risk (VaR) model fails to predict financial risk accurately, especially during financial crises. This is mainly due to the model's inability to calibrate to new market information and the fact that the risk measure is characterised by poor tail-risk quantification. An alternative approach, which comprises the Expected Shortfall measure and the Lognormal Jump-Diffusion (LJD) model, has been developed to address these shortcomings of VaR. This model is called the Simulated Expected Shortfall (SES) model. The Maximum Likelihood Estimation (MLE) approach is used to determine the parameters of the LJD model, since it is more reliable and verifiable than other non-conventional parameter estimation approaches mentioned in the literature. These parameters are then plugged into the LJD model, which is simulated multiple times to generate the new loss dataset used in the developed model. The SES model is statistically conservative compared with its peers, which means it is more reliable in predicting financial risk, especially during a financial crisis. / Statistics / M.Sc. (Statistics)
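The simulation step at the heart of the SES idea can be sketched briefly in Python: draw returns from a lognormal jump-diffusion and read VaR and Expected Shortfall off the simulated loss distribution. All parameter values below are assumptions, not the dissertation's MLE estimates:

```python
import numpy as np

rng = np.random.default_rng(7)
N_SIMS, ALPHA = 100_000, 0.99
mu, sigma, dt = 0.05, 0.20, 1.0 / 252   # annual drift and vol; daily step
lam, mu_j, sig_j = 25.0, -0.02, 0.05    # jump intensity; jump size mean/std

# One-day log-returns: diffusion part plus a compound Poisson jump part.
n_jumps = rng.poisson(lam * dt, size=N_SIMS)
jump_part = rng.normal(mu_j * n_jumps, sig_j * np.sqrt(n_jumps))
diffusion = ((mu - 0.5 * sigma**2) * dt
             + sigma * np.sqrt(dt) * rng.standard_normal(N_SIMS))
losses = -np.expm1(diffusion + jump_part)   # loss = -(simple return)

var = np.quantile(losses, ALPHA)            # 99% Value-at-Risk
es = losses[losses >= var].mean()           # 99% Expected Shortfall
print(f"99% VaR: {var:.4f}, 99% ES: {es:.4f}")
```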