651

Evaluation of GLEAMS considering parameter uncertainty

Clouse, Randy Wayne 04 September 2008 (has links)
A probabilistic procedure was applied to the evaluation of predictions from the GLEAMS nonpoint source pollution model. Both the procedure and the model were assessed by comparing absolute and relative predictions made with probabilistic and deterministic procedures. Field data came from a study of pesticide fate and transport in no-till and conventional tillage plots in a Coastal Plain soil. Variables examined were runoff, sediment yield, surface losses, mass in the root zone, and depth of center of mass for two pesticides and a tracer. Random inputs were characterized with probability distributions. In the probabilistic procedure, input values were sampled from these distributions for 5000 model executions to create output distributions. Central tendency values from the probabilistic input distributions were used as inputs for the deterministic runs. Model predictions generally followed expected trends and were within observed variability. Two exceptions were systematic under-predictions of runoff and pesticide losses and under-predictions of the depth of bromide in the root zone later in the observed period. These exceptions may indicate errors in the runoff and plant uptake components of the model. Neither procedure made relative predictions correctly all the time; however, subjective assessment of the model results led to consistent decisions between the two procedures. The probabilistic procedure reduced parameter uncertainty by using the complete range of available data rather than an arbitrary parameter selection; however, it did not eliminate uncertainty in the data itself. / Master of Science
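Purely as an illustration of the probabilistic procedure described above, the Python sketch below samples uncertain inputs from assumed distributions, runs a stand-in model function 5000 times, and compares the resulting output distribution with a single deterministic run at central-tendency inputs. The function run_model, the input names, and the distributions are hypothetical placeholders; GLEAMS itself is an external model not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)
N_RUNS = 5000  # number of model executions, matching the abstract

def run_model(curve_number, erodibility, koc):
    """Stand-in for one GLEAMS execution; returns (runoff, sediment_yield)."""
    runoff = 0.3 * curve_number - 0.1 * koc          # illustrative relationships only
    sediment = 0.05 * erodibility * max(runoff, 0.0)
    return runoff, sediment

# Probabilistic procedure: sample each uncertain input from its distribution.
curve_number = rng.normal(80.0, 5.0, N_RUNS)
erodibility = rng.lognormal(mean=-1.5, sigma=0.4, size=N_RUNS)
koc = rng.triangular(50.0, 100.0, 300.0, N_RUNS)

outputs = np.array([run_model(cn, k, kc)
                    for cn, k, kc in zip(curve_number, erodibility, koc)])

# Deterministic procedure: a single run at central-tendency input values.
det_runoff, det_sediment = run_model(curve_number.mean(),
                                     np.median(erodibility),
                                     koc.mean())

print("probabilistic runoff, 5th-95th percentile:",
      np.percentile(outputs[:, 0], [5, 95]))
print("deterministic runoff:", det_runoff)
```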
652

Development and Validation of a Tool for In-Plane Antilock Braking System (ABS) Simulations

Khanse, Karan Rajiv 08 September 2015 (has links)
Automotive and tire companies spend extensive amounts of time and money tuning their products through prototype testing at dedicated test facilities. This is mainly due to the limitations in the simulation capabilities that exist today. With greater competence in simulation comes more control over designs in the initial stages, which in turn lowers the demand on the expensive stage of tuning. The work presented aims at taking today's simulation capabilities a step forward by integrating models that are best developed in different software interfaces. An in-plane rigid ring model is used to understand the transient response of tires to various high-frequency events such as anti-lock braking and short-wavelength road disturbances. A rule-based ABS model performs the high-frequency braking operation. The tire and ABS models have been created in the Matlab-Simulink environment. The vehicle model has been developed in CarSim. The models developed in Simulink have been integrated with the vehicle model in CarSim, in the form of a design tool that can be used by tire as well as vehicle designers for further tuning of vehicle functional performance as it relates to in-line braking scenarios. Outdoor validation tests were performed to obtain data from a vehicle that was measured on a suspension parameter measuring machine (SPMM) in order to complement this design tool. The results of the objective tests performed have been discussed, and the correlations and variations with respect to the simulation results have been analyzed. / Master of Science
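The abstract does not state the ABS rules themselves, so the sketch below shows one generic form a rule-based ABS controller can take: a release/hold/reapply cycle driven by wheel slip thresholds. The threshold values and function names are assumptions for illustration, not the model built in this work.

```python
def wheel_slip(vehicle_speed, wheel_speed):
    """Longitudinal slip ratio: 0 = free rolling, 1 = locked wheel."""
    if vehicle_speed <= 0.1:            # avoid division by zero near standstill
        return 0.0
    return (vehicle_speed - wheel_speed) / vehicle_speed

def abs_rule(slip, brake_cmd, hold_pressure, slip_high=0.20, slip_low=0.10):
    """One step of an assumed release/hold/reapply slip-threshold cycle."""
    if slip > slip_high:
        return 0.0                      # release: let the wheel spin back up
    if slip < slip_low:
        return brake_cmd                # reapply: pass driver demand through
    return hold_pressure                # hold: keep the last applied pressure

# Example: 25% slip under full driver demand -> the rule releases pressure.
slip = wheel_slip(vehicle_speed=25.0, wheel_speed=18.75)
print(slip, abs_rule(slip, brake_cmd=1.0, hold_pressure=0.6))
```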
653

MATLODE: A MATLAB ODE Solver and Sensitivity Analysis Toolbox

D'Augustine, Anthony Frank 04 May 2018 (has links)
Sensitivity analysis quantifies the effect that perturbations of the model inputs have on the model's outputs. Some of the key insights gained using sensitivity analysis are understanding the robustness of the model with respect to perturbations and selecting the most important parameters for the model. MATLODE is a tool for sensitivity analysis of models described by ordinary differential equations (ODEs). MATLODE implements two distinct approaches for sensitivity analysis: direct (via the tangent linear model) and adjoint. Within each approach, four families of numerical methods are implemented, namely explicit Runge-Kutta, implicit Runge-Kutta, Rosenbrock, and singly diagonally implicit Runge-Kutta. Each approach and family has its own strengths and weaknesses when applied to real world problems. MATLODE has a multitude of options that allow users to find the best approach for a wide range of initial value problems. In spite of the great importance of sensitivity analysis for models governed by differential equations, until this work there was no MATLAB ordinary differential equation sensitivity analysis toolbox publicly available. The two most popular sensitivity analysis packages, CVODES [8] and FATODE [10], are geared toward the high performance modeling space; however, no native MATLAB toolbox was available. MATLODE fills this need and offers sensitivity analysis capabilities in MATLAB, one of the most popular programming languages within scientific communities such as chemistry, biology, ecology, and oceanography. We expect that MATLODE will prove to be a useful tool for these communities, helping facilitate their research and fill the gap between theory and practice. / Master of Science
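As a minimal illustration of the direct (tangent linear) approach to sensitivity analysis that MATLODE implements, the Python sketch below augments a toy scalar ODE with its sensitivity equation and integrates both together. It uses SciPy rather than MATLODE's MATLAB interface, and the toy problem is invented for this example.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy problem: dy/dt = -p * y with y(0) = 1.  The tangent linear (forward)
# sensitivity s = dy/dp satisfies ds/dt = -y - p * s with s(0) = 0, and is
# integrated alongside the state, mirroring the direct approach.
def augmented_rhs(t, z, p):
    y, s = z
    return [-p * y, -y - p * s]

p, T = 0.7, 5.0
sol = solve_ivp(augmented_rhs, (0.0, T), [1.0, 0.0], args=(p,),
                rtol=1e-8, atol=1e-10)
y_T, s_T = sol.y[:, -1]

print("tangent linear sensitivity dy(T)/dp:", s_T)
print("analytic value -T*exp(-p*T):        ", -T * np.exp(-p * T))
```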
654

The Influence of Cost-sharing Programs on Southern Non-industrial Private Forests

Goodwin, Christopher C. H. 11 January 2002 (has links)
This study was undertaken in response to concerns that decreasing levels of funding for government tree planting cost-share programs will result in significant reductions in non-industrial private tree planting efforts in the South. The purpose of this study is to quantify how the funding of various cost-share programs and market signals interact and affect the level of private tree planting. The results indicate that the ACP, CRP, and Soil Bank programs have been more influential than the FIP, FRM, FSP, SIP, and state-run subsidy programs. Reductions in CRP funding will result in less tree planting, while it is not clear that funding reductions in FIP, or other programs targeted toward reforestation after harvest, will have a negative impact on tree planting levels. / Master of Science
655

Application of Naturalistic Truck Driving Data to Analyze and Improve Car Following Models

Higgs, Bryan James 03 January 2012 (has links)
This research effort aims to compare car-following models when the models are calibrated to individual drivers with naturalistic data. The models used are the GHR, Gipps, Intelligent Driver, Velocity Difference, Wiedemann, and Fritzsche models. This research effort also analyzes the Wiedemann car-following model using car-following periods that occur at different speeds. The Wiedemann car-following model uses thresholds to define the different regimes in car following. Some of these thresholds use a speed parameter, but others rely solely upon the difference in speed between the subject vehicle and the lead vehicle. This research effort also reconstructs the Wiedemann car-following model for truck driver behavior using data from the Naturalistic Truck Driving Study (NTDS) conducted by the Virginia Tech Transportation Institute. This naturalistic data was collected by equipping 9 trucks with various sensors and a data acquisition system. This research effort also combines the Wiedemann car-following model with the GHR car-following model for trucks using the NTDS data. / Master of Science
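For context on the GHR model mentioned above, a commonly cited form computes the follower's acceleration from its own speed, the speed difference to the leader, and the spacing. The sketch below encodes that stimulus-response rule with placeholder constants; it is not the calibrated driver behavior reported in this work.

```python
def ghr_acceleration(v_follow, dv, dx, c=1.2, m=0.9, l=1.6):
    """
    GHR (Gazis-Herman-Rothery) stimulus-response rule:
        a = c * v_follow**m * (v_lead - v_follow) / spacing**l
    The constants c, m, l are what gets calibrated per driver; the defaults
    here are placeholders, not values from the thesis.
    """
    return c * (v_follow ** m) * dv / (dx ** l)

# Follower at 20 m/s, closing at 2 m/s on a leader 30 m ahead -> deceleration.
print(ghr_acceleration(v_follow=20.0, dv=-2.0, dx=30.0))
```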
656

Sequential Memory Generation For Cognitive Models

Sherwood, Eben Miles 01 June 2024 (has links) (PDF)
Understanding the process of memory formation in neural systems is of great interest in the field of neuroscience. Valiant’s Neuroidal Model poses a plausible theory for how memories are created within a computational context. Previously, the algorithm JOIN has been used to show how the brain could perform conjunctive and disjunctive coding to store memories. A limitation of JOIN is that it does not consider the coding of temporal information in a meaningful manner. We propose SeqMem, a similar algorithmic primitive that is designed to encode a series of items within a random graph model. We investigate the feasibility of SeqMem empirically by observing its stability and effects on capacity in our model. We intend to provide value in the use of SeqMem and similar procedures to further develop a neurobiologically plausible theory of mind. Our goal here is to inspire further work in scaling our methods to function at a human-level magnitude of computation.
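The abstract does not spell out SeqMem itself, so the toy sketch below should be read only as an illustration of the general idea of encoding a sequence by allocating link representations between consecutive items in a random substrate; it is neither Valiant's JOIN nor the thesis's SeqMem algorithm.

```python
import random

random.seed(0)
NEURONS = 1000
ASSEMBLY = 50          # neurons per item representation (arbitrary toy sizes)

def new_assembly():
    """An item is represented by a random set of neurons."""
    return frozenset(random.sample(range(NEURONS), ASSEMBLY))

def encode_sequence(items):
    """
    Toy sequence encoding: for each consecutive pair of item assemblies,
    allocate a fresh 'link' assembly that points from one item to the next,
    so the sequence can be replayed by following the links.
    """
    links = {}
    for a, b in zip(items, items[1:]):
        links[a] = (new_assembly(), b)   # (link assembly, successor item)
    return links

items = [new_assembly() for _ in range(4)]
links = encode_sequence(items)

# Replay the stored sequence from the first item.
cur, replay = items[0], [items[0]]
while cur in links:
    _, cur = links[cur]
    replay.append(cur)
print("replayed in original order:", replay == items)
```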
657

SIR-models and uncertainty quantification

Jakobsson, Per Henrik, Wärnberg, Anton January 2024 (has links)
This thesis applies the theory of uncertainty quantification and sensitivity analysis to the SIR and SEIR models for the spread of diseases. We attempt to determine whether this theory can be applied to estimate the model parameters to an acceptable degree of accuracy. Using sensitivity analysis, we determine which parameters of the models are the most significant for some quantity of interest. We apply forward uncertainty quantification to determine how the uncertainty of the model parameters propagates to the quantities of interest. Lastly, we apply uncertainty quantification based on the maximum likelihood method to estimate the model parameters. To easily verify the results, we use synthetic data when estimating the parameters. After applying these methods we see that the importance of the model parameters depends heavily on the choice of quantity of interest. We also note that the uncertainty method reduces the uncertainty in the quantities of interest, although there are many sources of error that still need to be considered.
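A minimal sketch of the forward uncertainty quantification step described above, assuming a standard SIR model, illustrative parameter distributions, and peak infections as the quantity of interest; none of the numbers come from the thesis.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir_rhs(t, y, beta, gamma, N):
    """Standard SIR dynamics: dS/dt, dI/dt, dR/dt."""
    S, I, R = y
    return [-beta * S * I / N, beta * S * I / N - gamma * I, gamma * I]

rng = np.random.default_rng(1)
N, I0, T = 10_000, 10, 120
n_samples = 500

# Forward UQ: sample the uncertain parameters and propagate each sample
# through the model to the quantity of interest (peak infections).
betas = rng.normal(0.30, 0.03, n_samples)    # assumed parameter uncertainty
gammas = rng.normal(0.10, 0.01, n_samples)

peaks = []
for beta, gamma in zip(betas, gammas):
    sol = solve_ivp(sir_rhs, (0, T), [N - I0, I0, 0],
                    args=(beta, gamma, N), max_step=1.0)
    peaks.append(sol.y[1].max())

print("peak infections: mean", np.mean(peaks), "std", np.std(peaks))
```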
658

Unsupervised learning of disease subtypes from continuous time Hidden Markov Models of disease progression

Gupta, Amrita 07 January 2016 (has links)
The detection of subtypes of complex diseases has important implications for diagnosis and treatment. Numerous prior studies have used data-driven approaches to identify clusters of similar patients, but it is not yet clear how to best specify what constitutes a clinically meaningful phenotype. This study explored disease subtyping on the basis of temporal development patterns. In particular, we attempted to differentiate infants with autism spectrum disorder into more fine-grained classes with distinctive patterns of early skill development. We modeled the progression of autism explicitly using a continuous-time hidden Markov model. Subsequently, we compared subjects on the basis of their trajectories through the model state space. Two approaches to subtyping were utilized, one based on time-series clustering with a custom distance function and one based on tensor factorization. A web application was also developed to facilitate the visual exploration of our results. Results suggested the presence of 3 developmental subgroups in the ASD outcome group. The two subtyping approaches are contrasted and possible future directions for research are discussed.
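For background on the continuous-time hidden Markov model used here: the hidden dynamics are governed by a generator matrix Q, and transition probabilities over any interval follow from the matrix exponential of Q scaled by the interval length. The sketch below uses an invented three-state generator, not the autism progression model fitted in the study.

```python
import numpy as np
from scipy.linalg import expm

# Generator matrix of the hidden continuous-time Markov chain: off-diagonal
# entries are transition rates, rows sum to zero, state 2 is absorbing.
Q = np.array([[-0.20,  0.15,  0.05],
              [ 0.00, -0.10,  0.10],
              [ 0.00,  0.00,  0.00]])

# P(dt) = expm(Q * dt) gives transition probabilities over an interval dt,
# so irregularly spaced observation times are handled naturally.
for dt in (0.5, 2.0, 10.0):
    P = expm(Q * dt)
    print(f"P(state 0 -> state 2 within {dt} time units) = {P[0, 2]:.3f}")

# In a CT-HMM the states are hidden; each visit emits observations (here,
# skill-development measurements), inference recovers the likely trajectory
# through the state space, and those trajectories can then be clustered.
```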
659

Jump-diffusion based-simulated expected shortfall (SES) method of correcting value-at-risk (VaR) under-prediction tendencies in stressed economic climate

Magagula, Sibusiso Vusi 05 1900 (has links)
The Value-at-Risk (VaR) model fails to predict financial risk accurately, especially during financial crises. This is mainly due to the model's inability to calibrate to new market information and the fact that the risk measure is characterised by poor tail risk quantification. An alternative approach, which comprises the Expected Shortfall measure and the Lognormal Jump-Diffusion (LJD) model, has been developed to address the aforementioned shortcomings of VaR. This model is called the Simulated-Expected-Shortfall (SES) model. The Maximum Likelihood Estimation (MLE) approach is used in determining the parameters of the LJD model since it is more reliable and verifiable compared with other non-conventional parameter estimation approaches mentioned in other literature studies. These parameters are then plugged into the LJD model, which is simulated multiple times to generate the new loss dataset used in the developed model. This SES model is statistically conservative when compared to its peers, which means it is more reliable in predicting financial risk, especially during a financial crisis. / Statistics / M.Sc. (Statistics)
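A hedged sketch of the simulation step described above: draw returns from a Merton-style lognormal jump-diffusion, treat the negated returns as losses, and read Value-at-Risk and the simulated Expected Shortfall off the loss tail. All parameter values are illustrative stand-ins for the MLE estimates the thesis would supply.

```python
import numpy as np

rng = np.random.default_rng(7)

def jump_diffusion_returns(n, mu=0.05, sigma=0.20, jump_rate=0.5,
                           jump_mu=-0.10, jump_sigma=0.15, dt=1 / 252):
    """
    One-period returns from a Merton-style lognormal jump-diffusion:
    Brownian diffusion plus Poisson-arriving normal jumps in log-price
    (approximated as at most one effective jump per period, reasonable
    when jump_rate * dt << 1).  Parameter values are illustrative only.
    """
    diffusion = (mu - 0.5 * sigma ** 2) * dt \
        + sigma * np.sqrt(dt) * rng.standard_normal(n)
    n_jumps = rng.poisson(jump_rate * dt, n)
    jumps = rng.normal(jump_mu, jump_sigma, n) * n_jumps
    return np.expm1(diffusion + jumps)

losses = -jump_diffusion_returns(100_000)

alpha = 0.99
var = np.quantile(losses, alpha)            # Value-at-Risk at the 99% level
ses = losses[losses >= var].mean()          # simulated Expected Shortfall
print(f"VaR(99%) = {var:.4f}  SES(99%) = {ses:.4f}")
```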
660

Modelling forest dynamics and management of natural tropical rain forests

Ramdass, Indarjit January 1987 (has links)
No description available.
