211

Conservation Implications of a Marbled Salamander, Ambystoma opacum, Metapopulation Model

Plunkett, Ethan B 01 January 2009
Amphibians are in decline globally, and a significantly greater percentage of ambystomatid salamander species are in decline relative to other amphibians; habitat loss contributes substantially to this decline. The goal of this thesis is to better understand extinction risk in a marbled salamander (Ambystoma opacum) population and how forestry affects that risk. To achieve this goal, we first estimated an important life history parameter (Chapter 1), then used a metapopulation model to estimate population viability and determine which aspects of the species' life history put it most at risk (Chapter 2), and finally predicted extinction risk under hypothetical forestry scenarios (Chapter 3). In Chapter 1 we estimated one of the requisite parameters for the model, juvenile survival, from 8 years of field data. We estimated juvenile survival probabilities (to first breeding) at 17% for males and 11% for females. To our knowledge, these are the first estimates for marbled salamanders that include both returning and dispersing individuals. In Chapter 2 we used a metapopulation model to estimate extinction risk and the sensitivity of extinction risk to changes in vital rates and other model parameters. We found that, although there is considerable uncertainty in our estimate, extinction risk is likely low at our study site. Sensitivity analysis revealed that small changes in adult survival lead to relatively large changes in persistence, and that there is an apparent threshold in reproductive failure probabilities beyond which extinction risk rapidly increases. In Chapter 3 we used the extinction risk and sensitivity estimates to model the effects of forestry on the metapopulation. We parameterized several different levels of impact of forestry on salamander survival; for each parameterization we calculated the extinction risk for 20 different forestry scenarios involving buffer size (30 to 300 meters) and complete or partial restrictions on cutting (5 different levels). We found that, for all but the most optimistic parameterizations, large buffers (around 200 meters) with strong restrictions on cutting within the buffer were necessary to maintain a low extinction risk. Overall, we show that although the population at our intensively studied field site is unlikely to go extinct under present conditions, small decreases in adult survival, small increases in catastrophe rate, and intensive forestry can all make extinction likely.
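The sensitivity result above (small changes in adult survival producing large changes in persistence) can be illustrated with a toy stage-structured projection model. This is a minimal sketch, not the thesis's metapopulation model: only the 11% female juvenile survival comes from the abstract; the adult survival and fecundity values are hypothetical placeholders.

```python
import numpy as np

# Two-stage, female-only projection model sketch.
S_JUV = 0.11      # survival from metamorphosis to first breeding (from the abstract)
S_ADULT = 0.60    # hypothetical annual adult survival
FECUNDITY = 10.0  # hypothetical female metamorphs per breeding female

def growth_rate(s_adult):
    """Dominant eigenvalue (asymptotic growth rate) of the projection matrix."""
    A = np.array([[0.0,   FECUNDITY],
                  [S_JUV, s_adult]])
    return max(np.linalg.eigvals(A).real)

# A small perturbation of adult survival vs. the resulting change in lambda,
# echoing the finding that persistence is most sensitive to adult survival.
print(f"lambda at s_adult={S_ADULT}:        {growth_rate(S_ADULT):.3f}")
print(f"lambda at s_adult={S_ADULT + 0.05}: {growth_rate(S_ADULT + 0.05):.3f}")
```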
212

Dynamic Model Pooling Methodology for Improving Aberration Detection Algorithms

Sellati, Brenton J 01 January 2010
Syndromic surveillance is defined generally as the collection and statistical analysis of data believed to be leading indicators of deleterious activity developing within a system. Conceptually, syndromic surveillance can be applied to any discipline in which it is important to know when external influences manifest themselves in a system by forcing it to depart from its baseline. Comparisons of syndromic surveillance systems have yielded mixed results: models that dominate in one performance metric are often sorely deficient in another. This creates a zero-sum trade-off in which one performance metric must be afforded greater importance for a decision to be made. This thesis presents a dynamic pooling technique that combines competing syndromic surveillance models in such a way that the resulting detection algorithm offers a better combination of sensitivity and specificity, two of the key model metrics, than any of the models individually. We then apply this methodology to a simulated data set in the context of detecting outbreaks of disease in an animal population. We find that this dynamic pooling methodology is robust in the sense that it is capable of superior overall performance with respect to sensitivity, specificity, and mean time to detection under varying conditions of baseline data behavior (e.g., controlling for the presence or absence of various levels of trend and seasonality), as well as in simulated out-of-sample performance tests.
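The abstract does not specify the pooling rule, so the following is a hedged sketch of one plausible scheme in the spirit described: two standard detectors (CUSUM and EWMA) run on standardized counts, with their rescaled statistics averaged into a single pooled statistic. All data, weights, and thresholds are simulated or hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated daily counts: Poisson baseline with an injected ramp outbreak.
n, outbreak_start = 365, 300
counts = rng.poisson(20, n).astype(float)
counts[outbreak_start:] += np.linspace(0, 15, n - outbreak_start)

mu, sigma = counts[:outbreak_start].mean(), counts[:outbreak_start].std()
z = (counts - mu) / sigma

def cusum(z, k=0.5):
    """One-sided CUSUM statistic on standardized counts."""
    s = np.zeros_like(z)
    for t in range(1, len(z)):
        s[t] = max(0.0, s[t - 1] + z[t] - k)
    return s

def ewma(z, lam=0.2):
    """EWMA statistic on standardized counts."""
    e = np.zeros_like(z)
    for t in range(1, len(z)):
        e[t] = lam * z[t] + (1 - lam) * e[t - 1]
    return e

# Pool the detectors: rescale each statistic by its in-control spread, then average.
c, e = cusum(z), ewma(z)
pooled = 0.5 * c / c[:outbreak_start].std() + 0.5 * e / e[:outbreak_start].std()
alarm = np.nonzero(pooled > 4.0)[0]  # hypothetical alarm threshold
print("first pooled alarm at day:", alarm[0] if alarm.size else "none")
```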
213

Incorporating Shear Resistance Into Debris Flow Triggering Model Statistics

Lyman, Noah J 01 December 2020
Several regions of the Western United States use statistical binary classification models to predict and manage the probability of debris flow initiation after wildfires. As the occurrence of wildfires and high-intensity rainfall events has increased, so has the frequency of development in the steep, mountainous terrain where these events arise. This intersection brings an increasing need to derive improved results from existing models, or to develop new models, to reduce the economic and human impacts of debris flows. Any improvement to these models could also ease data collection, processing, and implementation in new areas. Existing models generally rely on inputs that are functions of rainfall intensity, fire effects, terrain type, and surface characteristics. However, no variable in these models directly accounts for the shear stiffness of the soil. This property, when considered with respect to the loading state of the sediment, informs the likelihood of particle dislocation, contractive or dilative volume changes, and the downslope movement that triggers debris flows. This study proposes incorporating shear wave velocity (in the form of slope-based thirty-meter shear wave velocity, Vs30) to account for this shear stiffness. As in seismic soil liquefaction analysis, shear stiffness is measured via shear wave velocity, the speed of a vertically propagating horizontal shear wave through sediment. This spatially mapped variable allows for broad coverage of the watersheds of interest. A logistic regression is then used to compare the new variable against the variables currently used in predictive post-fire debris flow triggering models. The resulting models showed improvement in some measures of statistical utility, assessed through receiver operating characteristic (ROC) curves and threat score analysis, a method of ranking models based on true/false positive and negative results. On other metrics, however, the integration of Vs30 offers utility similar to current models, suggesting that this input would benefit from further refinement. We suggest further improving the use of Vs30 through in-situ measurements of surface shear wave propagation, integrated into Vs30 datasets through a possible transfer function. We also discuss the input variables and their impact on the resulting models.
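A minimal sketch of the modeling step described above: a logistic regression on hypothetical post-fire basin predictors, including a slope-based Vs30 column, scored with ROC AUC and the threat score (true positives divided by the sum of true positives, false positives, and false negatives). The variables, ranges, and coefficients below are illustrative, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Hypothetical basin-level predictors in the spirit of the abstract.
n = 500
X = np.column_stack([
    rng.gamma(2.0, 10.0, n),   # peak rainfall intensity (mm/h), hypothetical
    rng.uniform(0, 1, n),      # fraction of basin burned at high severity
    rng.uniform(200, 760, n),  # slope-based Vs30 (m/s), hypothetical range
])
# Simulated triggering outcomes: stiffer ground (higher Vs30) lowers the odds.
logit = 0.08 * X[:, 0] + 2.0 * X[:, 1] - 0.01 * X[:, 2] + 1.0
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
pred = model.predict(X)

# Threat score (critical success index): TP / (TP + FP + FN).
tp = np.sum(pred & y); fp = np.sum(pred & ~y); fn = np.sum(~pred & y)
print("AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))
print("Threat score:", round(tp / (tp + fp + fn), 3))
```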
214

Forecasting COVID-19 with Temporal Hierarchies and Ensemble Methods

Shandross, Li 09 August 2023
Infectious disease forecasting efforts grew rapidly during the COVID-19 pandemic, providing guidance for pandemic response and insight into potential future trends. Yet despite their importance, short-term forecasting models often struggled to produce accurate real-time predictions of this complex and rapidly changing system. This gap in accuracy persisted well into the pandemic and warrants the exploration and testing of new methods to glean fresh insights. In this work, we examined the application of the temporal hierarchical forecasting (THieF) methodology to probabilistic forecasts of COVID-19 incident hospital admissions in the United States. THieF is an innovative forecasting technique that aggregates time-series data into a hierarchy made up of different temporal scales, produces forecasts at each level of the hierarchy, then reconciles those forecasts using an optimized weighted forecast combination. While THieF's unique approach has shown substantial accuracy improvements in a diverse range of applications, such as operations management and emergency room admission predictions, the technique had not previously been applied to outbreak forecasting. We generated candidate models formulated using the THieF methodology, differing in their hierarchy schemes and data transformations, as well as ensembles of the THieF models, computed as a mean of predictive quantiles. The models were evaluated using the weighted interval score (WIS) as a measure of forecast skill, and the top-performing subset was compared to several benchmark models: simple and seasonal ARIMA models, a naive baseline model, and an ensemble of operational incident-hospitalization models from the US COVID-19 Forecast Hub. The THieF models and THieF ensembles demonstrated improvements in WIS and mean absolute error (MAE), as well as competitive prediction interval coverage, over many benchmark models in both the validation and testing phases. The best THieF model generally ranked second out of nine total models during the testing evaluation. These accuracy improvements suggest that the THieF methodology may serve as a useful addition to the infectious disease forecasting toolkit.
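Two of the components described above are concrete enough to sketch: the mean-of-quantiles ensemble and the weighted interval score. The snippet below uses the pinball-loss representation of WIS (twice the mean quantile score over a symmetric set of levels, which is proportional to the interval-based definition); the quantile grid mirrors the 23 levels used by the US COVID-19 Forecast Hub, and the member forecasts are simulated stand-ins, not THieF outputs.

```python
import numpy as np

# 23 quantile levels in the style of the COVID-19 Forecast Hub.
QUANTILES = np.array([0.01, 0.025]
                     + [round(0.05 * i, 2) for i in range(1, 20)]
                     + [0.975, 0.99])

def quantile_ensemble(member_quantiles):
    """Mean-of-quantiles ensemble: average each predictive quantile across models.
    member_quantiles has shape (n_models, n_quantiles)."""
    return member_quantiles.mean(axis=0)

def weighted_interval_score(y, q_levels, q_values):
    """WIS via its pinball-loss representation: twice the mean quantile
    (pinball) score over a symmetric set of quantile levels."""
    loss = np.where(y < q_values,
                    (q_values - y) * (1 - q_levels),
                    (y - q_values) * q_levels)
    return 2 * loss.mean()

# Toy example with two simulated member forecasts.
m1 = np.quantile(np.random.default_rng(2).normal(100, 10, 5000), QUANTILES)
m2 = np.quantile(np.random.default_rng(3).normal(110, 15, 5000), QUANTILES)
ens = quantile_ensemble(np.vstack([m1, m2]))
print("WIS of ensemble at y=105:", round(weighted_interval_score(105.0, QUANTILES, ens), 2))
```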
215

Proteomics and Machine Learning for Pulmonary Embolism Risk with Protein Markers

Awuah, Yaa Amankwah 01 December 2023
This thesis investigates protein markers linked to pulmonary embolism risk using proteomics and statistical methods, employing both unsupervised and supervised machine learning techniques. The research analyzes existing datasets, identifies significant features, and examines gender differences through MANOVA, which reveals significant differences between genders. Principal Component Analysis reduces the number of variables from 378 to 59, and a Random Forest classifier achieves 70% accuracy. These findings contribute to our understanding of pulmonary embolism and may lead to diagnostic biomarkers, and the application of proteomics holds promise for clinical practice and research.
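The dimension-reduction-plus-classifier workflow described above maps naturally onto a scikit-learn pipeline. This sketch uses simulated data in place of the proteomics dataset; only the 378-to-59 component reduction and the Random Forest choice come from the abstract.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Simulated stand-in for the proteomics matrix: 378 protein features,
# binary pulmonary-embolism outcome.
rng = np.random.default_rng(4)
X = rng.normal(size=(200, 378))
y = rng.integers(0, 2, 200)

pipe = Pipeline([
    ("scale", StandardScaler()),    # put protein intensities on a common scale
    ("pca", PCA(n_components=59)),  # 378 -> 59 components, as in the abstract
    ("rf", RandomForestClassifier(n_estimators=500, random_state=0)),
])
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```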
216

Developing a precision agriculture framework to assess financial viability of decisions in farming and conservation

Sublett, Jennifer 08 December 2023
Agricultural producers are invested in managing the impacts of crop damage on their yields and profits. When damage occurs early enough in the growing season, farmers have the option of replanting their corn stand in an effort to recoup some of the lost profit. In this thesis, two types of naturally occurring damage, wildlife depredation and persistent weed or insect patches, were simulated for two representative regions of Mississippi. These data were then used to assess the financial viability of a range of damage mitigation methods, including partial replanting, enrollment in a government conservation buffer, and no action. Replanting was generally the most economically viable management method across all simulation scenarios. The analysis showed a lower return on conservation enrollment than expected, indicating that an increase in the financial benefits of some conservation programs may be warranted.
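A stripped-down version of the per-acre decision comparison underlying this kind of analysis might look like the following; all prices, yields, costs, and payments are hypothetical placeholders rather than values from the thesis.

```python
# Per-acre net returns for a damaged patch under three mitigation options.
CORN_PRICE = 4.50  # $/bu, hypothetical

def net_return_no_action(yield_damaged_bu):
    """Keep the damaged stand and harvest what survives."""
    return CORN_PRICE * yield_damaged_bu

def net_return_replant(yield_replant_bu, replant_cost):
    """Replant the patch: better yield, minus replanting cost."""
    return CORN_PRICE * yield_replant_bu - replant_cost

def net_return_buffer(conservation_payment):
    """Enroll the patch in a conservation buffer program."""
    return conservation_payment

options = {
    "no action": net_return_no_action(60.0),
    "replant": net_return_replant(150.0, 120.0),
    "conservation buffer": net_return_buffer(250.0),
}
best = max(options, key=options.get)
print("best option:", best, options)
```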
217

Mixed effects modelling for biological systems

Yu, Zhe Si 05 1900
Modelling biological systems with mathematical models is challenging because biological data tend to be heavily heterogeneous, with complex relationships between variables. Mixed effects models are an increasingly popular choice of statistical model for biological systems, since they are designed for multilevel and noisy data. The aim of this thesis is to showcase the range of uses of mixed effects modelling for different biological systems. The second chapter aims to determine the relationship between maple syrup quality rating, various quality indicators commonly obtained by producers as well as a new indicator, COLORI, and amino acid (AA) concentration. For this, we created two mixed effects models: the first is an ordinal model that directly predicts maple syrup quality rating using transmittance, COLORI, and AA; the second is a nonlinear model that predicts AA concentration using COLORI, with pH as a time proxy. Our models show that AA concentration is a good predictor of maple syrup quality, and COLORI is a good predictor of AA concentration. The third chapter uses a population pharmacokinetics (PopPK) model to estimate estradiol dynamics in a quantitative systems pharmacology (QSP) model of mammary cell differentiation into myoepithelial cells, in order to capture population heterogeneity among patients. Our results show that the QSP model inherently includes heterogeneity in its structure, since the added PopPK estradiol portion of the model does not add large variation among the estimated virtual patients. Overall, this thesis demonstrates the application of mixed effects models in biology as a way to understand heterogeneity in biological data.
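As a hedged illustration of the modelling style described above, the sketch below fits a linear mixed-effects analogue of the AA-versus-COLORI relationship with producer-level random intercepts. The thesis itself uses ordinal and nonlinear mixed models; the column names and data here are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated maple-syrup-style data: 30 hypothetical producers, 10 samples each,
# with a producer-level random intercept on amino acid (AA) concentration.
rng = np.random.default_rng(5)
producers = np.repeat(np.arange(30), 10)
colori = rng.uniform(0, 2, 300)
aa = 1.5 * colori + rng.normal(0, 0.3, 30)[producers] + rng.normal(0, 0.2, 300)
df = pd.DataFrame({"aa": aa, "colori": colori, "producer": producers})

# Fixed effect of COLORI, random intercept per producer.
model = smf.mixedlm("aa ~ colori", df, groups=df["producer"]).fit()
print(model.summary())
```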
218

MONITORING AUTOCORRELATED PROCESSES

Tang, Weiping 10 1900
This thesis was submitted by Weiping Tang on August 2, 2011.

Several control schemes for monitoring process mean shifts, including cumulative sum (CUSUM), weighted cumulative sum (WCUSUM), adaptive cumulative sum (ACUSUM), and exponentially weighted moving average (EWMA) schemes, perform well in detecting constant process mean shifts. However, a variety of dynamic mean shifts frequently occur, and few control schemes work efficiently in these situations because of the limited window for catching shifts, particularly when the mean decreases rapidly. This is precisely the case when one uses the residuals from autocorrelated data to monitor the process mean, a feature often referred to as forecast recovery. This thesis focuses on detecting a shift in the mean of a time series when a forecast-recovery dynamic pattern is observed in the mean of the residuals. Specifically, we examine in detail several particular cases of Autoregressive Integrated Moving Average (ARIMA) time series models. We introduce a new upper-sided control chart based on the Exponentially Weighted Moving Average (EWMA) scheme combined with the Fast Initial Response (FIR) feature. To assess chart performance we use the well-established Average Run Length (ARL) criterion, and we develop a non-homogeneous Markov chain method for ARL calculation for the proposed chart. We show numerically that the proposed procedure performs as well as or better than the Weighted Cumulative Sum (WCUSUM) chart introduced by Shu, Jiang and Tsui (2008), and better than the conventional CUSUM, ACUSUM, and Generalized Likelihood Ratio Test (GLRT) charts. The methods are illustrated on molecular weight data from a polymer manufacturing process.

Master of Science (MSc)
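A minimal sketch of the chart type described above: an upper-sided EWMA on residuals with a FIR head start and time-varying control limits. The design constants and the simulated forecast-recovery shift are illustrative; this is not the thesis's exact chart design or its Markov-chain ARL computation.

```python
import numpy as np

def upper_ewma_fir(x, lam=0.2, L=2.7, sigma=1.0, fir_frac=0.5):
    """Upper-sided EWMA chart with a Fast Initial Response: the statistic
    starts at a head-start value instead of zero, and the time-varying
    control limit is tight early on, speeding initial detection.
    Returns the run length (first out-of-control signal) or None."""
    steady_ucl = L * sigma * np.sqrt(lam / (2 - lam))
    z = fir_frac * steady_ucl  # FIR head start
    for t, xt in enumerate(x, start=1):
        z = max(0.0, lam * xt + (1 - lam) * z)  # upper-sided reflection at 0
        ucl = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if z > ucl:
            return t
    return None

# Simulated residuals with a rapidly decaying (forecast-recovery) mean shift at t=50.
rng = np.random.default_rng(6)
resid = rng.normal(0, 1, 200)
resid[50:] += 2.0 * 0.8 ** np.arange(150)  # shift that dies out quickly
print("signal at t =", upper_ewma_fir(resid))
```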
219

Towards a Microsimulation Residential Housing Market Model: Real Estate Appraisal and New Housing Development

Liu, Xudong 10 1900
As a mid-size industrial city in North America, the City of Hamilton has experienced increasing urban sprawl over the past six decades, coupled with population growth and economic development. Studying the various interdependent processes driving the evolution of urban form requires simulation models that offer urban planners and policy-makers an efficient means of evaluating urban development policies. This thesis focuses on modeling efforts towards building a microsimulation residential housing market system for the City of Hamilton. To this end, two major tasks were conducted. First, a state-of-the-art agent-based microsimulation housing market framework was designed. Second, two model components of the microsimulation framework, a real estate appraisal model and a new housing development model, were estimated. The objective of the real estate appraisal model is to assess the market values of existing dwellings based on the housing transactions of the previous period. Three model forms, a traditional hedonic model, a spatial regression model, and a regression kriging model, were estimated for comparison purposes. A series of independent variables describing the characteristics of the dwelling, location, and neighborhood are specified in the explanatory model. Comparison of the estimation results demonstrates that the spatial regression model achieves a higher goodness-of-fit than the traditional hedonic model. In addition, we verified that spatial autocorrelation is present in the residuals of the traditional hedonic model and is explicitly captured by the spatial regression model. In terms of prediction accuracy, the spatial models (SAR and kriging) both achieve some improvement over the traditional hedonic model. Overall, we recommend the SAR model for incorporation into the microsimulation framework, as it provides the best match between predicted and observed values. The new housing development model enables a dynamic housing supply module in the simulation framework by modeling the location and type decisions made during the housing development process each year. A parcel-level, two-tier nested logit model was estimated. The model handles not only the decision to develop a specific vacant residential land parcel but also the choice of development type. Regarding the factors influencing the decision to develop, the model estimation results reveal that land developers are more likely to start a development project in greenfields than in brownfields. For the type choice decision, a variety of variables describing transportation accessibility, residential amenities, and the characteristics of the land parcel and neighborhood are included in the model specifications.

Master of Arts (MA)
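As a small illustration of the baseline appraisal model, the sketch below fits a traditional hedonic regression of log price on dwelling, location, and neighborhood variables; all variable names and data are hypothetical. The thesis's preferred SAR model additionally includes a spatially weighted average of neighboring prices as a regressor (libraries such as PySAL's spreg estimate such models), which is how it captures the spatial autocorrelation left in hedonic residuals.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated transaction data with hypothetical dwelling, location,
# and neighborhood variables.
rng = np.random.default_rng(7)
n = 400
df = pd.DataFrame({
    "sqft": rng.uniform(800, 3500, n),     # dwelling size
    "age": rng.uniform(0, 60, n),          # dwelling age (years)
    "dist_cbd_km": rng.uniform(1, 25, n),  # location: distance to downtown
    "pct_green": rng.uniform(0, 0.6, n),   # neighborhood amenity
})
df["log_price"] = (11 + 0.0004 * df.sqft - 0.004 * df.age
                   - 0.02 * df.dist_cbd_km + 0.5 * df.pct_green
                   + rng.normal(0, 0.15, n))

hedonic = smf.ols("log_price ~ sqft + age + dist_cbd_km + pct_green", df).fit()
print("hedonic R^2:", round(hedonic.rsquared, 3))
```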
220

INFERENCE FOR ONE-SHOT DEVICE TESTING DATA

Ling, Man Ho 10 1900
In this thesis, inferential methods for one-shot device testing data from accelerated life-tests are developed. Due to constraints on time and budget, accelerated life-tests are commonly used to induce more failures within a reasonable amount of test time, yielding more lifetime information that is especially useful in reliability analysis. One-shot devices, which can be used only once because they are destroyed immediately after testing, yield observations only on their condition and not on their actual lifetimes, so only binary response data are observed from a one-shot device testing experiment. Since no failure times are observed, we use the EM algorithm to determine the maximum likelihood estimates of the model parameters. Inference for the reliability at a mission time and for the mean lifetime at normal operating conditions is also developed.

The thesis proceeds as follows. Chapter 2 considers the exponential distribution with a single-stress relationship and develops inferential methods for the model parameters, the reliability, and the mean lifetime. The results obtained by the EM algorithm are compared with those obtained from a Bayesian approach. A one-shot device testing dataset is analyzed by the proposed method and presented as an illustrative example. Next, in Chapter 3, the exponential distribution with a multiple-stress relationship is considered and the corresponding inferential results are developed. The jackknife technique is described for bias reduction of the developed estimates. Interval estimation for the reliability and the mean lifetime is also discussed, based on the observed information matrix, the jackknife technique, the parametric bootstrap method, and a transformation technique. Again, we present an example illustrating all the inferential methods developed in the chapter. Chapter 4 considers point and interval estimation for one-shot device testing data under the Weibull distribution with a multiple-stress relationship, and illustrates the application of the proposed methods in a study involving the development of tumors in mice with respect to risk factors such as sex, strain of offspring, and dose effects of benzidine dihydrochloride. A Monte Carlo simulation study is also carried out to evaluate the performance of the EM estimates for different levels of reliability and different sample sizes. Chapter 5 describes a general algorithm for determining the optimal design of an accelerated life-test plan for a one-shot device testing experiment, based on the asymptotic variance of the estimated reliability at a specific mission time. A numerical example illustrates the application of the algorithm. Finally, Chapter 6 presents some concluding remarks and additional research problems of interest for further study.

Doctor of Philosophy (PhD)
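The EM step described above can be sketched for the simplest case, a single stress level with exponential lifetimes: the E-step replaces each unobserved lifetime with its conditional expectation given the binary outcome, and the M-step applies the complete-data exponential MLE. The test times and rate below are simulated; the thesis links the rate to stress covariates, which this sketch omits.

```python
import numpy as np

def em_exponential_one_shot(tau, failed, n_iter=200):
    """EM for one-shot device data with exponential lifetimes. tau[i] is the
    test time of device i and failed[i] indicates failure by tau[i]; actual
    lifetimes are never observed."""
    lam = 1.0 / tau.mean()  # crude starting value for the rate
    for _ in range(n_iter):
        # E-step: expected lifetime given the binary outcome.
        # E[T | T <= tau] = 1/lam - tau*exp(-lam*tau)/(1 - exp(-lam*tau))
        e_fail = 1 / lam - tau * np.exp(-lam * tau) / (1 - np.exp(-lam * tau))
        e_surv = tau + 1 / lam  # E[T | T > tau], by memorylessness
        expected_t = np.where(failed, e_fail, e_surv)
        # M-step: complete-data MLE of the exponential rate.
        lam = len(tau) / expected_t.sum()
    return lam

rng = np.random.default_rng(8)
true_lam, n = 0.05, 2000
lifetimes = rng.exponential(1 / true_lam, n)  # latent, never observed directly
tau = rng.uniform(5, 60, n)                   # inspection (test) times
lam_hat = em_exponential_one_shot(tau, lifetimes <= tau)
print(f"true rate {true_lam}, EM estimate {lam_hat:.4f}")
```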
