121 |
Monte Carlo simulation with parametric and nonparametric analysis of covariance for nonequivalent control groups. Bender, Mary. January 1987.
Many parametric statistical models have been designed to measure change in nonequivalent control group studies, but because of assumption violations and potential artifacts, no single form of analysis is always appropriate. While the parametric analysis of covariance and parametric ANCOVAs with a covariate correction are among the analyses most frequently used in nonequivalent control group research, comparative studies with nonparametric counterparts should be conducted and their results compared with those of the more commonly used forms of analysis.
The current investigation studied and compared the application of four ANCOVA models: the parametric, the covariate-corrected parametric, the rank transform, and the covariate-corrected rank transform. Population parameters were established; sample parameter intervals determined by Monte Carlo simulation were examined; and a best ANCOVA model was systematically and theoretically determined in light of population assumption violations, reliability of the covariate correction, the width of the sample probability level intervals, true parent population parameters, and results of robust regression.
Results of data exploration on the parent population revealed that, based on assumptions, the covariate-corrected ANCOVAs are preferred over both the parametric and rank analyses. A reliability coefficient of r = .83 also indicated that a covariate-corrected ANCOVA is effective in error reduction. Robust regression indicated that outliers in the data set impacted the regression equation for both parametric models, making the selection of either model questionable.
The tightest probability level interval for the samples serves to delineate the model with the greatest convergence of probability levels, and, theoretically, the most stable model. Results of the study indicated that, because the covariate-corrected rank ANCOVA had by far the tightest interval, it is the preferred model. In addition, the probability level interval of the covariate-corrected rank model is the only model interval that contained the true population parameter.
Results of the investigation clearly indicate that the covariate-corrected rank ANCOVA is the model of choice for this nonequivalent control group study. While its use has yet to be reported in the literature, the covariate-corrected rank analysis of covariance provides a viable alternative for researchers who must rely upon intact groups for the answers to their research questions. / Ph. D.
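The contrast between the parametric ANCOVA and its rank-transform counterpart (rank both the outcome and the covariate before fitting, in the Conover-Iman style) is easy to sketch. A minimal illustration in Python; the simulated data, effect sizes, and sample sizes below are invented for illustration and are not the dissertation's design:

```python
import numpy as np

def ancova_group_effect(y, group, x):
    """Fit y = b0 + b1*group + b2*x by OLS; return the group coefficient b1."""
    X = np.column_stack([np.ones_like(x), group, x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def rank_transform(v):
    """Ranks 1..n of v (no ties with continuous data): the rank transform."""
    order = np.argsort(v)
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = np.arange(1, len(v) + 1)
    return ranks

rng = np.random.default_rng(0)
n = 200
group = np.repeat([0.0, 1.0], n)
x = rng.normal(0, 1, 2 * n) + 0.5 * group   # covariate differs by group: nonequivalence
y = 2.0 + 1.0 * group + 1.5 * x + rng.normal(0, 1, 2 * n)  # true group effect = 1.0

b_param = ancova_group_effect(y, group, x)                                # parametric ANCOVA
b_rank = ancova_group_effect(rank_transform(y), group, rank_transform(x))  # rank ANCOVA
```

The covariate-corrected variants studied in the dissertation additionally adjust the covariate for its unreliability before fitting; that correction is omitted here.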
|
122 |
Evaluating the influence of establishing pine forests and switchgrass fields on local and global climate. Ahlswede, Benjamin James. 18 May 2021.
Humans have extensively altered terrestrial surfaces through land-use and land-cover change. This change has increased the food, fiber, fuel, and wood provisioned by ecosystems to support the human population. Unfortunately, it has also altered climate through carbon emissions and changes in the surface energy balance. Consequently, maximizing both the provisioning and climate-regulation services of terrestrial ecosystems is a grand challenge facing a growing global population living in a changing climate. Pine forests planted for timber and carbon storage and switchgrass fields planted for bioenergy are two land-cover types that can potentially be used for climate mitigation. Importantly, both are highly productive systems representing contrasts in albedo (grass is brighter than pine) and vegetation height (pine is taller than grass), along with unknown differences in carbon and water balance that influence local to global climate. Here I use eddy-covariance data to investigate how a transition from a perennial bioenergy crop (switchgrass) to a planted pine plantation alters local surface temperature, global carbon dioxide concentrations, and the global energy balance. First, I found that switchgrass and pine ecosystems have very similar local surface temperatures, especially during the grass growing season. After the switchgrass is harvested, the surface temperature of the pine forest is much lower than that of the switchgrass field, where no vegetation remains to facilitate the evaporation of water. The surface temperature of a bare-ground system (a recent clear-cut) was also high relative to the pine and pre-harvest switchgrass ecosystems. This illustrates the importance of maintaining vegetation cover to reduce local surface temperature.
Second, I found that the 30-year mean change in global energy balance (i.e., radiative forcing) from planting a pine ecosystem rather than a switchgrass field was positive (pine warms climate) when considering changes in albedo and carbon measured using eddy-covariance systems. When including harvested carbon, pine and switchgrass can have similar global radiative forcing if all harvested pine carbon is stored, but harvested switchgrass carbon is burned. However, no scenarios I explored resulted in a strong negative radiative forcing by the pine ecosystem relative to the switchgrass field. These results show that afforestation or reforestation in the eastern United States may not result in any climate benefit over planting a switchgrass field. However, the presence of vegetation in both ecosystem types offers a clear benefit by cooling local surface temperatures. / Doctor of Philosophy / Humans are changing the Earth's climate by using oil and gas as fuel that emits greenhouse gases, mainly carbon dioxide, into the atmosphere. Planting trees to reestablish forests is a natural solution for climate change because forests absorb carbon dioxide from the air, but reforestation also changes the Earth's climate in other ways. For example, forests are generally darker than crops and grasses and absorb more sunlight, which traps energy in the atmosphere that can warm global temperature. These non-carbon effects can potentially offset the climate benefit from absorbed carbon dioxide. An alternative natural climate solution is to replace oil and gas with fuels derived from plants, known as bioenergy. Here I compared the local and global climate influence of a tree plantation (loblolly pine) to a bioenergy crop (switchgrass). I found that the local temperature of pine and switchgrass were similar in the summer when the grass was growing, and both were cooler than bare-ground, which was unable to evaporate and transpire water to the atmosphere. 
Over 30 years, I found that pine and switchgrass absorb similar amounts of carbon. The pine forest absorbs more carbon than switchgrass when it is fully grown but releases carbon during the first five years of growth. As a switchgrass field is brighter than a pine forest, planting a pine forest instead of a switchgrass field warms the Earth's climate. However, assuming no carbon from the harvested trees is released to the atmosphere, the pine and switchgrass have the same influence on global climate. My findings show that a pine plantation and a bioenergy crop can have similar climate benefits when carbon is stored in forests.
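The albedo contribution to a radiative forcing comparison of this kind follows a standard back-of-the-envelope form: reflected shortwave lost or gained locally, scaled to the global mean. A hedged sketch, in which every number (insolation, atmospheric transmittance, albedo values, patch area) is an assumed round figure and not a value measured in the dissertation:

```python
SOLAR_IN = 340.0      # global mean top-of-atmosphere insolation, W m^-2 (assumed)
TRANSMITTANCE = 0.85  # fraction of reflected shortwave escaping to space (assumed)
A_EARTH = 5.1e14      # Earth's surface area, m^2

def albedo_rf(delta_albedo, area_m2):
    """Global-mean radiative forcing (W m^-2) from a local albedo change.
    A darker surface (negative delta_albedo) reflects less sunlight,
    giving positive forcing, i.e. warming."""
    return -SOLAR_IN * TRANSMITTANCE * delta_albedo * (area_m2 / A_EARTH)

# pine (~0.13) replacing brighter switchgrass (~0.20) over 1 km^2:
# albedo drops by 0.07, so the forcing should come out positive (warming)
rf = albedo_rf(-0.07, 1e6)
```

A full comparison like the dissertation's also integrates the carbon flux difference over 30 years; only the albedo term is sketched here.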
|
123 |
Thinning of point processes: covariance analyses. Chandramohan, Jagadeesh. January 1982.
This dissertation addresses a class of problems in point process theory called 'thinning'. By thinning we mean an operation whereby a point process is split into two point processes by some rule. We obtain the covariance structure between the thinned processes under various thinning rules. We first obtain this structure for independent Bernoulli thinning of an arbitrary point process. We show that if the point process is a renewal (stationary or ordinary) process, the thinned processes will be uncorrelated if and only if the renewal process is Poisson in which case the thinned processes are independent. Thus, we have a situation where zero correlation implies independence. We also show that while the intervals between events in the thinned processes may be uncorrelated, the counts need not be.
Next, we obtain the covariance structure between the thinned processes resulting from a mark dependent thinning of a Markov renewal process with a Polish mark space. These results are used to study the overflow queue, where we show that in equilibrium the input and overflow processes are uncorrelated, as are the output and overflow processes. We thus provide an example of two uncorrelated but dependent renewal processes, neither of which is Poisson, which produce a Poisson process when superposed.
Next, we study Bernoulli thinning of an alternating Markov process and show that the thinned processes are uncorrelated if and only if the process is Poisson, in which case the thinned processes are independent. Finally, we obtain the covariance structure obtained when a renewal process is thinned by an independent Markov chain. We show that if the renewal process is Poisson and the chain is stationary, the thinned processes will be uncorrelated if and only if the Markov chain is a Bernoulli process. We also show how to design the chain to obtain a pre-specified covariance function.
We then close the dissertation by summarizing the work presented and indicating areas of further research including a short discussion of the use of Laplace functionals in the determination of joint distributions of thinned processes. / Ph. D.
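The Poisson characterization is easy to observe numerically. A sketch, with arbitrary rate and thinning probability, contrasting Bernoulli thinning of a Poisson process (uncorrelated thinned counts) with thinning of a non-Poisson Erlang renewal process (correlated thinned counts):

```python
import numpy as np

rng = np.random.default_rng(42)
T, p, reps = 10.0, 0.4, 2000

def thinned_counts(interarrival_sampler):
    """Counts (N1, N2) after Bernoulli(p) thinning of a renewal process on [0, T]."""
    n1 = np.empty(reps)
    n2 = np.empty(reps)
    for i in range(reps):
        times = np.cumsum(interarrival_sampler(200))  # 200 gaps safely covers [0, T]
        n = np.searchsorted(times, T)                 # number of events before time T
        keep = rng.random(n) < p                      # independent Bernoulli thinning
        n1[i], n2[i] = keep.sum(), n - keep.sum()
    return n1, n2

# Poisson process: exponential interarrivals with mean 0.2 (rate 5)
n1, n2 = thinned_counts(lambda m: rng.exponential(0.2, m))
corr_poisson = np.corrcoef(n1, n2)[0, 1]   # near zero: thinned streams independent

# non-Poisson renewal process: Erlang(2) interarrivals with the same mean rate
m1, m2 = thinned_counts(lambda m: rng.gamma(2.0, 0.1, m))
corr_erlang = np.corrcoef(m1, m2)[0, 1]    # clearly nonzero
```

The Erlang counts are underdispersed relative to Poisson, which is why their thinned counts come out negatively correlated.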
|
124 |
Missing values in covariance in the case of the randomized block. Shannon, Catherine. January 1948.
The formula and theory for estimating a missing value in the case of covariance in a randomized block have been presented in this paper. It has also been found that the formula given corresponds to Yates' formula for a missing value in a randomized block when there is only one variable present in the experiment. / Master of Science
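Yates' missing-value formula for a randomized block with r blocks and t treatments is short enough to state in code: x = (rB + tT - G) / ((r - 1)(t - 1)), where B, T, and G are the block, treatment, and grand totals of the observed values. A sketch, with an invented additive table (for noise-free additive data the formula recovers the deleted cell exactly):

```python
def yates_missing_value(table, i, j):
    """Estimate the missing cell (block i, treatment j) of an r x t
    randomized-block table by Yates' formula; the stored value at (i, j)
    is ignored entirely, so it may be None."""
    r, t = len(table), len(table[0])
    B = sum(v for k, v in enumerate(table[i]) if k != j)
    T = sum(row[j] for k, row in enumerate(table) if k != i)
    G = sum(v for k, row in enumerate(table)
            for m, v in enumerate(row) if (k, m) != (i, j))
    return (r * B + t * T - G) / ((r - 1) * (t - 1))

blocks = [0.0, 1.0, 2.0]
treats = [10.0, 20.0, 30.0, 40.0]
table = [[b + tr for tr in treats] for b in blocks]  # purely additive data
table[1][2] = None                                   # pretend this cell was lost
est = yates_missing_value(table, 1, 2)
print(est)  # → 31.0, the additive value b_1 + tau_2
```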
|
125 |
Combining Data-driven and Theory-guided Models in Ensemble Data Assimilation. Popov, Andrey Anatoliyevich. 23 August 2022.
There once was a dream that data-driven models would replace their theory-guided counterparts. We have awoken from this dream. We now know that data cannot replace theory. Data-driven models still have their advantages, mainly computational efficiency, but they also provide some special sauce that is unreachable by our current theories. This dissertation aims to provide a way in which the accuracy of theory-guided models and the computational efficiency of data-driven models can be combined. This combination of theory-guided and data-driven lets us draw on ideas from a much broader set of disciplines, and can help pave the way for robust and fast methods. / Doctor of Philosophy / As an illustrative example, take the problem of predicting the weather. Typically a supercomputer will run a model several times to generate predictions a few days into the future. Sensors, such as those on satellites, then pick up observations at a few points on the globe that are not representative of the whole atmosphere. These observations are combined, "assimilated", with the computer model predictions to create a better representation of our current understanding of the state of the Earth. This predict-assimilate cycle is repeated every day and is called (sequential) data assimilation. The prediction step was traditionally performed by a computer model based on rigorous mathematics. With the advent of big data, many wondered whether models based purely on data would take over. This has not happened. This thesis is concerned with taking traditional mathematical models and running them alongside data-driven models in the prediction step, then building a theory in which both can be used in data assimilation at the same time, so as to decrease computational cost without a drop in accuracy.
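The assimilation half of the predict-assimilate cycle can be sketched with a toy stochastic ensemble Kalman filter update for a scalar state. This is a generic textbook step, not the dissertation's hybrid method, and all numbers are invented:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_var, rng):
    """Stochastic (perturbed-observation) EnKF analysis step for a scalar
    state that is observed directly."""
    n = ensemble.size
    P = ensemble.var(ddof=1)      # forecast error variance, estimated from the ensemble
    K = P / (P + obs_var)         # Kalman gain
    perturbed = obs + rng.normal(0, np.sqrt(obs_var), n)  # one perturbed obs per member
    return ensemble + K * (perturbed - ensemble)

rng = np.random.default_rng(1)
truth = 3.0
forecast = rng.normal(0.0, 2.0, 100)   # biased, spread-out prior ensemble ("prediction")
obs = truth + rng.normal(0, 0.5)       # a noisy observation of the truth
analysis = enkf_update(forecast, obs, 0.25, rng)
```

The analysis ensemble should sit closer to the truth and have less spread than the forecast; in the hybrid setting described above, the forecast ensemble would come from theory-guided and data-driven models run side by side.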
|
126 |
Comparaison d'approches d'ajustement pour les facteurs confondants dans le cadre d'études observationnelles à l'aide de données administratives / Comparison of adjustment approaches for confounders in observational studies using administrative data. Benasseur, Imane. 27 January 2024.
Propensity score (PS) methods are common for estimating the effect of an exposure on an outcome using observational data. However, when analyzing administrative data, the application of PS methods raises questions regarding how to select confounders and how to adjust for unmeasured ones. Our objective is to compare different methods for confounding adjustment in the context of observational studies based on administrative data. Four methods for selecting confounders were compared using simulation studies: the high-dimensional propensity score (hdPS), a modified hdPS (hdPS_0), the outcome-adaptive lasso (OAL), and scalable collaborative targeted maximum likelihood estimation (SC-TMLE). For hdPS, hdPS_0, and OAL, four PS adjustment approaches were considered: 1) inverse probability of treatment weighting (IPTW), 2) matching, 3) matching weights, and 4) covariate adjustment in the outcome model. Simulations based on synthetically generated data with 1000 observations and 100 potential confounders were performed. The results indicate that the performance of the adjustment methods varies from one scenario to another, but IPTW globally achieved the greatest reduction in bias and mean squared error. Moreover, none of the confounder selection methods appears able to adjust for unmeasured confounders. Finally, the robustness of the hdPS algorithm can be greatly improved by eliminating step 2 (hdPS_0).
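As an illustration of the IPTW adjustment compared in these simulations, here is a minimal sketch: a single measured confounder, a logistic propensity model fit by Newton-Raphson, and invented effect sizes, none of which reflect the thesis's actual simulation design:

```python
import numpy as np

def logistic_fit(x, t, iters=25):
    """Logistic regression of t on x (with intercept) by Newton-Raphson."""
    X = np.column_stack([np.ones(len(x)), x])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        H = X.T @ (X * (p * (1 - p))[:, None])   # Hessian of the log-likelihood
        beta += np.linalg.solve(H, X.T @ (t - p))
    return beta

rng = np.random.default_rng(3)
n = 5000
x = rng.normal(0, 1, n)                                    # measured confounder
t = (rng.random(n) < 1 / (1 + np.exp(-1.2 * x))).astype(float)  # confounded treatment
y = 1.0 * t + 2.0 * x + rng.normal(0, 1, n)                # true treatment effect = 1.0

naive = y[t == 1].mean() - y[t == 0].mean()                # biased upward by x

beta = logistic_fit(x, t)
ps = 1 / (1 + np.exp(-(beta[0] + beta[1] * x)))            # estimated propensity scores
w = t / ps + (1 - t) / (1 - ps)                            # IPTW weights
iptw = (np.average(y[t == 1], weights=w[t == 1])
        - np.average(y[t == 0], weights=w[t == 0]))        # normalized (Hajek) IPTW
```

Because the only confounder here is measured, IPTW removes essentially all the bias; the thesis's point is precisely that no selection method rescues this when confounders are unmeasured.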
|
127 |
Tabela de covariância : um mapeamento rápido e automático de continuidade espacial / Covariance table: fast and automatic mapping of spatial continuity. Kloeckner, Jonas. January 2018.
Covariance models are essential geostatistical tools for mapping spatial continuity. The current approach pursues a licit spatial continuity model with minimal or even no user interference. Aligned with this modern view, we propose a covariance table that replaces, in practice, the traditional explicitly defined covariance model. The covariance table is obtained through a three-step workflow: interpolate the dataset to fill a regular grid, auto-convolve via the Fast Fourier Transform algorithm, and back-transform to the spatial domain. The base model used to extract covariances is the key point of comparison with previous methods that proposed covariance tables. The results are satisfactory, both in the statistical validation of the method and in the speed with which a spatial continuity analysis is obtained. A three-dimensional case study illustrates the practical application to kriging and geostatistical simulation in comparison with traditional spatial modeling.
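The FFT step of the workflow rests on the Wiener-Khinchin relation: the autocovariance is the inverse transform of the power spectrum. A hedged sketch of that step alone, on a 2D synthetic grid (a smoothed noise field stands in for the interpolated dataset, and the circular edge treatment is a simplification):

```python
import numpy as np

def covariance_table(grid):
    """Empirical covariance map of a filled regular grid via FFT
    (Wiener-Khinchin: autocovariance = IFFT of the power spectrum)."""
    z = grid - grid.mean()
    F = np.fft.fftn(z)
    acf = np.fft.ifftn(F * np.conj(F)).real / z.size   # circular autocovariance
    return np.fft.fftshift(acf)                        # put lag zero at the center

rng = np.random.default_rng(7)
raw = rng.normal(size=(64, 64))
# crude spatial correlation: 5x5 moving-average smoothing of white noise
k = np.ones((5, 5)) / 25.0
smooth = np.fft.ifft2(np.fft.fft2(raw) * np.fft.fft2(k, s=raw.shape)).real

table = covariance_table(smooth)
center = table[32, 32]   # lag-0 entry: equals the grid variance, and bounds all lags
```

In 3D the same three lines apply unchanged, since `fftn`/`ifftn` handle any number of axes.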
|
128 |
A study on the DCC-GARCH model's forecasting ability with Value-at-Risk applications on the Scandinavian foreign exchange market. Andersson-Säll, Tim; Lindskog, Johan. January 2019.
This thesis treats the DCC-GARCH model's forecasting ability and Value-at-Risk applications on the Scandinavian foreign exchange market. The estimated models were based on daily opening foreign exchange spot rates over the period 2004-2013, which captures the financial crisis of 2008 and the Eurozone crisis of the early 2010s. The forecasts were performed on a one-day rolling window in 2014. The results show that the DCC-GARCH model accurately predicted the fluctuation in the conditional correlation, although not with the correct magnitude. Furthermore, the DCC-GARCH model shows good Value-at-Risk forecasting performance for different portfolios containing the Scandinavian currencies.
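The univariate building block of the DCC-GARCH model, a GARCH(1,1) volatility filter feeding a one-day Gaussian VaR, can be sketched as follows. Parameters and returns here are simulated with assumed values, and the DCC correlation layer that links the marginal models is omitted:

```python
import numpy as np

def garch11_filter(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1) with fixed (assumed) parameters."""
    var = np.empty(len(returns))
    var[0] = returns.var()   # crude initialization
    for t in range(1, len(returns)):
        var[t] = omega + alpha * returns[t - 1] ** 2 + beta * var[t - 1]
    return var

rng = np.random.default_rng(11)
n, omega, alpha, beta = 1000, 0.05, 0.08, 0.90   # assumed GARCH(1,1) parameters

# simulate a GARCH(1,1) return series with Gaussian innovations
true_var = np.empty(n)
r = np.empty(n)
true_var[0] = omega / (1 - alpha - beta)         # unconditional variance
r[0] = np.sqrt(true_var[0]) * rng.standard_normal()
for t in range(1, n):
    true_var[t] = omega + alpha * r[t - 1] ** 2 + beta * true_var[t - 1]
    r[t] = np.sqrt(true_var[t]) * rng.standard_normal()

var_path = garch11_filter(r, omega, alpha, beta)
var_95 = 1.645 * np.sqrt(var_path)   # one-day 95% Gaussian VaR (reported positive)
viol = np.mean(r < -var_95)          # violation rate, expected near 5%
```

Backtesting VaR this way, by counting violations against the nominal level, mirrors the evaluation used in the thesis.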
|
129 |
Caractérisation géométrique d'organes tubulaires / Geometrical analysis of tubular organs. Grelard, Florent. 4 December 2017.
This thesis is dedicated to the study of tubular organs, such as blood vessels, bronchi, or neurons. Several diseases, such as chronic obstructive pulmonary disease (COPD) for the bronchi, distort the structure of tubular organs. Automatic tools are necessary to make precise, reproducible measurements and to track an organ over time. The main goal of this thesis is to define new tools for the automatic characterization of tubular organs. This characterization is generally based on geometrical analysis of the segmented organ. There are various state-of-the-art methods for geometrical measurements, but some of them require extracting alternative representations of the organ. In this thesis, we focus on two of these: curvilinear skeletons and 2D cross-sectional planes computed orthogonally to the axis of the organ. Our first contribution is a method for the precise estimation of orthogonal planes. We also propose a skeletonization algorithm specific to tubular organs, as well as methods for improving existing skeletons. Finally, we present work in progress on a method for filtering tubular structures in grayscale images.
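Estimating a plane orthogonal to the organ's axis reduces to estimating a tangent along the discrete centerline and completing it to an orthonormal basis. A sketch on a synthetic helix centerline; the finite-difference window and the test curve are arbitrary choices for illustration, not the thesis's estimator:

```python
import numpy as np

def orthogonal_plane_basis(curve, i, half_window=2):
    """Unit tangent at point i of a discrete 3D centerline, estimated over a
    symmetric window, plus two unit vectors spanning the orthogonal plane."""
    lo, hi = max(0, i - half_window), min(len(curve) - 1, i + half_window)
    t = curve[hi] - curve[lo]
    t = t / np.linalg.norm(t)
    # pick any vector not parallel to t, then Gram-Schmidt it against t
    a = np.array([1.0, 0.0, 0.0]) if abs(t[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = a - (a @ t) * t
    u /= np.linalg.norm(u)
    v = np.cross(t, u)   # completes the right-handed orthonormal frame
    return t, u, v

# a helix as a stand-in for a tubular organ's centerline
s = np.linspace(0, 4 * np.pi, 100)
curve = np.column_stack([np.cos(s), np.sin(s), 0.3 * s])
t, u, v = orthogonal_plane_basis(curve, 50)
```

The cross-section of the organ can then be resampled in the plane spanned by `u` and `v` at that centerline point.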
|
130 |
Online machine learning methods for visual tracking / Algorithmes d'apprentissage en ligne pour le suivi visuel. Qin, Lei. 5 May 2014.
We study the challenging problem of tracking an arbitrary object in video sequences with no prior knowledge other than a template annotated in the first frame. To tackle this problem, we build a robust, real-time tracking system that remains reliable even in difficult conditions, such as when the target leaves and re-enters the scene (re-identification). The system consists of the following components. First, for image region representation, we propose improvements to the region covariance descriptor, taking the characteristics of the specific object into consideration before constructing it, so that the representation can adapt to variations in the target's appearance. Second, for building the object appearance model, we combine the merits of generative and discriminative models by organizing them in a detection cascade: generative models are deployed in the early layers to eliminate most easy candidates, whereas discriminative models occupy the later layers to distinguish the object from a few similar "distracters". Partial Least Squares Discriminant Analysis (PLS-DA) is employed for building the discriminative appearance models. Third, for updating the generative models, we propose a weakly supervised model-updating method based on cluster analysis using the mean-shift gradient density estimation procedure. Fourth, a novel online PLS-DA learning algorithm is developed for incrementally updating the discriminative models. The final tracking system, integrating all of these building blocks, is robust to most challenges in visual tracking. Comparative experiments on challenging video sequences show that the proposed system performs favorably with respect to a number of state-of-the-art methods.
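The region covariance descriptor referred to above (Tuzel et al.) is the covariance matrix of per-pixel feature vectors over an image patch. A minimal sketch with a generic feature set (coordinates, intensity, gradient magnitudes); the thesis's improved descriptor adds object-specific features not shown here:

```python
import numpy as np

def region_covariance(patch):
    """Region covariance descriptor: covariance of the per-pixel feature
    vectors [x, y, I, |dI/dx|, |dI/dy|] over a grayscale patch."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]                  # pixel coordinates
    gy, gx = np.gradient(patch.astype(float))    # intensity derivatives
    feats = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                      np.abs(gx).ravel(), np.abs(gy).ravel()])
    return np.cov(feats)                         # 5 x 5 symmetric PSD matrix

rng = np.random.default_rng(5)
patch = rng.random((16, 16))   # a stand-in for a tracked image region
C = region_covariance(patch)
```

Because the descriptor lives on the manifold of positive semidefinite matrices, candidate regions are usually compared with a manifold metric (e.g. a generalized-eigenvalue distance) rather than the Euclidean one.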
|