141 |
Statistical modelling of data from insect studies / Modelagem estatística de dados provenientes de estudos em entomologia. Moral, Rafael de Andrade. 19 December 2017 (has links)
Data from insect studies may present different features. Univariate responses may be analysed using generalized linear models (continuous and discrete data), survival models (time-until-event data), mixed-effects models (longitudinal data), among other methods. These models may be used to analyse data from experiments which assess complex ecological processes, such as competition and predation. In that sense, computational tools are useful for researchers in several fields, e.g., insect biology and physiology, applied ecology and biological control. Using different datasets from entomology as motivation, as well as other types of datasets for illustration purposes, this work aimed to develop new modelling frameworks and goodness-of-fit assessment tools. We propose accelerated failure time mixed models, with simultaneous modelling of the location and scale parameters through regressors, to analyse time-until-attack data from a choice-test experiment. We use the exponential, Weibull and exponentiated-Weibull models, and assess goodness-of-fit using half-normal plots with simulation envelopes. These plots are the subject of an entire chapter devoted to an R package, called hnp, developed to implement them; datasets from different types of experiments are used to illustrate the plots and the package. A bivariate extension of the N-mixture modelling framework is proposed to analyse longitudinal count data for two species from the same food web that may interact directly or indirectly, with example datasets from ecological studies. An advantage of this modelling framework is the computation of an asymmetric correlation coefficient, which may be used by ecologists to study the degree of association between species. The jointNmix R package was also developed to implement the estimation process for these models. Finally, we propose a goodness-of-fit assessment tool for bivariate models, analogous to the half-normal plot with a simulation envelope, and illustrate the approach with simulated data and insect competition data. This tool is also implemented in an R package, called bivrp. All software developed in this thesis is made available freely on the Comprehensive R Archive Network (CRAN).
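As a rough illustration of the half-normal plot with a simulation envelope described above, the sketch below builds one by hand in base R for a toy Poisson GLM. It is only a minimal sketch under invented data; the hnp package automates and generalizes this construction.

```r
## Minimal half-normal plot with a simulated envelope for a Poisson GLM
## (invented data; the hnp package automates this).
set.seed(1)
d   <- data.frame(x = runif(60))
d$y <- rpois(60, exp(1 + 2 * d$x))                    # toy counts
fit <- glm(y ~ x, family = poisson, data = d)

n    <- nrow(d)
res  <- sort(abs(residuals(fit, type = "deviance")))  # ordered absolute residuals
half <- qnorm((1:n + n - 1/8) / (2 * n + 1/2))        # half-normal quantiles

## Simulate 99 datasets from the fitted model, refit, and keep ordered |residuals|
env <- replicate(99, {
  dd    <- d
  dd$y  <- simulate(fit)[[1]]
  refit <- glm(y ~ x, family = poisson, data = dd)
  sort(abs(residuals(refit, type = "deviance")))
})

lo  <- apply(env, 1, quantile, probs = 0.025)
hi  <- apply(env, 1, quantile, probs = 0.975)
med <- apply(env, 1, median)

plot(half, res, xlab = "Half-normal quantiles", ylab = "|Deviance residuals|")
lines(half, lo, lty = 2); lines(half, hi, lty = 2); lines(half, med)
## Points escaping the envelope indicate lack of fit.
```

The residual type and envelope quantiles above are arbitrary choices for the sketch; the thesis's hnp package wraps this kind of loop for many model classes.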
|
142 |
EM algorithm for Markov chains observed via Gaussian noise and point process information: Theory and case studies. Damian, Camilla; Eksi-Altay, Zehra; Frey, Rüdiger. January 2018 (has links) (PDF)
In this paper we study parameter estimation via the Expectation Maximization (EM) algorithm for a continuous-time hidden Markov model with diffusion and point process observation. Inference problems of this type arise for instance in credit risk modelling. A key step in the application of the EM algorithm is the derivation of finite-dimensional filters for the quantities that are needed in the E-Step of the algorithm. In this context we obtain exact, unnormalized and robust filters, and we discuss their numerical implementation. Moreover, we propose several goodness-of-fit tests for hidden Markov models with Gaussian noise and point process observation. We run an extensive simulation study to test speed and accuracy of our methodology. The paper closes with an application to credit risk: we estimate the parameters of a hidden Markov model for credit quality where the observations consist of rating transitions and credit spreads for US corporations.
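The filters needed in the E-step can be illustrated, in a much simplified setting, by the forward recursion for a discrete-time finite-state chain observed in Gaussian noise. This is only a hedged analogue of the paper's continuous-time diffusion and point-process setting, with all parameter values invented:

```r
## Simplified discrete-time analogue (not the paper's continuous-time filters):
## forward filtering of a two-state Markov chain observed in Gaussian noise.
set.seed(2)
P     <- matrix(c(0.95, 0.05,
                  0.10, 0.90), 2, 2, byrow = TRUE)  # transition matrix
mu    <- c(-1, 1)                                   # state-dependent means
sigma <- 0.8
Tlen  <- 200

X <- numeric(Tlen); X[1] <- 1
for (t in 2:Tlen) X[t] <- sample(1:2, 1, prob = P[X[t - 1], ])
Y <- rnorm(Tlen, mean = mu[X], sd = sigma)          # noisy observations

## Forward recursion: filtered probabilities p(X_t | Y_1, ..., Y_t)
filt <- matrix(NA, Tlen, 2)
pred <- c(0.5, 0.5)                                 # initial distribution
for (t in 1:Tlen) {
  lik       <- dnorm(Y[t], mean = mu, sd = sigma)   # Gaussian emission densities
  post      <- pred * lik
  filt[t, ] <- post / sum(post)                     # normalise
  pred      <- as.vector(filt[t, ] %*% P)           # one-step prediction
}
head(filt)  # filtered state probabilities feeding a (discrete-time) E-step
```

Smoothed probabilities and expected transition counts, which an E-step also requires, would come from an additional backward pass; the paper derives the corresponding exact, unnormalized filters directly in continuous time.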
|
143 |
ASSESSING THE MODEL FIT OF MULTIDIMENSIONAL ITEM RESPONSE THEORY MODELS WITH POLYTOMOUS RESPONSES USING LIMITED-INFORMATION STATISTICS. Li, Caihong Rosina. 01 January 2019 (has links)
Under item response theory, three types of limited-information goodness-of-fit test statistics – M2, Mord, and C2 – have been proposed to assess model-data fit when data are sparse. However, evaluation of the performance of these statistics under multidimensional item response theory (MIRT) models with polytomous data has been limited. The current study showed that M2 and C2 were well calibrated under true-model conditions and were powerful under misspecified-model conditions. Mord was not well calibrated when the number of response categories was more than three. The RMSEA indices based on M2 and C2 (RMSEA2 and RMSEAC2) are useful tools for evaluating approximate fit.
The second study aimed to evaluate the psychometric properties of the Religious Commitment Inventory-10 (RCI-10; Worthington et al., 2003) within the IRT framework and to estimate C2 and its RMSEA to assess global model fit. Results showed that the RCI-10 was best represented by a bifactor model. The instrument could be scored as unidimensional notwithstanding the presence of multidimensionality, whereas a two-factor correlated solution should not be used. Study two also showed that religious commitment was a risk factor for intimate partner violence, whereas spirituality was a protective factor against violence. Greater alcohol use was related to more abusive behaviors. Implications of the two studies are discussed.
|
144 |
多維列聯表離群細格的偵測研究 / Identification of Outlying Cells in Cross-Classified Tables. 陳佩妘 (Chen, Pei-Yun). Unknown Date (has links)
When fitting a loglinear model to a contingency table, a significant goodness-of-fit statistic can result from the existence of a few outlying cells. Since a simpler model is easier to interpret and conveys more easily understood information about a table than a complicated one, we would like to identify those outliers so that a simpler model would fit the given data set. In this research, a modification of Shih's [1995] procedure is provided, and the revised method is applicable to any type of model for three-way tables. Data sets are simulated to compare the outliers detected by Simonoff's [1988] procedure and by the BMDP program 4F with those detected by our proposed method. Based on the simulation results, our revised procedure outperforms the other two procedures most of the time.
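A common baseline for flagging outlying cells, shown here purely as an illustration of the problem (it is not the modified Shih-type procedure proposed in the thesis), is to fit a loglinear model and inspect standardized Pearson residuals; the two-way table below is invented:

```r
## Flag candidate outlying cells via standardized Pearson residuals
## from an independence loglinear model (invented 3x3 table).
counts <- c(20, 18, 22,
            19, 21, 20,
            18, 20, 60)                     # cell (3,3) made artificially large
tab    <- expand.grid(row = factor(1:3), col = factor(1:3))
tab$n  <- counts

fit <- glm(n ~ row + col, family = poisson, data = tab)   # independence model
tab$std_res <- rstandard(fit, type = "pearson")

## Cells with |standardized residual| well above 2 are candidate outliers
tab[order(-abs(tab$std_res)), ]
```

The thesis's procedure instead targets general loglinear models for three-way tables, but the residual screen above conveys the basic idea of isolating the few cells responsible for a significant lack of fit.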
|
145 |
Axiological Investigations. Olson, Jonas. January 2005 (has links)
The subject of this thesis is formal axiology, i.e., the discipline that deals with structural and conceptual questions about value. The main focus is on intrinsic or final value. The thesis consists of an introduction and six free-standing essays. The purpose of the introduction is to give a general background to the discussions in the essays. The introduction is divided into five sections. Section 1 outlines the subject matter and sketches the methodological framework. Section 2 discusses the supervenience of value, and how my use of that notion squares with the broader methodological framework. Section 3 defends the concept of intrinsic or final value. Section 4 discusses issues in value typology; particularly how intrinsic value relates to final value. Section 5 summarises the essays and provides some specific backgrounds to their respective themes. The six essays are thematically divided into four categories: The first two deal with specific issues concerning analyses of value. Essay 1 is a comparative discussion of competing approaches in this area. Essay 2 discusses, and proposes a solution to, a significant problem for the so called ‘buck-passing’ analysis of value. Essay 3 discusses the ontological nature of the bearers of final value, and defends the view that they are particularised properties, or tropes. Essay 4 defends conditionalism about final value, i.e., the idea that final value may vary according to context. The last two essays focus on some implications of the formal axiological discussion for normative theory: Essay 5 discusses the charge that the buck-passing analysis prematurely resolves the debate between consequentialism and deontology; essay 6 suggests that conditionalism makes possible a reconciliation between consequentialism and moral particularism.
|
146 |
General conditional linear models with time-dependent coefficients under censoring and truncation. Teodorescu, Bianca. 19 December 2008 (has links)
In survival analysis, interest often lies in the relationship between the survival function and a number of covariates. It usually happens that for some individuals we cannot observe the event of interest, due to the presence of right censoring and/or left truncation. A typical example is given by a retrospective medical study, in which one is interested in the time interval between birth and death due to a certain disease. Patients who die of the disease at an early age will rarely have entered the study before death and are therefore left truncated. On the other hand, for patients who are still alive at the end of the study, only a lower bound of the true survival time is known, and these patients are hence right censored.
In the case of censored and/or truncated responses, many models exist in the literature that describe the relationship between the survival function and the covariates (the proportional hazards or Cox model, the log-logistic model, the accelerated failure time model, the additive risks model, etc.). In these models, the regression coefficients are usually assumed to be constant over time. In practice, however, the structure of the data might be more complex, and it might therefore be better to consider coefficients that can vary over time. In the previous example, certain covariates (e.g. age at diagnosis, type of surgery, extension of tumor, etc.) can have a relatively high impact on survival at early ages but a lower influence at higher ages. This motivated a number of authors to extend the Cox model to allow for time-dependent coefficients, or to consider other types of time-dependent coefficient models such as the additive hazards model.
In practice it is of great use to have at hand a method to check the validity of the above-mentioned models.
First, we consider a very general model, which includes as special cases the above-mentioned models (Cox model, additive model, log-logistic model, linear transformation models, etc.) with time-dependent coefficients, and we study parameter estimation by means of a least-squares approach. The response is allowed to be subject to right censoring and/or left truncation.
Secondly, we propose an omnibus goodness-of-fit test of whether the general time-dependent model considered above fits the data. A bootstrap version, used to approximate the critical values of the test, is also proposed.
In this dissertation, for each proposed method, the finite-sample performance is evaluated in a simulation study, and the method is then applied to a real data set.
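A standard way to probe for time-dependent coefficients in the Cox model, using the R survival package, is sketched below. This is not the least-squares estimator or the omnibus bootstrap test proposed in the dissertation, merely a familiar diagnostic in the same spirit:

```r
## Check for time-varying coefficients in a Cox model (survival package);
## not the dissertation's estimator or omnibus test, just a standard diagnostic.
library(survival)

fit <- coxph(Surv(time, status) ~ age + sex, data = lung)

## Test of proportional hazards, i.e. of coefficients constant over time;
## small p-values suggest a time-dependent coefficient is needed.
cox.zph(fit)

## Scaled Schoenfeld residual plots show how each coefficient appears
## to evolve with time.
plot(cox.zph(fit))
```

The general model of the dissertation goes further by estimating the coefficients as functions of time under right censoring and left truncation.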
|
148 |
Resultatpåverkan av olika fördelningar på parametern operationstid vid simuleringsstudier [The effect of different distributions for the operation-time parameter on the results of simulation studies]. Bengtsson, Angelica; Kuc, Arlena. January 2011 (has links)
Discrete event simulation is used to imitate and analyze how systems change over time, with the variation in the system represented by discrete and continuous probability distributions. In the software Simul8, simulation models are created from information collected in production; shifts, operation times and efficiency are examples of the information required for the modelling process. The aim of this bachelor's thesis was to investigate how the choice of probability distribution for the parameter operation time affects the resulting throughput time in flow simulations of a multi-step machining process subject to stochastic fluctuations and disturbances in the individual process steps. The thesis is the result of a case study performed at Volvo Aero Corporation, Sweden, based on input data for a product whose manufacturing sequence consists of 18 machining operations of three process types (automatic, semi-automatic and manual), which depend on operator involvement to different degrees. The process times of the 18 operations were analysed numerically and graphically, and the software Stat:fit was used to find a suitable probability distribution for each operation and to examine which theoretical distributions are appropriate for the three process types.
The recommended distributions per operation were subjected to goodness-of-fit tests (chi-square, Kolmogorov-Smirnov and Anderson-Darling) in Stat:fit and used as the basis for the experimental plan of the simulation study. Five simulation models, with five different choices of distribution, were evaluated in Simul8 according to this plan; the model was validated and verified by a simulation advisor at Volvo Aero, and all runs were statistically assessed with 95% confidence intervals. The case study indicates that the choice of distribution for the operation times has a relatively small effect on the throughput time in complex simulation models, where factors such as production volume and availability (efficiency) have a larger influence. In simpler models without restricted availability, differences between the distribution choices become visible in the simulation results; the distribution of these results is consistent with the central limit theorem, i.e., with a sufficiently large number of observations the results appear normally distributed.
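A rough sketch of the distribution-fitting step described above, using base R and the MASS package rather than Stat:fit and Simul8; the operation times are invented, only two candidate distributions are fitted, and only the Kolmogorov-Smirnov test is shown (the chi-square and Anderson-Darling tests used in the thesis need binning or additional packages):

```r
## Fit candidate distributions to invented operation times and compare them
## (illustrative substitute for the Stat:fit step; not the thesis workflow).
library(MASS)                                        # fitdistr()
set.seed(4)
op_time <- rlnorm(200, meanlog = 1.2, sdlog = 0.3)   # toy operation times (minutes)

fit_ln <- fitdistr(op_time, "lognormal")             # maximum-likelihood fits
fit_ga <- fitdistr(op_time, "gamma")

c(lognormal = fit_ln$loglik, gamma = fit_ga$loglik)  # higher log-likelihood is better

## Kolmogorov-Smirnov test against each fitted candidate (p-values are only
## approximate because the parameters were estimated from the same data)
ks.test(op_time, "plnorm", meanlog = fit_ln$estimate["meanlog"],
        sdlog = fit_ln$estimate["sdlog"])
ks.test(op_time, "pgamma", shape = fit_ga$estimate["shape"],
        rate  = fit_ga$estimate["rate"])
```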
|
149 |
A study of Li-Tsai thought. Li, Ning-yu. 17 August 2012 (has links)
Li Tsai is an important figure in the rationalist thought of the middle and late Ming Dynasty. After the mid-Ming, the doctrine of conscience had lost its original practical aims, and scholars seeking to remedy the shortcomings of Wang's followers put forward new theories to correct the decadent intellectual atmosphere; one such remedy was to raise the Substance of Nature as the guiding concept. These scholars not only integrated the thought of Chu and Wang, but also, drawing on their own moral thinking, broke new ground: they re-examined Wang's criteria and broke through the limitations of Chu's doctrinal system. The transfer of moral principles onto the Substance of Nature that Chien-lo established departs from Wang's categories; it shows how, in late Ming rationalism, moral theory turned from the Substance of Mind to the Substance of Nature, and it reflects both the complex entanglement of politics and society at the time and the search for a new response to the decline of academic thought in the late Ming. Following the Great Learning, Chien-lo proposed the doctrine of chih-hsiu, classified the mind and conscience of Wang's thought as acquired, and placed the mind within a system headed by the Substance of Nature. Unlike Wang's study of the mind, this provides an ontological basis, and it resonated with the Tong-lin school in the movement to rescue and correct Wang's teaching in a positive direction in the late Ming. The scholars who took the Substance of Nature as their guiding concept integrated the thought of Chu and Wang and, relying on their own thinking, opened a new way of responding to the situation and the academic atmosphere of their time, indirectly reflecting the pulse of late Ming scholarship. Li Chien-lo stands within this current, and his chih-hsiu theory played a leading role in the academic trends of the late Ming.
|
150 |
Ένας έλεγχος καλής προσαρμογής για συνεχείς δισδιάστατες κατανομές [A goodness-of-fit test for continuous bivariate distributions]. Αλεξόπουλος, Ανδρέας. 06 November 2007 (has links)
This thesis draws its subject matter from the theory of goodness-of-fit tests. The main elements of the theory of test functions are presented, followed by the extension of the Kolmogorov-Smirnov test to the bivariate case and a modification of it. A key tool for this extension is Rosenblatt's theorem, which provides a transformation of an absolutely continuous k-variate distribution into the uniform distribution on the k-dimensional hypercube. The statistic A, proposed by Damico, is also presented; a particular feature of this statistic is that it has a discrete distribution.
We propose a goodness-of-fit statistic for continuous data, first in two dimensions and then in k dimensions, using the statistic A and Rosenblatt's theorem as tools. For various sample sizes, the table of probabilities for the values of the statistic and the table of critical values for several p-values are given; these tables were obtained mainly by simulation. Finally, the power of the test was computed and compared with that of the bivariate Kolmogorov-Smirnov test.
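A minimal sketch of the Rosenblatt transformation that underlies the proposed test, for a bivariate normal with known parameters; the uniformity checks below use standard univariate tests rather than the statistic A of the thesis:

```r
## Rosenblatt transform of a bivariate normal with known correlation rho
## (illustration only; the thesis combines this step with the statistic A).
set.seed(3)
rho <- 0.6
n   <- 500
x1  <- rnorm(n)
x2  <- rho * x1 + sqrt(1 - rho^2) * rnorm(n)      # correlated pair

## u1 = F1(x1), u2 = F(x2 | x1): marginal then conditional CDF
u1 <- pnorm(x1)
u2 <- pnorm(x2, mean = rho * x1, sd = sqrt(1 - rho^2))

## Under the hypothesised model, (u1, u2) are independent Uniform(0, 1) on the
## unit square, so univariate uniformity tests can be applied to each coordinate.
ks.test(u1, "punif")
ks.test(u2, "punif")
```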
|