1 |
Volatility Forecasting of Crude Oil Future -- Under Normal Mixture Model and NIG Mixture Model
Wu, Chia-ying 30 May 2012 (has links)
This study attempts to capture the behavior of volatility in the commodity futures market by introducing the normal mixture GARCH model and the NIG mixture GARCH model (normal-inverse Gaussian mixture GARCH model). The normal mixture GARCH model (hereafter the NM-GARCH model) mixes two or more normal distributions under a specified set of weights, with the variance of each component following a GARCH process. The NM-GARCH model captures the leptokurtosis and fat tails of financial data better than the normal GARCH model and the Student's t GARCH model. Also, the component with the lower weight in the NM-GARCH model usually has the higher variance, while the component with the higher weight has the lower volatility; this reflects what happens in real markets, where large fluctuations (shocks) occur with low probability and small fluctuations occur with high probability. In general, the volatility that persists under ordinary conditions is relatively flat, whereas shocks have a large impact but occur less frequently.
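For concreteness, a two-component NM-GARCH(1,1) specification of the kind described above can be written as follows; the notation and lag orders are illustrative assumptions, not taken from the thesis:

\[ r_t = \mu + \varepsilon_t, \qquad \varepsilon_t \mid \mathcal{F}_{t-1} \sim w\,N(0,\sigma_{1,t}^2) + (1-w)\,N(0,\sigma_{2,t}^2), \]
\[ \sigma_{i,t}^2 = \omega_i + \alpha_i \varepsilon_{t-1}^2 + \beta_i \sigma_{i,t-1}^2, \qquad i = 1,2, \quad 0 < w < 1. \]

Under this reading, the component with the smaller weight carries the larger conditional variance and accounts for infrequent shocks, while the high-weight component describes the flatter, everyday volatility.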
The NIG mixture distribution mixes two or more weighted components, each of which follows an NIG distribution. Compared with the normal mixture distribution, the NIG mixture distribution retains the advantages of the NIG distribution: it can account not only for leptokurtosis and the skewness of the data, but also describe the fat-tail phenomenon more completely, because both tails of the NIG distribution decay slowly.
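For reference, the density of a single NIG component can be written as below; the parameterisation is a standard one and is assumed here rather than quoted from the thesis:

\[ f(x;\alpha,\beta,\mu,\delta) = \frac{\alpha\delta}{\pi}\, e^{\delta\sqrt{\alpha^2-\beta^2} + \beta(x-\mu)}\, \frac{K_1\!\left(\alpha\sqrt{\delta^2+(x-\mu)^2}\right)}{\sqrt{\delta^2+(x-\mu)^2}}, \qquad 0 \le |\beta| < \alpha,\ \delta > 0, \]

where \(K_1\) is the modified Bessel function of the third kind of order one. Its tails behave like \(|x|^{-3/2}e^{-\alpha|x|+\beta x}\), which decays more slowly than a Gaussian tail; an NIG mixture replaces this density by a weighted sum of such components.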
This study applies the NM-GARCH model and the NIG mixture GARCH model to forecasting the volatility of returns in the crude oil futures market and, through parameter estimation, forecasting, loss functions, and tests of statistical significance, concludes that the predictive ability of these two models is significantly better than that of other volatility models.
|
2 |
Modification of mesoporous silicas
Sheikh, Shehla Altaf January 2000 (has links)
No description available.
|
3 |
Statistical analysis of seasonality in sudden infant death syndrome
Mooney, Jennifer Anne January 2002 (has links)
SIDS deaths exhibit a seasonal pattern with a winter peak, and the cause of this seasonality is unknown. The seasonal pattern is not symmetrical and it has been thought that the relatively flat winter peak may be due to the existence of more than one underlying population, where each population corresponds to a different cause of seasonality. In this thesis, mixtures of von Mises distributions have been fitted using maximum likelihood estimation to determine whether there is heterogeneity in the UK SIDS data. Various computational problems arise with the fitting procedures and attempts to tackle these for the SIDS data are discussed. A bootstrap likelihood ratio method is used to assess the number of components in the mixture, and its properties are investigated by simulation. Changes in the seasonal pattern since the 'back to sleep' campaign are also examined as any differences might give clues as to what caused the fall in 1992, and what the reasons for the remaining deaths might be. The von Mises distributions are compared with cosinor analysis and skewed regression models to determine the most appropriate method for modelling the seasonality in the data. Mixtures of Weibull and Gamma distributions are used to model the age distribution in SIDS. The motivation for this was to determine whether there are two or more groups of babies whose age-at-death distributions are different and to examine any changes since the 'back to sleep' campaign. Generalised linear models have previously been used to determine whether month of birth is an independent risk factor in addition to month of death and age at death. In this thesis, mixtures of these generalised linear models have been fitted using the EM algorithm to determine whether there are different groups of babies with different risks. Childhood type 1 diabetes mellitus is another condition which exhibits a seasonal pattern in diagnosis. The thesis concludes by considering analysis of these data using the mixture modelling approach.
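To make the modelling concrete (notation assumed here, not taken from the thesis), a K-component von Mises mixture for dates of death mapped to angles \(\theta \in [0, 2\pi)\) has density

\[ f(\theta) = \sum_{j=1}^{K} \pi_j \, \frac{\exp\{\kappa_j \cos(\theta - \mu_j)\}}{2\pi I_0(\kappa_j)}, \qquad \pi_j > 0,\ \sum_{j} \pi_j = 1, \]

where \(I_0\) is the modified Bessel function of order zero; a bootstrap likelihood ratio test of this kind compares the observed statistic \(-2(\hat\ell_K - \hat\ell_{K+1})\) with its distribution over data sets simulated from the fitted K-component model.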
|
4 |
Statistical inference on a mixture model
屠烈偉, Tao, Lit-wai. January 1993 (has links)
published_or_final_version / Applied Statistics / Master / Master of Social Sciences
|
5 |
Application of Mixture Theory to solid tumors and normal pressure hydrocephalus
Burazin, Andrijana 09 December 2013 (has links)
In this thesis, the theory of poroelasticity, namely the Mixture Theory version -- a homogenized, macroscopic scale approach used to describe fluid flow through a porous medium -- is employed in three separate cases pertaining to a biological phenomenon. The first investigation explores the behavior of interstitial fluid pressure (IFP) in solid tumors. Thus, in Chapter 2, a Mixture Theory-based approach is developed to describe the evolution of the IFP from that in a healthy interstitium to the elevated levels in cancerous tumors. Attention is focused on angiogenesis, a tightly regulated process in healthy tissue that provides all necessary nutrients through the creation of new blood vessels. Once this process becomes unruly within a tumor, angiogenesis gives rise to an abnormal vasculature by forming convoluted and leaky blood vessels. Thus, the primary focus of the model is on the capillary filtration coefficient and vascular density as they increase in time, which in turn elevates the tumor IFP. Later, the Mixture Theory model is extended to simulate the effects of vascular normalization, where the cancer therapy not only prunes blood vessels, but reverts the chaotic vasculature to a somewhat normal state, thereby temporarily lowering the tumor IFP. In Chapter 3, the validity of an assumption that was made in order to facilitate the mathematical calculations is investigated. In addition to all of the Mixture Theory assumptions, it is assumed that the pore pressure p is proportional to the tissue dilatation e. This assumption is examined to determine how appropriate and accurate it is, by using a heat-type equation without the presence of sources and sinks under the assumption of a spherical geometry. The results obtained under the proportionality of p and e are compared with the results obtained without this assumption. A substantial difference is found, which suggests that great care must be exercised in assuming the proportionality of p and e. The last application is reported in Chapter 4 and it investigates the pathogenesis of normal pressure hydrocephalus. In a normal brain, cerebrospinal fluid (CSF) is created by the choroid plexus, circulates around the brain and the spinal cord without any impediment, and then is absorbed at various sites. However, normal pressure hydrocephalus occurs when there is an imbalance between the production and absorption of CSF in the brain that causes the impaired clearance of CSF and the enlargement of ventricles; yet the ventricular pressure in this case is frequently measured to be normal. Thus, a mathematical model using Mixture Theory is formulated to analyze a possible explanation of this brain condition. Levine (1999) proposed the hypothesis that CSF seeps from the ventricular space into the brain parenchyma and is efficiently absorbed in the bloodstream. To test this hypothesis, Levine used the consolidation theory version of poroelasticity theory, with the addition of Starling's law to account for the absorption of CSF in the brain parenchyma at steady state. However, the Mixture Theory model does not agree with the results obtained by Levine (1999), which leads one to conclude that the pathogenesis of normal pressure hydrocephalus remains unknown. To conclude the thesis, all three applications of Mixture Theory are discussed and the importance and contribution of this work are highlighted. In addition, possible future directions are indicated based on the findings of this thesis.
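As a sketch of the assumption examined in Chapter 3 (symbols chosen here for illustration, not quoted from the thesis): if the pore pressure is taken to be proportional to the dilatation, \(p = a\,e\), the fluid mass balance with no sources or sinks reduces, under spherical symmetry, to a heat-type equation for the dilatation,

\[ \frac{\partial e}{\partial t} = c\, \frac{1}{r^2} \frac{\partial}{\partial r}\!\left( r^2 \frac{\partial e}{\partial r} \right), \]

with c playing the role of a consolidation coefficient; the comparison described above is then between solutions of this kind and solutions obtained without imposing the proportionality of p and e.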
|
6 |
Predicting dementia status from Mini-Mental State Exam scores using group-based trajectory modelling
Brown, Cassandra Lynn 24 August 2012 (has links)
Background: Longitudinal studies enable the study of within-person change over time in addition to between-person differences. In longitudinal studies of older adult populations, identifying participants with dementia is desirable, and often necessary, even when it is not the question of interest. Yet in practice, the time to collect information from each participant may be limited. Therefore, some studies include only a brief general cognitive measure, of which the Mini-Mental State Examination (MMSE) is the most commonly used (Raina et al., 2009). The current study explores whether group-based trajectory modeling of MMSE scores with a selection of covariates can identify individuals who have or will develop dementia in an 8-year longitudinal study. Methods: The sample included 651 individuals from the Origins of Variance in the Oldest Old study of Swedish twins 80 years old or older (OCTO-Twin). Participants had completed the MMSE every two years, and cases of dementia were diagnosed according to DSM-III criteria. The accuracy of using the classes formed in growth mixture modeling and latent class growth modeling as indicative of dementia status was compared to that of more standard methods, the typical 24/30 cut score and a logistic regression. Results: A three-class quadratic model with covariate effects on class membership was found to best characterize the data. The classes were characterized and labeled as High Performing Late Decline, Rapidly Declining, and Decreasing Low Performance. When the diagnostic accuracy of the latent trajectory groups was compared against the simpler methods, the sensitivity of the final model was lower, but it was the same or superior in specificity, positive predictive value, and negative predictive value, and it allowed a more fine-grained analysis of participant risk. Conclusions: Group-based trajectory models may be helpful for grouping longitudinal study participants, particularly if sensitivity is not the primary concern. / Graduate
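As an illustration of the accuracy measures used in the comparison, the sketch below computes sensitivity, specificity, PPV and NPV for a binary dementia prediction against diagnosis; the data and the 24/30 cut-score rule shown are purely hypothetical, not the OCTO-Twin sample.

```python
# Minimal sketch: diagnostic accuracy of a dementia classification rule.
# The arrays below are illustrative, not OCTO-Twin data.

def diagnostic_accuracy(predicted, diagnosed):
    """Return sensitivity, specificity, PPV and NPV for binary labels."""
    tp = sum(p and d for p, d in zip(predicted, diagnosed))
    tn = sum((not p) and (not d) for p, d in zip(predicted, diagnosed))
    fp = sum(p and (not d) for p, d in zip(predicted, diagnosed))
    fn = sum((not p) and d for p, d in zip(predicted, diagnosed))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Example rule: the conventional 24/30 MMSE cut score.
mmse_scores = [28, 21, 30, 25, 26, 17]               # hypothetical scores
diagnosed = [False, True, False, True, False, True]  # hypothetical diagnoses
cut_score_prediction = [score < 24 for score in mmse_scores]
print(diagnostic_accuracy(cut_score_prediction, diagnosed))
```

The same function could be applied to class membership from a growth mixture or latent class growth model, which is essentially how the trajectory-based and cut-score rules are set against each other in the study.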
|
7 |
Statistical inference on a mixture model
Tao, Lit-wai. January 1993 (has links)
Thesis (M. Soc. Sc.)--University of Hong Kong, 1993. / Includes bibliographical references.
|
8 |
Über die physiologische wirkung der kupfervitriolkalkbrühe ...
Schander, Richard, January 1904 (has links)
Inaugural dissertation--Jena. / Curriculum vitae (Lebenslauf).
|
9 |
Statistical inference on a mixture model
Tao, Lit-wai. January 1993 (has links)
Thesis (M.Soc.Sc.)--University of Hong Kong, 1993. / Includes bibliographical references. Also available in print.
|
10 |
Automatic architecture selection for probability density function estimation in computer vision
Sadeghi, Mohammad T. January 2002 (has links)
In this thesis, the problem of probability density function estimation using finite mixture models is considered. Gaussian mixture modelling is used to provide a semi-parametric density estimate for a given data set. The fundamental problem with this approach is that the number of mixture components required to adequately describe the data is not known in advance. In this work, a predictive validation technique [91] is studied and developed as a useful, operational tool that automatically selects the number of components for Gaussian mixture models. The predictive validation test approves a candidate model if, for the set of events it tries to predict, the predicted frequencies derived from the model match the empirical ones derived from the data set. A model selection algorithm, based on the validation test, is developed which avoids both over-fitting and under-fitting. We investigate the influence of the various parameters in the model selection method in order to develop it into a robust operational tool. The capability of the proposed method in real-world applications is examined on the problem of face image segmentation for automatic initialisation of lip tracking systems. A segmentation approach is proposed which is based on Gaussian mixture modelling of the pixels' RGB values using the predictive validation technique. The lip region segmentation is based on the estimated model. First, a grouping of the model components is performed using a novel approach. The resulting groups are then the basis of a Bayesian decision-making system which labels the pixels in the mouth area as lip or non-lip. The experimental results demonstrate the superiority of the method over conventional clustering approaches. In order to improve the method computationally, an image sampling technique is applied which is based on Sobol sequences. Also, the image modelling process is strengthened by incorporating spatial contextual information using two different methods, a Neighbourhood Expectation Maximisation technique and a spatial clustering method based on a Gibbs/Markov random field modelling approach. Both methods are developed within the proposed modelling framework. The results obtained on the lip segmentation application suggest that spatial context is beneficial.
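As a rough sketch of the component-selection problem described above, the code below fits Gaussian mixtures of increasing order to synthetic RGB-like pixel data and picks the order with the best held-out log-likelihood; this is a generic stand-in for the predictive validation test of [91], and the data, colour values and candidate range are all assumptions.

```python
# Minimal sketch: selecting the number of Gaussian mixture components for
# pixel RGB values. Held-out log-likelihood is used here as a generic
# stand-in for the thesis's predictive validation test [91].
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for face-image pixels: two colour clusters.
pixels = np.vstack([
    rng.normal([200, 150, 130], 12, size=(2000, 3)),  # skin-like colours
    rng.normal([160, 70, 80], 10, size=(500, 3)),     # lip-like colours
])
train, val = train_test_split(pixels, test_size=0.3, random_state=0)

best_k, best_score = None, -np.inf
for k in range(1, 6):
    gmm = GaussianMixture(n_components=k, covariance_type="full",
                          random_state=0).fit(train)
    score = gmm.score(val)  # mean held-out log-likelihood per pixel
    if score > best_score:
        best_k, best_score = k, score

print(f"selected {best_k} components "
      f"(held-out log-likelihood {best_score:.2f})")
```

In the segmentation application, the selected model's components would then be grouped and fed into the Bayesian lip/non-lip decision rule described in the abstract.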
|