1 |
The Herfindahl-Hirschman Index as an official statistic of business concentration : challenges and solutions / Djolov, George Georgiev. 12 1900 (has links)
Thesis (PhD)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: This dissertation examines the measurement of business concentration by the Herfindahl-
Hirschman Index (HHI). In the course of the examination, a modification to this method of
measurement of business concentration is proposed, in terms of which the accuracy of the
conventional depiction of the HHI can be enhanced by a formulation involving the Gini index.
Computational advantages in the use of this new method are identified, which reveal the Gini-based
HHI to be an effective substitute for its regular counterpart. It is found that theoretically and
in practice, the proposed new method has strengths that favour its usage. The practical
advantages of employing this method are considered with a view to encouraging the measurement
of business concentration using the Gini-based HHI. / AFRIKAANSE OPSOMMING (English translation): This dissertation examines the measurement of business concentration by means of the Herfindahl-Hirschman Index (HHI). A modification to this method is proposed, by means of which the accuracy of the conventional depiction of the HHI is enhanced through a formulation involving the Gini index. The computational advantages of this new method are identified, and it is shown that the Gini-based HHI is an effective substitute for its better-known counterpart. The proposed new method is found to have theoretical and practical strengths that support its use. The practical advantages of the proposed method are considered with a view to encouraging the use of the Gini-based HHI as a measure of business concentration.
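As a point of reference for the two ingredients of the abstract: the conventional HHI is the sum of squared market shares, and the Gini index measures inequality of the same shares. The sketch below computes both directly; it does not reproduce the dissertation's Gini-based reformulation of the HHI, which is its actual contribution.

```python
import numpy as np

def hhi(shares):
    """Conventional Herfindahl-Hirschman Index: sum of squared market shares."""
    s = np.asarray(shares, dtype=float)
    s = s / s.sum()                      # normalise to proportions
    return float(np.sum(s ** 2))

def gini(shares):
    """Gini index of the share distribution (mean-difference form)."""
    s = np.sort(np.asarray(shares, dtype=float))
    n = s.size
    i = np.arange(1, n + 1)
    # G = sum_i (2i - n - 1) * s_(i) / (n * sum_i s_i) for sorted shares
    return float(np.sum((2 * i - n - 1) * s) / (n * s.sum()))

shares = [0.4, 0.3, 0.2, 0.1]
print(hhi(shares))                       # ≈ 0.30
print(gini(shares))                      # ≈ 0.25
```

An n-firm market with equal shares gives HHI = 1/n and Gini = 0, the benchmarks against which concentration is usually judged.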
|
2 |
Tonsils : a risk factor for moderate and severe chronic periodontitis? / Wynn, William Bernard. January 2002 (has links) (PDF)
Thesis--University of Oklahoma. / Includes bibliographical references (leaves 39-41).
|
3 |
Mixture distributions with application to microarray data analysis / Lynch, O'Neil. 01 June 2009 (has links)
The main goal in analyzing microarray data is to determine the genes that are differentially expressed across two types of tissue samples or samples obtained under two experimental conditions. In this dissertation we proposed two methods to determine differentially expressed genes. For the penalized normal mixture model (PNMM) to determine genes that are differentially expressed, we penalized both the variance and the mixing proportion parameters simultaneously. The variance parameter was penalized so that the log-likelihood is bounded, while the mixing proportion parameter was penalized so that its estimates are not on the boundary of its parametric space. The null distribution of the likelihood ratio test statistic (LRTS) was simulated so that we could perform a hypothesis test for the number of components of the penalized normal mixture model. In addition to simulating the null distribution of the LRTS for the penalized normal mixture model, we showed that the maximum likelihood estimates are asymptotically normal, a necessary first step toward proving the asymptotic null distribution of the LRTS. This result is a significant contribution to the field of normal mixture models.
The modified p-value approach for detecting differentially expressed genes was also discussed in this dissertation. The modified p-value approach was implemented so that a hypothesis test for the number of components can be conducted by using the modified likelihood ratio test. In the modified p-value approach we penalized the mixing proportion so that the estimates of the mixing proportion are not on the boundary of its parametric space. The null distribution of the LRTS was simulated so that the number of components of the uniform-beta mixture model can be determined. Finally, both modified methods, the penalized normal mixture model and the modified p-value approach, were applied to simulated and real data.
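The two penalties can be made concrete with a small EM sketch for a two-component normal mixture. Here the variance penalty is a ridge-type term that keeps component variances away from zero (so the penalized log-likelihood stays bounded), and the mixing-proportion penalty is a pseudo-count that keeps the estimate off the boundary of (0, 1). The dissertation's exact penalty functions may differ, so treat this as an illustration of the idea only.

```python
import numpy as np

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def penalized_em(x, a=1.0, c=1.0, iters=300):
    """EM for a two-component normal mixture with
    (i) a ridge-type penalty anchoring each variance (bounds the likelihood), and
    (ii) pseudo-counts keeping the mixing proportion inside (0, 1).
    Illustrative sketch; the dissertation's exact penalties may differ."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s0 = x.var()                          # anchor for the variance penalty
    mu = np.array([x.min(), x.max()])     # crude initialisation
    var = np.array([s0, s0])
    pi = 0.5
    for _ in range(iters):
        # E-step: posterior probability that each point came from component 1
        f1 = pi * normal_pdf(x, mu[0], var[0])
        f2 = (1 - pi) * normal_pdf(x, mu[1], var[1])
        r = f1 / (f1 + f2)
        n1, n2 = r.sum(), (1 - r).sum()
        # M-step with penalties
        pi = (n1 + c) / (n + 2 * c)       # pseudo-counts keep pi off {0, 1}
        mu = np.array([(r * x).sum() / n1, ((1 - r) * x).sum() / n2])
        var = np.array([
            ((r * (x - mu[0]) ** 2).sum() + 2 * a * s0) / (n1 + 2 * a),
            (((1 - r) * (x - mu[1]) ** 2).sum() + 2 * a * s0) / (n2 + 2 * a),
        ])                                # ridge term bounds variances away from 0
    return pi, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(2, 1, 300)])
pi, mu, var = penalized_em(x)
print(pi, mu, var)
```

Without the variance penalty, placing one component's mean at a data point and letting its variance shrink to zero sends the unpenalized log-likelihood to infinity, which is exactly the degeneracy the penalty rules out.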
|
4 |
SOME CONTRIBUTIONS TO THE CENSORED EMPIRICAL LIKELIHOOD WITH HAZARD-TYPE CONSTRAINTS / Hu, Yanling. 01 January 2011 (has links)
Empirical likelihood (EL) is a recently developed nonparametric method of statistical inference. Owen’s 2001 book contains many important results for EL with uncensored data. However, fewer results are available for EL with right-censored data. In this dissertation, we first investigate a right-censored-data extension of Qin and Lawless (1994). They studied EL with uncensored data when the number of estimating equations is larger than the number of parameters (over-determined case). We obtain results similar to theirs for the maximum EL estimator and the EL ratio test, for the over-determined case, with right-censored data. We employ hazard-type constraints which are better able to handle right-censored data. Then we investigate EL with right-censored data and a k-sample mixed hazard-type constraint. We show that the EL ratio test statistic has a limiting chi-square distribution when k = 2. We also study the relationship between the constrained Kaplan-Meier estimator and the corresponding Nelson-Aalen estimator. We try to prove that they are asymptotically equivalent under certain conditions. Finally we present simulation studies and examples showing how to apply our theory and methodology with real data.
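For uncensored data, the chi-square calibration that such EL ratio results deliver can be sketched with Owen's classic mean problem: maximise the product of the weights p_i subject to the weights summing to one and the weighted mean equalling mu. The solution reduces to a one-dimensional root-find for a Lagrange multiplier. The dissertation replaces such moment constraints with hazard-type constraints for censored data; the sketch below is only the uncensored prototype.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for H0: E[X] = mu (uncensored data).
    Optimal weights are p_i = 1 / (n * (1 + lam * (x_i - mu))), with lam the
    root of the score equation below. Asymptotically chi-square(1) under H0."""
    y = np.asarray(x, dtype=float) - mu
    if y.min() >= 0 or y.max() <= 0:
        return np.inf                    # mu outside the convex hull of the data
    n = y.size
    eps = 1e-10
    # bracket where every weight satisfies 1 + lam * y_i > 1/n
    lo = -(1 - 1 / n) / y.max() + eps
    hi = (1 - 1 / n) / (-y.min()) - eps
    g = lambda lam: np.sum(y / (1 + lam * y))   # monotone decreasing in lam
    lam = brentq(g, lo, hi)
    return 2 * np.sum(np.log1p(lam * y))

x = np.random.default_rng(0).normal(0.0, 1.0, 80)
stat = el_log_ratio(x, 0.0)
print(stat, stat > chi2.ppf(0.95, df=1))    # 5%-level test of the true mean
```

The statistic is zero at the sample mean and grows as the hypothesised mean moves away from it, which is what gives the ratio test its chi-square calibration.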
|
5 |
STATISTICAL MODELS FOR CONSTANT FALSE-ALARM RATE THRESHOLD ESTIMATION IN SOUND SOURCE DETECTION SYSTEMS / Saghaian Nejad Esfahani, Sayed Mahdi. 01 January 2010 (has links)
Constant False Alarm Rate (CFAR) Processors are important for applications where thousands of detection tests are made per second, such as in radar. This thesis introduces a new method for CFAR threshold estimation that is particularly applicable to sound source detection with distributed microphone systems. The novel CFAR Processor exploits the near symmetry about 0 for the acoustic pixel values created by steered-response coherent power in conjunction with a partial whitening preprocessor to estimate thresholds for positive values, which represent potential targets.
To remove the low-frequency components responsible for degrading CFAR performance, fixed and adaptive high-pass filters are applied. A relation between the minimum high-pass cut-off frequency and the microphone geometry is proposed and tested.
Experimental results for linear, perimeter and planar arrays illustrate that for desired false-alarm (FA) probabilities ranging from 10<sup>-1</sup> to 10<sup>-6</sup>, good CFAR performance can be achieved by modeling the coherent power with chi-square and Weibull distributions, and the ratio of desired to experimental FA probabilities can be kept within an order of magnitude.
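The thresholding step itself is simple once a noise model has been fitted: for a desired false-alarm probability, invert the survival function of the fitted chi-square or Weibull distribution. A sketch with made-up parameters follows; the thesis's contribution is the modelling and estimation around this step, not the inversion itself.

```python
from scipy.stats import chi2, weibull_min

def cfar_threshold(pfa, noise_model):
    """CFAR threshold: the point whose survival probability under the fitted
    noise model equals the desired false-alarm probability."""
    return noise_model.isf(pfa)

# Hypothetical fitted noise models -- the parameters here are made up.
chi_noise = chi2(df=4)
wbl_noise = weibull_min(c=1.8, scale=2.0)
for pfa in (1e-1, 1e-3, 1e-6):
    print(pfa, cfar_threshold(pfa, chi_noise), cfar_threshold(pfa, wbl_noise))
```

Because the threshold is set from the model's survival function, the realised false-alarm rate tracks the desired one exactly as far as the fitted model matches the true noise distribution, which is what the reported order-of-magnitude agreement measures.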
|
6 |
Hypothesis testing based on pool screening with unequal pool sizes / Gao, Hongjiang. January 2010 (has links) (PDF)
Thesis (Ph.D.)--University of Alabama at Birmingham, 2010. / Title from PDF title page (viewed on June 28, 2010). Includes bibliographical references.
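The sampling model behind pool screening is compact enough to sketch: a pool tests positive when at least one member is positive, so a pool of size m has positive probability 1 - (1 - p)^m when the individual prevalence is p. Below is a minimal maximum-likelihood sketch for unequal pool sizes (the point estimator only; the dissertation's subject is hypothesis tests built on this model).

```python
import numpy as np
from scipy.optimize import minimize_scalar

def pool_mle(sizes, positives):
    """MLE of individual prevalence p from pooled tests with unequal pool sizes.
    Pool i (size m_i) tests positive with probability 1 - (1 - p)^{m_i}.
    Minimal sketch of the sampling model, not the dissertation's tests."""
    m = np.asarray(sizes, dtype=float)
    z = np.asarray(positives, dtype=float)   # 1 = pool positive, 0 = negative

    def nll(p):
        q = (1 - p) ** m                     # P(pool of size m_i is negative)
        return -np.sum(z * np.log(1 - q) + (1 - z) * np.log(q))

    res = minimize_scalar(nll, bounds=(1e-9, 1 - 1e-9), method="bounded")
    return res.x

rng = np.random.default_rng(0)
sizes = np.repeat([5, 10], 200)                  # 200 pools of 5, 200 of 10
z = rng.random(400) < 1 - (1 - 0.05) ** sizes    # simulate with p = 0.05
print(pool_mle(sizes, z))                        # ≈ 0.05
```

With pools of size 1 the model collapses to ordinary Bernoulli sampling, which is a useful sanity check on any implementation.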
|
7 |
Efficient Numerical Inversion for Financial Simulations / Derflinger, Gerhard; Hörmann, Wolfgang; Leydold, Josef; Sak, Halis. January 2009 (has links) (PDF)
Generating samples from generalized hyperbolic distributions and non-central chi-square distributions by inversion has become an important task for the simulation of recent models in finance in the framework of (quasi-) Monte Carlo. However, their distribution functions are quite expensive to evaluate and thus numerical methods like root finding algorithms are extremely slow. In this paper we demonstrate how our new method based on Newton interpolation and Gauss-Lobatto quadrature can be utilized for financial applications. Its fast marginal generation times make it competitive, even for situations where the parameters are not always constant. / Series: Research Report Series / Department of Statistics and Mathematics
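The speed argument can be illustrated with a cruder cousin of the paper's method: tabulate the expensive quantile function once, then answer every inversion query from the table. The paper's actual algorithm uses Newton interpolation with Gauss-Lobatto quadrature; the sketch below substitutes plain linear interpolation on a grid, and the distribution parameters are invented for illustration.

```python
import numpy as np
from scipy.stats import ncx2

def make_fast_inverse(dist, n_nodes=201, u_min=1e-6, u_max=1 - 1e-6):
    """Tabulate the quantile function once, then evaluate it by interpolation.
    A crude stand-in for Newton interpolation on Gauss-Lobatto nodes, just to
    show why precomputation pays off in (quasi-)Monte Carlo sampling."""
    u_grid = np.linspace(u_min, u_max, n_nodes)
    q_grid = dist.ppf(u_grid)              # expensive, but done only once
    return lambda u: np.interp(u, u_grid, q_grid)

dist = ncx2(df=4, nc=2.5)                  # hypothetical model parameters
fast_ppf = make_fast_inverse(dist)
u = np.random.default_rng(0).random(10_000)
x = fast_ppf(u)                            # cheap inversion of all samples
```

Inversion preserves the low-discrepancy structure of quasi-Monte Carlo point sets, which is why a fast approximate inverse is preferred over rejection-type samplers in this setting.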
|
8 |
Cluster-based lack of fit tests for nonlinear regression models / Munasinghe, Wijith Prasantha. January 1900 (has links)
Doctor of Philosophy / Department of Statistics / James W. Neill / Checking the adequacy of a proposed parametric nonlinear regression model is important
in order to obtain useful predictions and reliable parameter inferences. Lack of fit is said to
exist when the regression function does not adequately describe the mean of the response
vector. This dissertation considers the asymptotics, implementation and comparative performance
of the likelihood ratio tests suggested by Neill and Miller (2003). These tests use
constructed alternative models determined by decomposing the lack of fit space according to
clusterings of the observations. Clusterings are selected by a maximum power strategy and a
sequence of statistical experiments is developed in the sense of Le Cam. L2 differentiability
of the parametric array of probability measures associated with the sequence of experiments
is established in this dissertation, leading to local asymptotic normality. Utilizing contiguity,
the limit noncentral chi-square distribution under local parameter alternatives is then
derived. For implementation purposes, standard linear model projection algorithms are
used to approximate the likelihood ratio tests, after using the convexity of a class of fuzzy
clusterings to form a smooth alternative model which is necessarily used to approximate the
corresponding maximum optimal statistical experiment. It is demonstrated empirically that
good power can result by allowing cluster selection to vary according to different points along
the expectation surface of the proposed nonlinear regression model. However, in some cases,
a single maximum clustering suffices, leading to the development of a Bonferroni adjusted
multiple testing procedure. In addition, the maximin clustering based likelihood ratio tests
were observed to possess markedly better simulated power than the generalized likelihood
ratio test with semiparametric alternative model presented by Ciprian and Ruppert (2004).
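The flavour of a cluster-based lack-of-fit test can be sketched crudely: fit the nonlinear model, cluster the observations, and ask whether the residuals still carry cluster-mean structure. The dissertation's tests are likelihood ratio tests built from fuzzy clusterings with a careful asymptotic theory; the F-style statistic below is only a rough stand-in to make the idea concrete, and its calibration is approximate.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import f as f_dist

def cluster_lack_of_fit(x, y, model, p0, k=6):
    """Rough cluster-based lack-of-fit check for nonlinear regression:
    large cluster means among the residuals suggest the regression function
    misses structure. Approximate F calibration; not the dissertation's test."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, p = x.size, len(p0)
    beta, _ = curve_fit(model, x, y, p0=p0)
    r = y - model(x, *beta)                        # residuals from the fit
    # cluster observations by quantile bins of x
    bins = np.quantile(x, np.linspace(0, 1, k + 1))
    idx = np.clip(np.digitize(x, bins[1:-1]), 0, k - 1)
    groups = [r[idx == j] for j in range(k)]
    between = sum(g.size * g.mean() ** 2 for g in groups)
    within = sum(np.sum((g - g.mean()) ** 2) for g in groups)
    F = (between / k) / (within / (n - k - p))
    return F, f_dist.sf(F, k, n - k - p)

rng = np.random.default_rng(0)
xg = np.linspace(0.0, 2.0, 120)
model = lambda t, a, b: a * np.exp(b * t)
y_ok = model(xg, 2.0, 0.5) + rng.normal(0, 0.1, 120)
y_bad = y_ok + 0.8 * np.sin(6 * xg)        # oscillation the model cannot track
F_ok, p_ok = cluster_lack_of_fit(xg, y_ok, model, p0=(1.0, 0.3))
F_bad, p_bad = cluster_lack_of_fit(xg, y_bad, model, p0=(1.0, 0.3))
print(F_ok, p_ok, F_bad, p_bad)
```

The hard clustering by quantile bins here is the simplest possible choice; the dissertation's point is precisely that the choice of clustering drives power, hence its maximin strategy over clusterings.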
|
9 |
Rates of convergence of variance-gamma approximations via Stein's method / Gaunt, Robert E. January 2013 (has links)
Stein's method is a powerful technique that can be used to obtain bounds for approximation errors in a weak convergence setting. The method has been used to obtain approximation results for a number of distributions, such as the normal, Poisson and Gamma distributions. A major strength of the method is that it is often relatively straightforward to apply it to problems involving dependent random variables. In this thesis, we consider the adaptation of Stein's method to the class of Variance-Gamma distributions. We obtain a Stein equation for the Variance-Gamma distributions. Uniform bounds for the solution of the Symmetric Variance-Gamma Stein equation and its first four derivatives are given in terms of the supremum norms of derivatives of the test function. New formulas and inequalities for modified Bessel functions are obtained, which allow us to obtain these bounds. We then use local approach couplings to obtain bounds on the error in approximating two asymptotically Variance-Gamma distributed statistics by their limiting distribution. In both cases, we obtain a convergence rate of order n<sup>-1</sup> for suitably smooth test functions. The product of two normal random variables has a Variance-Gamma distribution and this leads us to consider the development of Stein's method to the product of r independent mean-zero normal random variables. An elegant Stein equation is obtained, which motivates a generalisation of the zero bias transformation. This new transformation has a number of interesting properties, which we exploit to prove some limit theorems for statistics that are asymptotically distributed as the product of two central normal distributions. The Variance-Gamma and Product Normal distributions arise as functions of the multivariate normal distribution. 
We end this thesis by demonstrating how the multivariate normal Stein equation can be used to prove limit theorems for statistics that are asymptotically distributed as a function of the multivariate normal distribution. We establish some sufficient conditions for convergence rates to be of order n<sup>-1</sup> for smooth test functions, and thus faster than the O(n<sup>-1/2</sup>) rate that would arise from the Berry-Esseen Theorem. We apply the multivariate normal Stein equation approach to prove Variance-Gamma and Product Normal limit theorems, and we also consider an application to Friedman's χ<sup>2</sup> statistic.
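One concrete fact behind this programme can be checked numerically: the product of two independent standard normals has a symmetric Variance-Gamma law with density K_0(|z|)/pi, where K_0 is the modified Bessel function of the second kind. A quick Monte Carlo comparison:

```python
import numpy as np
from scipy.special import k0

rng = np.random.default_rng(42)
n = 500_000
z = rng.standard_normal(n) * rng.standard_normal(n)

# Density of the product of two independent N(0,1) variables: K_0(|z|) / pi
grid = np.linspace(0.25, 3.0, 12)        # stay away from the log-singularity at 0
pdf = k0(grid) / np.pi

hist, edges = np.histogram(z, bins=200, range=(-4, 4), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
emp = np.interp(grid, centers, hist)
print(np.max(np.abs(emp - pdf)))         # close to 0 for a large sample
```

The product has mean 0 and variance 1 but much heavier tails than a normal, which is exactly the behaviour the Variance-Gamma family captures.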
|
10 |
統計品管中製程能力指標決策程序之研究 / Some Decision Procedures for the Capability Index in Quality Control Process / 李仁棻 (Lee, Ren Fen). Unknown Date (has links)
(English translation.) The process capability index is commonly used to assess the capability of a manufacturing process. It combines the specification limits and the process variation into a single index, making its meaning easy for users to grasp.
If one claims that a process capability exceeds a given value, then by controlling the Type I and Type II error rates simultaneously, the critical value and the sample size n can be determined. When several processes each have capability exceeding a given value and one wishes to select the process with the largest capability, we propose an objective criterion for making the selection.
The distinctive feature of this thesis is that the critical value and the sample size n are determined analytically, and that an objective criterion is provided for selecting the largest process capability.
The study finds that although approximating the commonly used statistical table values introduces some error, the error is small. Hence the methods and conclusions discussed here are suitable for on-line application.
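As a baseline for what a process capability index computes, one common variant is C_pk, which relates the specification limits to the process mean and spread. The sketch below gives only the point estimate, and assumes C_pk is representative of the index studied; the thesis's analytic critical values and sample-size determinations are its actual subject.

```python
import numpy as np

def cpk(x, lsl, usl):
    """Point estimate of C_pk = min(USL - mean, mean - LSL) / (3 * s).
    A common process capability index; decision procedures (critical values,
    sample sizes) for claims about the index go beyond this point estimate."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std(ddof=1)
    return min(usl - m, m - lsl) / (3 * s)

rng = np.random.default_rng(7)
sample = rng.normal(10.0, 0.5, 50_000)     # centred process with sigma = 0.5
print(cpk(sample, lsl=8.0, usl=12.0))      # ≈ 4/3 for these parameters
```

Because the index takes the minimum over the two specification margins, an off-centre process is penalised even when its spread is small, which is why C_pk rather than C_p is the usual object of capability claims.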
|