1

The Herfindahl-Hirschman Index as an official statistic of business concentration: challenges and solutions

Djolov, George Georgiev 12 1900 (has links)
Thesis (PhD)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: This dissertation examines the measurement of business concentration by the Herfindahl-Hirschman Index (HHI). In the course of the examination, a modification to this method of measuring business concentration is proposed, in terms of which the accuracy of the conventional depiction of the HHI can be enhanced by a formulation involving the Gini index. Computational advantages of this new method are identified, which reveal the Gini-based HHI to be an effective substitute for its regular counterpart. It is found that, theoretically and in practice, the proposed new method has strengths that favour its usage. The practical advantages of employing this method are considered with a view to encouraging the measurement of business concentration using the Gini-based form of the HHI. / AFRIKAANS ABSTRACT: This dissertation investigates the measurement of business concentration by means of the Herfindahl-Hirschman Index (HHI). A modification to this method is proposed, by means of which the accuracy of the conventional representation of the HHI is enhanced through a formulation involving the Gini index. The computational advantages of this new method are identified, and it is shown that the Gini-based HHI is an effective substitute for its better-known counterpart. It is found that the proposed new method has theoretical and practical strengths that support its use. The practical advantages of the proposed method are considered with a view to encouraging the use of the Gini-based HHI as a measure of business concentration.
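The conventional HHI is simply the sum of squared market shares, while the Gini index is a standard summary of inequality among those shares. As a rough illustration of the two quantities the abstract relates (the dissertation's specific Gini-based reformulation of the HHI is not reproduced here), a minimal Python sketch using hypothetical firm sales figures:

```python
import numpy as np

def hhi(sales):
    """Conventional Herfindahl-Hirschman Index: sum of squared market shares."""
    shares = np.asarray(sales, dtype=float) / np.sum(sales)
    return np.sum(shares ** 2)

def gini(sales):
    """Gini index from sorted values, standard discrete formula."""
    x = np.sort(np.asarray(sales, dtype=float))
    n = x.size
    # G = 2 * sum(i * x_(i)) / (n * sum(x)) - (n + 1) / n, with i = 1..n
    return (2.0 * np.sum(np.arange(1, n + 1) * x)) / (n * np.sum(x)) - (n + 1.0) / n

# Hypothetical market of five firms (illustrative figures only)
sales = [40, 25, 15, 12, 8]
print(f"HHI  = {hhi(sales):.4f}")   # ranges from 1/n (equal shares) to 1 (monopoly)
print(f"Gini = {gini(sales):.4f}")  # 0 (perfect equality) to (n-1)/n
```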
2

Comparison Between Confidence Intervals of Multiple Linear Regression Model with or without Constraints

Tao, Jinxin 27 April 2017 (has links)
Regression analysis is one of the most widely applied statistical techniques. The statistical inference of a linear regression model with a monotone constraint has been discussed in earlier work. A natural question concerns how the inference differs between the constrained and unconstrained cases. Although the comparison between confidence intervals of linear regression models with and without the restriction has been considered for a single predictor variable, a corresponding treatment for multiple regression is still needed. In this thesis, I compare the confidence intervals of a multiple linear regression model with and without constraints.
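For context, a minimal sketch of the unconstrained side of such a comparison: the usual t-based confidence interval for the mean response of a two-predictor linear model, alongside a constrained point estimate computed under an assumed non-negativity restriction on the slopes (a simple stand-in for monotonicity, not the thesis's construction; valid intervals under the constraint are precisely the non-standard part).

```python
import numpy as np
from scipy import stats
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.uniform(0, 1, (n, 2))])  # intercept + 2 predictors
beta_true = np.array([1.0, 2.0, 0.5])
y = X @ beta_true + rng.normal(0, 1, n)

# Unconstrained OLS and the usual 95% t-interval for the mean response at x0
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_ols
dof = n - X.shape[1]
sigma2 = resid @ resid / dof
XtX_inv = np.linalg.inv(X.T @ X)
x0 = np.array([1.0, 0.5, 0.5])
se = np.sqrt(sigma2 * x0 @ XtX_inv @ x0)
tcrit = stats.t.ppf(0.975, dof)
fit0 = x0 @ beta_ols
print("Unconstrained 95% CI for E[y|x0]:", (fit0 - tcrit * se, fit0 + tcrit * se))

# Constrained fit: slopes restricted to be non-negative (hypothetical restriction);
# interval construction under such constraints is the non-standard problem.
res = lsq_linear(X, y, bounds=([-np.inf, 0.0, 0.0], np.inf))
print("Constrained point estimate of E[y|x0]:", x0 @ res.x)
```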
3

Inference in Constrained Linear Regression

Chen, Xinyu 27 April 2017 (has links)
Regression analysis constitutes an important part of statistical inference and has wide application in many areas. In some applications, we strongly believe that the regression function changes monotonically with some or all of the predictor variables in a region of interest. Deriving analyses under such constraints can be an enormous task. In this work, the restricted prediction interval for the mean of the regression function is constructed when two predictors are present. I use a modified likelihood ratio test (LRT) to construct the prediction intervals.
4

Tonsils: a risk factor for moderate and severe chronic periodontitis?

Wynn, William Bernard. January 2002 (has links) (PDF)
Thesis--University of Oklahoma. / Includes bibliographical references (leaves 39-41).
5

Mixture distributions with application to microarray data analysis

Lynch, O'Neil 01 June 2009 (has links)
The main goal in analyzing microarray data is to determine the genes that are differentially expressed across two types of tissue samples or samples obtained under two experimental conditions. In this dissertation we proposed two methods to determine differentially expressed genes. For the penalized normal mixture model (PNMM) to determine genes that are differentially expressed, we penalized both the variance and the mixing proportion parameters simultaneously. The variance parameter was penalized so that the log-likelihood will be bounded, while the mixing proportion parameter was penalized so that its estimates are not on the boundary of its parametric space. The null distribution of the likelihood ratio test statistic (LRTS) was simulated so that we could perform a hypothesis test for the number of components of the penalized normal mixture model. In addition to simulating the null distribution of the LRTS for the penalized normal mixture model, we showed that the maximum likelihood estimates were asymptotically normal, which is a necessary first step toward establishing the asymptotic null distribution of the LRTS. This result is a significant contribution to the field of normal mixture models. The modified p-value approach for detecting differentially expressed genes was also discussed in this dissertation. The modified p-value approach was implemented so that a hypothesis test for the number of components can be conducted by using the modified likelihood ratio test. In the modified p-value approach we penalized the mixing proportion so that the estimates of the mixing proportion are not on the boundary of its parametric space. The null distribution of the LRTS was simulated so that the number of components of the uniform-beta mixture model could be determined. Finally, both methods, the penalized normal mixture model and the modified p-value approach, were applied to simulated and real data.
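A generic illustration of the simulation step described above: approximating the null distribution of the likelihood ratio test statistic for one versus two normal components by parametric bootstrap (the standard chi-square asymptotics do not apply at the boundary of the mixing-proportion space, which is why simulation or penalization is needed). This sketch uses an ordinary, unpenalized mixture fit from scikit-learn and hypothetical data, not the penalized model of the dissertation.

```python
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

def lrts_one_vs_two(x):
    """2 * (log-lik of a 2-component normal mixture - log-lik of a single normal)."""
    X = x.reshape(-1, 1)
    ll1 = stats.norm.logpdf(x, loc=x.mean(), scale=x.std()).sum()
    gm = GaussianMixture(n_components=2, n_init=5, random_state=0).fit(X)
    ll2 = gm.score(X) * len(x)          # score() returns the mean log-likelihood
    return 2.0 * (ll2 - ll1)

rng = np.random.default_rng(1)
n, B = 200, 200
observed = lrts_one_vs_two(rng.normal(size=n))   # hypothetical observed sample

# Parametric bootstrap under the null (one component): simulate, refit, record the LRTS
null_lrts = np.array([lrts_one_vs_two(rng.normal(size=n)) for _ in range(B)])
p_value = np.mean(null_lrts >= observed)
print(f"simulated p-value for H0: one component = {p_value:.3f}")
```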
6

SOME CONTRIBUTIONS TO THE CENSORED EMPIRICAL LIKELIHOOD WITH HAZARD-TYPE CONSTRAINTS

Hu, Yanling 01 January 2011 (has links)
Empirical likelihood (EL) is a recently developed nonparametric method of statistical inference. Owen’s 2001 book contains many important results for EL with uncensored data. However, fewer results are available for EL with right-censored data. In this dissertation, we first investigate a right-censored-data extension of Qin and Lawless (1994). They studied EL with uncensored data when the number of estimating equations is larger than the number of parameters (over-determined case). We obtain results similar to theirs for the maximum EL estimator and the EL ratio test, for the over-determined case, with right-censored data. We employ hazard-type constraints which are better able to handle right-censored data. Then we investigate EL with right-censored data and a k-sample mixed hazard-type constraint. We show that the EL ratio test statistic has a limiting chi-square distribution when k = 2. We also study the relationship between the constrained Kaplan-Meier estimator and the corresponding Nelson-Aalen estimator. We try to prove that they are asymptotically equivalent under certain conditions. Finally we present simulation studies and examples showing how to apply our theory and methodology with real data.
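For reference, the unconstrained Kaplan-Meier and Nelson-Aalen estimators against which the constrained versions are compared can be computed directly from their standard product and sum forms; the closeness of exp(-Nelson-Aalen) to the Kaplan-Meier curve is the kind of relationship the asymptotic-equivalence question concerns. A minimal sketch on hypothetical right-censored data (the constrained estimators themselves are not reproduced):

```python
import numpy as np

def km_na(times, events):
    """Kaplan-Meier survival and Nelson-Aalen cumulative hazard at the distinct
    event times, for right-censored data (events == 1 means the event was observed)."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    t_uniq = np.unique(times[events == 1])
    surv, cumhaz, km, na = [], [], 1.0, 0.0
    for t in t_uniq:
        at_risk = np.sum(times >= t)                    # n_i: subjects still at risk
        deaths = np.sum((times == t) & (events == 1))   # d_i: events at time t
        km *= 1.0 - deaths / at_risk                    # KM: product of (1 - d_i/n_i)
        na += deaths / at_risk                          # NA: sum of d_i/n_i
        surv.append(km)
        cumhaz.append(na)
    return t_uniq, np.array(surv), np.array(cumhaz)

# Hypothetical right-censored sample (1 = observed event, 0 = censored)
t = [2, 3, 3, 5, 6, 7, 8, 8, 10, 12]
d = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
times, S, H = km_na(t, d)
# exp(-H) tracks S, reflecting the closeness of the two estimators
print(np.round(S, 3), np.round(np.exp(-H), 3))
```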
7

STATISTICAL MODELS FOR CONSTANT FALSE-ALARM RATE THRESHOLD ESTIMATION IN SOUND SOURCE DETECTION SYSTEMS

Saghaian Nejad Esfahani, Sayed Mahdi 01 January 2010 (has links)
Constant False Alarm Rate (CFAR) processors are important for applications where thousands of detection tests are made per second, such as in radar. This thesis introduces a new method for CFAR threshold estimation that is particularly applicable to sound source detection with distributed microphone systems. The novel CFAR processor exploits the near symmetry about 0 of the acoustic pixel values created by steered-response coherent power, in conjunction with a partial whitening preprocessor, to estimate thresholds for positive values, which represent potential targets. To remove the low-frequency components responsible for degrading CFAR performance, fixed and adaptive high-pass filters are applied. A relation between the minimum high-pass cut-off frequency and the microphone geometry is proposed and tested. Experimental results for linear, perimeter, and planar arrays illustrate that, for desired false-alarm (FA) probabilities ranging from 10⁻¹ to 10⁻⁶, good CFAR performance can be achieved by modeling the coherent power with chi-square and Weibull distributions, and that the ratio of desired to experimental FA probabilities can be kept within an order of magnitude.
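The thresholding idea in the last sentence can be sketched generically: fit a chi-square or Weibull model to noise-only coherent-power values and take the (1 - P_FA) quantile as the detection threshold. The sketch below uses synthetic noise samples and plain SciPy fits; the partial whitening and symmetry-based noise extraction of the actual processor are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical noise-only coherent-power samples (a stand-in for the negative-side
# values exploited through the near symmetry about 0 described in the abstract)
noise_power = rng.chisquare(df=4, size=100_000)

p_fa = 1e-4  # desired false-alarm probability

# Fit candidate models to the noise-only data and take the (1 - P_FA) quantile
# as the detection threshold, so that P(power > threshold | noise) = P_FA.
df_hat, loc_c, scale_c = stats.chi2.fit(noise_power, floc=0)
thr_chi2 = stats.chi2.ppf(1 - p_fa, df_hat, loc=loc_c, scale=scale_c)

c_hat, loc_w, scale_w = stats.weibull_min.fit(noise_power, floc=0)
thr_weib = stats.weibull_min.ppf(1 - p_fa, c_hat, loc=loc_w, scale=scale_w)

# Empirical check: fraction of noise samples exceeding each threshold
print(f"chi-square threshold {thr_chi2:.2f}, empirical FA {np.mean(noise_power > thr_chi2):.1e}")
print(f"Weibull threshold    {thr_weib:.2f}, empirical FA {np.mean(noise_power > thr_weib):.1e}")
```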
8

Hypothesis testing based on pool screening with unequal pool sizes

Gao, Hongjiang. January 2010 (has links) (PDF)
Thesis (Ph.D.)--University of Alabama at Birmingham, 2010. / Title from PDF title page (viewed on June 28, 2010). Includes bibliographical references.
9

Constrained Statistical Inference in Regression

Peiris, Thelge Buddika 01 August 2014 (has links)
Regression analysis constitutes a large portion of the statistical repertoire in applications. In cases where such analysis is used for exploratory purposes, with no previous knowledge of the structure, one would not wish to impose any constraints on the problem. But in many applications we are interested in a simple parametric model that describes the structure of a system about which we have some prior knowledge. An important example occurs when the experimenter strongly believes that the regression function changes monotonically in some or all of the predictor variables in a region of interest. The analyses needed for statistical inference under such constraints are non-standard. The specific aim of this study is to introduce a technique that can be used for statistical inference in a multivariate simple regression with such non-standard constraints.
10

Efficient Numerical Inversion for Financial Simulations

Derflinger, Gerhard, Hörmann, Wolfgang, Leydold, Josef, Sak, Halis January 2009 (has links) (PDF)
Generating samples from generalized hyperbolic distributions and non-central chi-square distributions by inversion has become an important task for the simulation of recent models in finance in the framework of (quasi-) Monte Carlo. However, their distribution functions are quite expensive to evaluate and thus numerical methods like root finding algorithms are extremely slow. In this paper we demonstrate how our new method based on Newton interpolation and Gauss-Lobatto quadrature can be utilized for financial applications. Its fast marginal generation times make it competitive, even for situations where the parameters are not always constant. / Series: Research Report Series / Department of Statistics and Mathematics
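For context, the slow baseline the paper improves on can be sketched as straightforward root finding on the expensive distribution function; the Newton-interpolation and Gauss-Lobatto quadrature method itself is not reproduced here. A minimal SciPy sketch for the non-central chi-square case, with hypothetical parameters:

```python
import numpy as np
from scipy import stats
from scipy.optimize import brentq

def ncx2_inverse_by_rootfinding(u, df, nc):
    """Invert the non-central chi-square CDF at u by bracketing plus Brent's method.
    Each call evaluates the expensive CDF many times, which is what makes this
    baseline slow in a (quasi-) Monte Carlo setting with millions of draws."""
    lo, hi = 0.0, df + nc + 1.0
    while stats.ncx2.cdf(hi, df, nc) < u:   # expand the bracket until it contains u
        hi *= 2.0
    return brentq(lambda x: stats.ncx2.cdf(x, df, nc) - u, lo, hi)

# Transform a (quasi-)uniform sequence into non-central chi-square variates
df, nc = 4.0, 2.5                      # hypothetical model parameters
u = (np.arange(1, 9) - 0.5) / 8        # simple evenly spaced grid in (0, 1)
samples = [ncx2_inverse_by_rootfinding(ui, df, nc) for ui in u]
print(np.round(samples, 4))
# Cross-check against SciPy's own quantile function
print(np.round(stats.ncx2.ppf(u, df, nc), 4))
```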
