  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Using Verbal Protocol Analysis to Explore Canadian Consumers' Comprehension of the Nutrition Facts Table

French, Laura J 13 August 2012 (has links)
The current study compared participants’ ability to perform tasks using two nutrition labels: a control Nutrition Facts table in the current Canadian format (n=64) and an experimental label (n=64), identical to the control except for a footnote explaining how to interpret percent daily values. A 25% subset of participants answered questions using a think-aloud technique, and the data were analyzed using content analysis. The main outcome measured was the ability to interpret percentages correctly, with the ability to compare, define, and manipulate information as secondary outcomes. No significant differences between the experimental and control conditions were seen for any outcome. As determined by chi-square tests, higher performance was associated with higher education, being male, and reported previous use of the Nutrition Facts table. Verbal protocol analysis identified that interpretation of percentages was based on the meal, the food type, and comparison to other foods.
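The chi-square association tests mentioned in the abstract can be sketched with a plain Pearson chi-square test of independence; the 2x2 table below is invented for illustration and is not the study's data.

```python
# Pearson chi-square test of independence on a hypothetical 2x2 table
# (education level x task performance). Counts are invented, NOT the study's.

def chi_square_independence(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = (university, no university), cols = (passed, failed)
table = [[40, 10], [25, 25]]
stat = chi_square_independence(table)
# df = (2-1)*(2-1) = 1; the 5% critical value for chi-square(1) is about 3.841
print(f"chi-square = {stat:.3f}, reject at 5% level: {stat > 3.841}")
```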
42

SOME CONTRIBUTIONS TO THE CENSORED EMPIRICAL LIKELIHOOD WITH HAZARD-TYPE CONSTRAINTS

Hu, Yanling 01 January 2011 (has links)
Empirical likelihood (EL) is a recently developed nonparametric method of statistical inference. Owen’s 2001 book contains many important results for EL with uncensored data, but fewer results are available for EL with right-censored data. In this dissertation, we first investigate a right-censored-data extension of Qin and Lawless (1994), who studied EL with uncensored data when the number of estimating equations is larger than the number of parameters (the over-determined case). We obtain results similar to theirs for the maximum EL estimator and the EL ratio test in the over-determined case with right-censored data, employing hazard-type constraints, which are better suited to right-censored data. We then investigate EL with right-censored data and a k-sample mixed hazard-type constraint, and show that the EL ratio test statistic has a limiting chi-square distribution when k = 2. We also study the relationship between the constrained Kaplan-Meier estimator and the corresponding Nelson-Aalen estimator and attempt to prove that they are asymptotically equivalent under certain conditions. Finally, we present simulation studies and examples showing how to apply our theory and methodology to real data.
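The Kaplan-Meier/Nelson-Aalen relationship mentioned above can be illustrated with a small pure-Python sketch. The censored toy data are invented, and the identity shown is the standard asymptotic approximation S_KM(t) ≈ exp(-H_NA(t)) for the unconstrained estimators, not the dissertation's constrained versions.

```python
import math

# Kaplan-Meier survival estimator vs Nelson-Aalen cumulative-hazard estimator
# on right-censored data. Asymptotically S_KM(t) ~ exp(-H_NA(t)).

def km_and_na(times, events):
    """Return (S_KM, H_NA) evaluated after the last observation.

    times  : observed times (event or censoring)
    events : 1 if the time is an observed event, 0 if right-censored
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s_km, h_na = 1.0, 0.0
    for i in order:
        if events[i] == 1:
            s_km *= 1.0 - 1.0 / at_risk   # KM: product of conditional survival
            h_na += 1.0 / at_risk         # NA: sum of hazard increments
        at_risk -= 1
    return s_km, h_na

times  = [2, 3, 3, 5, 6, 7, 8, 9, 11, 12]   # invented toy data
events = [1, 1, 0, 1, 1, 0, 1, 1, 0, 0]     # 0 = right-censored
s, h = km_and_na(times, events)
print(f"KM survival: {s:.4f}, exp(-Nelson-Aalen): {math.exp(-h):.4f}")
```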
43

Contaminated Chi-square Modeling and Its Application in Microarray Data Analysis

Zhou, Feng 01 January 2014 (has links)
Mixture modeling has numerous applications; one of particular interest is microarray data analysis. My dissertation research focuses on the Contaminated Chi-Square (CCS) model and its application to microarrays. A moment-based method and two likelihood-based methods, the Modified Likelihood Ratio Test (MLRT) and the Expectation-Maximization (EM) test, are developed for testing the omnibus null hypothesis of no contamination of a central chi-square distribution by a non-central chi-square distribution. When the omnibus null hypothesis is rejected, we further develop the moment-based test and the EM test for testing for an extra component in the Contaminated Chi-Square model (CCS+EC). The moment-based approach is simple, and there is no need for re-sampling or random field theory to obtain critical values. When the statistical models are complicated, such as large mixtures of dimensional distributions, the MLRT and EM tests may have better power than moment-based approaches, and the MLRT and EM tests developed herein enjoy an elegant asymptotic theory.
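A minimal simulation of a contaminated chi-square model, assuming the common parameterization (1−p)·χ²_k + p·χ²_k(λ); under that assumption the moment identity E[X] = k + p·λ, which motivates moment-based testing, can be checked empirically. The parameters below are arbitrary illustrative choices.

```python
import random

# Draw from a contaminated chi-square model: with probability p the observation
# comes from a noncentral chi-square (noncentrality lam), otherwise from a
# central chi-square, both with k degrees of freedom. A noncentral chi-square
# is a sum of squared normals whose means mu_i satisfy sum(mu_i^2) = lam.

def ccs_sample(k, p, lam, rng):
    # put all the noncentrality into one coordinate when contaminated
    mu = lam ** 0.5 if rng.random() < p else 0.0
    total = rng.gauss(mu, 1.0) ** 2
    for _ in range(k - 1):
        total += rng.gauss(0.0, 1.0) ** 2
    return total

rng = random.Random(0)
k, p, lam = 4, 0.3, 6.0
n = 200_000
mean = sum(ccs_sample(k, p, lam, rng) for _ in range(n)) / n
print(f"sample mean {mean:.3f} vs theoretical k + p*lam = {k + p * lam:.3f}")
```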
44

STATISTICAL MODELS FOR CONSTANT FALSE-ALARM RATE THRESHOLD ESTIMATION IN SOUND SOURCE DETECTION SYSTEMS

Saghaian Nejad Esfahani, Sayed Mahdi 01 January 2010 (has links)
Constant False Alarm Rate (CFAR) processors are important for applications where thousands of detection tests are made per second, such as in radar. This thesis introduces a new method for CFAR threshold estimation that is particularly applicable to sound source detection with distributed microphone systems. The novel CFAR processor exploits the near symmetry about 0 of the acoustic pixel values created by steered-response coherent power, in conjunction with a partial whitening preprocessor, to estimate thresholds for positive values, which represent potential targets. To remove the low-frequency components responsible for degrading CFAR performance, fixed and adaptive high-pass filters are applied. A relation between the minimum high-pass cut-off frequency and the microphone geometry is proposed and tested. Experimental results for linear, perimeter, and planar arrays illustrate that for desired false alarm (FA) probabilities ranging from 10^-1 to 10^-6, good CFAR performance can be achieved by modeling the coherent power with chi-square and Weibull distributions, and the ratio of desired to experimental FA probabilities can be kept within an order of magnitude.
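The symmetry idea can be sketched as follows, with Gaussian noise as a stand-in for the actual coherent-power statistic: mirror the (target-free) negative pixel values to estimate the noise tail, then set a threshold for a desired FA probability.

```python
import random

# CFAR threshold from the mirrored negative side of a distribution that is
# roughly symmetric about 0 under noise only. Negative pixels contain no
# targets, so they estimate the noise distribution; by symmetry,
# P(X > t) = p_fa  <=>  P(X > t | X > 0) = 2 * p_fa.

def cfar_threshold(pixels, p_fa):
    """Threshold t with P(pixel > t) ~= p_fa under the noise-only model."""
    mirrored = sorted(-x for x in pixels if x < 0)  # target-free noise magnitudes
    idx = int((1.0 - 2.0 * p_fa) * len(mirrored))   # upper-tail empirical quantile
    return mirrored[idx]

rng = random.Random(1)
noise = [rng.gauss(0.0, 1.0) for _ in range(100_000)]  # stand-in noise pixels
t = cfar_threshold(noise, 0.05)
fa_rate = sum(x > t for x in noise) / len(noise)
print(f"threshold {t:.3f}, empirical FA rate {fa_rate:.4f}")
```

For standard normal noise the 5% threshold should land near the 95th-percentile value of about 1.645.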
45

Developing an assessment tool for measuring total quality management in SASOL's Steam Station Plant / L.E. Amorighoye

Amorighoye, Lucky Eyituoyo January 2009 (has links)
Thesis (M.Ing. (Development and Management Engineering))--North-West University, Potchefstroom Campus, 2009.
47

Methods of calibration for the empirical likelihood ratio

Jiang, Li January 2006 (has links)
This thesis provides several new calibration methods for the empirical log-likelihood ratio. The commonly used Chi-square calibration is based on the limiting distribution of this ratio but consistently suffers from the undercoverage problem. The finite-sample distribution of the empirical log-likelihood ratio is recognized to have a mixture structure, with a continuous component on [0, +∞) and a probability mass at +∞. Consequently, new calibration methods are developed to take advantage of this mixture structure; we propose new calibration methods based on mixture distributions, such as the mixture Chi-square and the mixture Fisher's F distribution. The E distribution introduced in Tsao (2004a) has a natural mixture structure, and the calibration method based on this distribution is considered in great detail. We also discuss methods of estimating the E distributions.
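The probability mass at +∞ has a concrete form in the simplest case: for the mean of an i.i.d. sample, the empirical log-likelihood ratio at μ0 is +∞ exactly when μ0 lies outside the convex hull (min, max) of the data. A quick simulation with invented settings checks the exact mass 2·(1/2)^n for symmetric data with μ0 at the true mean.

```python
import random

# Finite-sample mass at +infinity of the empirical log-likelihood ratio for a
# mean: the ratio at mu0 is +infinity iff mu0 is outside (min(X), max(X)).
# For symmetric data with mu0 at the true mean, P(mass) = 2 * (1/2)^n.

def mass_at_infinity(n, trials, rng):
    hits = 0
    for _ in range(trials):
        xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
        if min(xs) > 0.0 or max(xs) < 0.0:  # mu0 = 0 outside the convex hull
            hits += 1
    return hits / trials

rng = random.Random(2)
n = 5
est = mass_at_infinity(n, 100_000, rng)
print(f"estimated mass {est:.4f} vs exact 2*(1/2)^n = {2 * 0.5 ** n:.4f}")
```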
48

Hypothesis testing based on pool screening with unequal pool sizes

Gao, Hongjiang. January 2010 (has links) (PDF)
Thesis (Ph.D.)--University of Alabama at Birmingham, 2010. / Title from PDF title page (viewed on June 28, 2010). Includes bibliographical references.
49

A comparison of four estimators of a population measure of model misfit in covariance structure analysis

Zhang, Wei. January 2005 (has links)
Thesis (M. A.)--University of Notre Dame, 2005. / Thesis directed by Ke-Hai Yuan for the Department of Psychology. "October 2005." Includes bibliographical references (leaves 60-63).
50

Efficient Numerical Inversion for Financial Simulations

Derflinger, Gerhard, Hörmann, Wolfgang, Leydold, Josef, Sak, Halis January 2009 (has links) (PDF)
Generating samples from generalized hyperbolic distributions and non-central chi-square distributions by inversion has become an important task for the simulation of recent models in finance in the framework of (quasi-) Monte Carlo. However, their distribution functions are quite expensive to evaluate and thus numerical methods like root finding algorithms are extremely slow. In this paper we demonstrate how our new method based on Newton interpolation and Gauss-Lobatto quadrature can be utilized for financial applications. Its fast marginal generation times make it competitive, even for situations where the parameters are not always constant. / Series: Research Report Series / Department of Statistics and Mathematics
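A toy version of inversion by interpolation, with the exponential quantile −ln(1−u) standing in for the expensive non-central chi-square quantile: tabulate the quantile function on Chebyshev-Gauss-Lobatto nodes and evaluate it with Newton's divided differences. The interval, degree, and distribution are illustrative choices, not the paper's actual implementation.

```python
import math

# Tabulate a quantile function once on Gauss-Lobatto (Chebyshev) nodes, then
# evaluate it cheaply via Newton's divided-difference form. The exponential
# quantile -ln(1-u) is an illustrative stand-in for an expensive quantile.

def newton_coeffs(xs, ys):
    """Divided-difference coefficients for Newton-form interpolation."""
    coeffs = list(ys)
    for j in range(1, len(xs)):
        for i in range(len(xs) - 1, j - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - j])
    return coeffs

def newton_eval(xs, coeffs, u):
    """Horner-style evaluation of the Newton-form interpolant at u."""
    acc = coeffs[-1]
    for i in range(len(xs) - 2, -1, -1):
        acc = acc * (u - xs[i]) + coeffs[i]
    return acc

a, b, m = 0.0, 0.9, 16                                  # interval and degree (arbitrary)
nodes = [a + (b - a) * (1 - math.cos(math.pi * j / m)) / 2 for j in range(m + 1)]
values = [-math.log(1.0 - u) for u in nodes]            # exact quantile at the nodes
c = newton_coeffs(nodes, values)

u = 0.5
approx = newton_eval(nodes, c, u)
print(f"interpolated {approx:.6f} vs exact {-math.log(1 - u):.6f}")
```

Once the coefficients are precomputed, each sample costs only one Horner evaluation, which is the source of the fast marginal generation times the abstract refers to.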
