  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Begging the question : permanent income and social mobility

Muller, Seán Mfundza January 2007 (has links)
Includes abstract. Includes bibliographical references (p. 35-37).
12

Measurement Error in Designed Experiments for Second Order Models

McMahan, Angela Renee 11 April 1997 (has links)
Measurement error (ME) in the factor levels of designed experiments is often overlooked in the planning and analysis of experimental designs. A familiar model for this type of ME, called the Berkson error model, is discussed at length. Previous research has examined the effect of Berkson error on two-level factorial and fractional factorial designs. This dissertation extends the examination to designs for second order models. The results are used to suggest optimal values for axial points in Central Composite Designs. The proper analysis for experimental data including ME is outlined for first and second order models. A comparison of this analysis to a typical Ordinary Least Squares analysis is made for second order models. The comparison is used to quantify the difference in performance of the two methods, both of which yield unbiased coefficient estimates. Robustness to misspecification of the ME variance is also explored. A solution for experimental planning is also suggested. A design optimality criterion, called the DME criterion, is used to create a second-stage design when ME is present. The performance of the criterion is compared to a D-optimal design augmentation. A final comparison is made between methods accounting for ME and methods ignoring ME. / Ph. D.
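Under the Berkson model described in this abstract, the realized factor level equals the nominal design setting plus unobserved noise, so in a second-order model a naive analysis on the nominal levels shifts the intercept by b2*sigma_u^2 while leaving the linear and quadratic coefficients roughly unbiased. A minimal simulation sketch of that effect (illustrative only, not the dissertation's analysis; all values are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
b0, b1, b2 = 1.0, 2.0, -0.5          # true second-order coefficients
sigma_u, sigma_e = 0.3, 0.1          # Berkson ME and response error SDs

# Nominal (designed) factor levels; under Berkson error the realized
# level is the nominal setting plus unobserved noise.
x_nominal = np.tile(np.array([-1.0, 0.0, 1.0]), 200)
x_true = x_nominal + rng.normal(0, sigma_u, x_nominal.size)
y = b0 + b1 * x_true + b2 * x_true**2 + rng.normal(0, sigma_e, x_true.size)

# Naive OLS on the nominal levels (what a typical analysis would do)
X = np.column_stack([np.ones_like(x_nominal), x_nominal, x_nominal**2])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
# Linear and quadratic terms ~unbiased; intercept absorbs b2 * sigma_u^2
print(beta_hat)
```

The design here is a three-level factorial, not a Central Composite Design, but the same intercept shift drives the axial-point recommendations discussed in the abstract.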
13

Avaliação de métodos estatísticos na análise de dados de consumo alimentar

Paschoalinotte, Eloisa Elena [UNESP] 17 December 2009 (has links) (PDF)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Universidade Estadual Paulista (UNESP) / Evaluating an individual's or a population's food intake has been a challenge for both nutrition professionals and statisticians, because the central characteristic of food intake is dietary variability, which can generate large between- and within-person variability. To overcome this problem, appropriate statistical methods have been developed based on the measurement-error regression model so as to obtain the estimated distribution of usual intake. Among the intake evaluation methods are the Iowa State University (ISU) method, the Iowa State University for Foods (ISUF) method and the National Cancer Institute (NCI) method. All of them are based on the measurement-error model, incorporating episodic intake (ISUF method) and the possibility of including covariates that can affect the estimated intake distribution (NCI method). For the ISU method, a program called PC-SIDE (Software for Intake Distribution Estimate) was developed, which provides the usual intake distribution as well as the probability of inadequacy for certain nutrients according to nutritional recommendations. With the same program it is possible to obtain the distribution of episodic usual intake, given by the ISUF method. For the NCI method, macros were developed in the SAS (Statistical Analysis System) software that allow covariates to be included and the usual intake distribution to be estimated based on the measurement-error model. Hence, this study aimed at evaluating these statistical methodologies in the analysis of food intake data and at applying them to a data set from a nutritional assessment of elderly individuals. The fitting methodologies for the models proposed to obtain the estimated intake distribution based on the ISU, ISUF and NCI methods were studied. The ISU and NCI methods were applied to data from three 24-hour recalls obtained from a study... (Complete abstract: click electronic access below)
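The ISU/NCI family of methods rests on separating within-person (day-to-day) variance from between-person variance in repeated 24-hour recalls, then shrinking observed person means toward the population mean. A minimal sketch of that core idea with simulated recalls and a simple normal variance-components shrinkage (not the actual PC-SIDE or NCI macro implementations, which also handle transformations and episodic consumption; all numbers are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
n_people, n_days = 500, 3
usual = rng.normal(2000, 300, n_people)                # usual (habitual) intake, kcal
recalls = usual[:, None] + rng.normal(0, 500, (n_people, n_days))  # daily 24h recalls

person_mean = recalls.mean(axis=1)
grand_mean = person_mean.mean()

# Variance components: within-person (day-to-day) and between-person
s2_within = recalls.var(axis=1, ddof=1).mean()
s2_between = person_mean.var(ddof=1) - s2_within / n_days

# Shrink person means toward the grand mean; the shrinkage factor is the
# reliability of a mean of n_days recalls as a measure of usual intake
shrink = s2_between / (s2_between + s2_within / n_days)
usual_hat = grand_mean + shrink * (person_mean - grand_mean)

# The shrunken distribution is narrower than the raw person-mean
# distribution, which is inflated by day-to-day variability
print(person_mean.std(), usual_hat.std())
```

The estimated usual-intake distribution can then be compared against a nutrient recommendation to estimate the proportion of the population with inadequate intake, as PC-SIDE does.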
14

Topics in measurement error and missing data problems

Liu, Lian 15 May 2009 (has links)
No description available.
15

Efficient Small Area Estimation in the Presence of Measurement Error in Covariates

Singh, Trijya 2011 August 1900 (has links)
Small area estimation is an arena that has seen rapid development in the past 50 years, due to its widespread applicability in government projects, marketing research and many other areas. However, it is often difficult to obtain error-free data for this purpose. In this dissertation, each project describes a model used for small area estimation in which the covariates are measured with error. We applied different methods of bias correction to improve the estimates of the parameter of interest in the small areas. There is a variety of methods available for bias correction of estimates in the presence of measurement error. We applied the simulation extrapolation (SIMEX), ordinary corrected scores and Monte Carlo corrected scores methods of bias correction in the Fay-Herriot model, and investigated the performance of the bias-corrected estimators. The performance of the estimators in the presence of non-normal measurement error and of the SIMEX estimator in the presence of non-additive measurement error was also studied. For each of these situations, we presented simulation studies to observe the performance of the proposed correction procedures. In addition, we applied our proposed methodology to analyze a real life, nontrivial data set and present the results. We showed that the Lohr-Ybarra estimator is slightly inefficient and that applying methods of bias correction like SIMEX, corrected scores or Monte Carlo corrected scores (MCCS) increases the efficiency of the small area estimates. In particular, we showed that the simulation based bias correction methods like SIMEX and MCCS provide a greater gain in efficiency. We also showed that the SIMEX method of bias correction is robust with respect to departures from normality or additivity of measurement error. We showed that the MCCS method is robust with respect to departure from normality of measurement error.
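SIMEX, one of the bias-correction methods compared in this abstract, has a simple generic form: deliberately add extra measurement error at several inflation levels, track how the naive estimate degrades, and extrapolate the trend back to the no-error case (lambda = -1). A minimal sketch for a scalar linear model with classical additive error (not the Fay-Herriot small-area setting of the dissertation; all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta, sigma_u = 2000, 1.0, 0.5
x = rng.normal(0, 1, n)
w = x + rng.normal(0, sigma_u, n)     # observed covariate with classical ME
y = beta * x + rng.normal(0, 0.3, n)

def naive_slope(w, y):
    """OLS slope ignoring measurement error (attenuated toward zero)."""
    return np.polyfit(w, y, 1)[0]

naive = naive_slope(w, y)

# Simulation step: inflate the ME variance by a factor (1 + lambda),
# averaging the naive estimate over replicates at each level
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = []
for lam in lambdas:
    reps = [naive_slope(w + rng.normal(0, sigma_u * np.sqrt(lam), n), y)
            for _ in range(50)]
    est.append(np.mean(reps))

# Extrapolation step: fit a quadratic in lambda, evaluate at lambda = -1
coef = np.polyfit(lambdas, est, 2)
beta_simex = np.polyval(coef, -1.0)
print(naive, beta_simex)   # SIMEX estimate moves back toward the true slope
```

The extrapolation function is an approximation (here quadratic), which is why SIMEX reduces rather than fully removes attenuation bias; its robustness to non-normal and non-additive error is what the dissertation investigates.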
17

Measurement error in environmental exposures: Statistical implications for spatial air pollution models and gene environment interaction tests

Ackerman-Alexeeff, Stacey Elizabeth 15 October 2013 (has links)
Measurement error is an important issue in studies of environmental epidemiology. We consider the effects of measurement error in environmental covariates in several settings that bear on current public health research. Throughout this dissertation, we investigate the impacts of measurement error and develop statistical methodology to correct for it.
18

Efficient Semiparametric Estimators for Biological, Genetic, and Measurement Error Applications

Garcia, Tanya 2011 August 1900 (has links)
Many statistical models, like measurement error models, a general class of survival models, and a mixture data model with random censoring, are semiparametric where interest lies in estimating finite-dimensional parameters in the presence of infinite-dimensional nuisance parameters. Developing efficient estimators for the parameters of interest in these models is important because such estimators provide better inferences. For a general regression model with measurement error, we utilize semiparametric theory to develop an unprecedented estimation procedure which delivers consistent estimators even when the model error and latent variable distributions are misspecified. Until now, root-$n$ consistent estimators for this setting were not attainable except for special cases, like a polynomial relationship between the response and mismeasured variables. Through simulation studies and a nutrition study application, we demonstrate that our method outperforms existing methods which ignore measurement error or require a correct model error distribution. In randomized clinical trials, scientists often compare two-sample survival data with a log-rank test. The two groups typically have nonproportional hazards, however, and using a log rank test results in substantial power loss. To ameliorate this issue and improve model efficiency, we propose a model-free strategy of incorporating auxiliary covariates in a general class of survival models. Our approach produces an unbiased, asymptotically normal estimator with significant efficiency gains over current methods. Lastly, we apply semiparametric theory to mixture data models common in kin-cohort designs of Huntington's disease where interest lies in comparing the estimated age-at-death distributions for disease gene carriers and non-carriers. 
The distribution of the observed, possibly censored, outcome is a mixture of the genotype-specific distributions, where the mixing proportions are computed from the genotypes, which are independent of the trait outcomes. Current methods for such data include a Cox proportional hazards model, which is susceptible to model misspecification, and two types of nonparametric maximum likelihood estimators, which are either inefficient or inconsistent. Using semiparametric theory, we propose an inverse probability weighting (IPW) estimator, a nonparametrically imputed estimator and an optimal augmented IPW estimator, which provide more reasonable estimates of the age-at-death distributions and are susceptible neither to model misspecification nor to poor efficiency.
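The inverse probability weighting idea used in the last project can be illustrated in a stripped-down censored-data setting: weight each observed event by the inverse probability of remaining uncensored at its event time. A hedged sketch with a known exponential censoring distribution (in practice the censoring survival function would be estimated, e.g. by Kaplan-Meier, and the dissertation's mixture/genotype structure is omitted here):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
t = rng.exponential(2.0, n)              # event times of interest (true mean 2.0)
c = rng.exponential(4.0, n)              # independent censoring times
x = np.minimum(t, c)                     # observed follow-up time
delta = (t <= c).astype(float)           # 1 = event observed, 0 = censored

# Complete-case mean conditions on being uncensored, which favors early
# events, so it is biased downward
mean_naive = x[delta == 1].mean()

# IPW: reweight observed events by 1 / P(C > t); here the censoring
# survival function exp(-t/4) is known by construction
weights = delta / np.exp(-x / 4.0)
mean_ipw = np.mean(weights * x)
print(mean_naive, mean_ipw)   # naive underestimates; IPW recovers ~2.0
```

The augmented IPW estimator mentioned in the abstract adds a regression-based correction term to this weighting, improving efficiency and giving protection against misspecifying either the weights or the regression.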
19

Avaliação de métodos estatísticos na análise de dados de consumo alimentar

Paschoalinotte, Eloisa Elena. January 2009 (has links)
Advisor: José Eduardo Corrente / Committee: Dirce Maria Lobo Marchioni / Committee: Lídia Raquel de Carvalho / Committee: Liciana Vaz de Arruda Silveira / Committee: Regina Mara Fisberg / Abstract: Evaluating an individual's or a population's food intake has been a challenge for both nutrition professionals and statisticians, because the central characteristic of food intake is dietary variability, which can generate large between- and within-person variability. To overcome this problem, appropriate statistical methods have been developed based on the measurement-error regression model so as to obtain the estimated distribution of usual intake. Among the intake evaluation methods are the Iowa State University (ISU) method, the Iowa State University for Foods (ISUF) method and the National Cancer Institute (NCI) method. All of them are based on the measurement-error model, incorporating episodic intake (ISUF method) and the possibility of including covariates that can affect the estimated intake distribution (NCI method). For the ISU method, a program called PC-SIDE (Software for Intake Distribution Estimate) was developed, which provides the usual intake distribution as well as the probability of inadequacy for certain nutrients according to nutritional recommendations. With the same program it is possible to obtain the distribution of episodic usual intake, given by the ISUF method. For the NCI method, macros were developed in the SAS (Statistical Analysis System) software that allow covariates to be included and the usual intake distribution to be estimated based on the measurement-error model. Hence, this study aimed at evaluating these statistical methodologies in the analysis of food intake data and at applying them to a data set from a nutritional assessment of elderly individuals. The fitting methodologies for the models proposed to obtain the estimated intake distribution based on the ISU, ISUF and NCI methods were studied. The ISU and NCI methods were applied to data from three 24-hour recalls obtained from a study... (Complete abstract: click electronic access below) / Master's
20

Avaliação do erro de reprodutibilidade dos valores cefalométricos aplicados na filosofia Tweed-Merrifield, pelos métodos computadorizado e convencional

Albuquerque Junior, Haroldo Rodrigues de 25 October 1996 (has links)
Advisor: Maria Helena Castro de Almeida / Master's dissertation - Universidade Estadual de Campinas, Faculdade de Odontologia de Piracicaba / A study was carried out to evaluate the reproducibility error of the cephalometric values used in the Tweed-Merrifield philosophy, by both the computerized and the conventional method. A sample of thirty lateral cephalometric radiographs from the records of the graduate orthodontics program at FOP-UNICAMP was analyzed. Two operators performed the tracings and measurements at two moments, with a thirty-day interval between each set of tracings. To evaluate the systematic effect of the factors under study, an analysis of variance was performed for each variable investigated, taking into account the dependence generated by taking several measurements on each patient's radiographs, and Student's t-tests were also performed. To estimate the random errors, the error variance was determined by Dahlberg's formula for each combination of method and operator, and for each measure. As an alternative way of assessing the contribution of random errors to the reproducibility of the measures, the coefficient of reliability was calculated. The investigation showed that errors in radiographic cephalometrics inevitably occur, with a significant interference of the operator factor in the reproducibility of the measures. The computerized method was reliable, presenting smaller error variances than the conventional method. The measures FMIA and IMPA showed the greatest possibilities of error, making the replication of tracings essential, within the philosophy under study, for safe decision-making. / Master's / Orthodontics / Master's in Dentistry
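Dahlberg's error variance and the coefficient of reliability used in this study have simple closed forms for duplicate measurements: s_e^2 = sum(d_i^2) / (2n), where d_i is the difference between the two determinations of measurement i, and reliability = 1 - s_e^2 / s_total^2. A minimal sketch with hypothetical duplicate cephalometric angles (the values below are illustrative, not from the study):

```python
import numpy as np

# Duplicate determinations of the same measurement (hypothetical, degrees)
first  = np.array([24.1, 27.3, 22.8, 30.5, 25.0, 28.2, 23.7, 26.4])
second = np.array([24.6, 26.9, 23.5, 30.1, 25.8, 27.6, 24.0, 26.0])

d = first - second
n = d.size
dahlberg_var = (d ** 2).sum() / (2 * n)          # Dahlberg's error variance
dahlberg_se = np.sqrt(dahlberg_var)              # method error in degrees

# Reliability: fraction of total variance not attributable to method error
total_var = np.concatenate([first, second]).var(ddof=1)
reliability = 1 - dahlberg_var / total_var
print(dahlberg_se, reliability)
```

A reliability near 1 means random tracing error is small relative to true between-patient variation; measurements like FMIA and IMPA, with larger Dahlberg error, would show lower reliability and thus call for replicated tracings.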
