1

Effects of management errors on construction projects

Wantanakorn, Danai January 2000 (has links)
No description available.
2

A Predictive Model for Multi-Band Optical Tracking System (MBOTS) Performance

Horii, M. Michael 10 1900 (has links)
ITC/USA 2013 Conference Proceedings / The Forty-Ninth Annual International Telemetering Conference and Technical Exhibition / October 21-24, 2013 / Bally's Hotel & Convention Center, Las Vegas, NV / In the wake of sequestration, Test and Evaluation (T&E) groups across the U.S. are quickly learning to make do with less. For Department of Defense ranges and test facility bases in particular, the timing of sequestration could not be worse. Aging optical tracking systems are in dire need of replacement. What's more, the increasingly challenging missions of today require advanced technology, flexibility, and agility to support an ever-widening spectrum of scenarios, including short-range (0 − 5 km) imaging of launch events, long-range (50 km+) imaging of debris fields, directed energy testing, high-speed tracking, and look-down coverage of ground test scenarios, to name just a few. There is a pressing need for optical tracking systems that can be operated on a limited budget with minimal resources, staff, and maintenance, while simultaneously increasing throughput and data quality. Here we present a mathematical error model to predict system performance. We compare model predictions to site-acceptance test results collected from a pair of multi-band optical tracking systems (MBOTS) fielded at White Sands Missile Range. A radar serves as a point of reference to gauge system results. The calibration data and the triangulation solutions obtained during testing provide a characterization of system performance. The results suggest that the optical tracking system error model adequately predicts system performance, thereby supporting pre-mission analysis and conserving scarce resources for innovation and development of robust solutions. Along the way, we illustrate some methods of time-space-position information (TSPI) data analysis, define metrics for assessing system accuracy, and enumerate error sources impacting measurements. 
We conclude by describing technical challenges ahead and identifying a path forward.
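The error-model comparison above rests on triangulating a target from angular measurements and asking how pointing noise propagates into position error. A minimal Monte Carlo sketch of that idea (not the authors' model; the sensor geometry, target position, and 0.01-degree noise level are invented for illustration):

```python
import numpy as np

def triangulate_2d(p1, p2, az1, az2):
    """Intersect two azimuth rays from 2-D sensor positions p1 and p2."""
    d1 = np.array([np.cos(az1), np.sin(az1)])
    d2 = np.array([np.cos(az2), np.sin(az2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2.
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2) - np.asarray(p1))
    return np.asarray(p1) + t[0] * d1

rng = np.random.default_rng(0)
p1, p2 = np.array([0.0, 0.0]), np.array([1000.0, 0.0])  # 1 km baseline
target = np.array([400.0, 5000.0])                      # ~5 km range
az1 = np.arctan2(target[1] - p1[1], target[0] - p1[0])
az2 = np.arctan2(target[1] - p2[1], target[0] - p2[0])

sigma = np.deg2rad(0.01)  # assumed angular pointing noise per sensor
est = np.array([triangulate_2d(p1, p2,
                               az1 + rng.normal(0, sigma),
                               az2 + rng.normal(0, sigma))
                for _ in range(5000)])
rms = np.sqrt(np.mean(np.sum((est - target) ** 2, axis=1)))
print(f"RMS triangulated position error: {rms:.2f} m")
```

With this geometry the error is dominated by the narrow intersection angle of the two rays, which is why range-to-baseline ratio matters so much for optical TSPI accuracy.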
3

Avaliação de métodos estatísticos na análise de dados de consumo alimentar

Paschoalinotte, Eloisa Elena [UNESP] 17 December 2009 (has links) (PDF)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Universidade Estadual Paulista (UNESP) / Evaluating an individual's or a population's food intake has been a challenge for both nutrition professionals and statisticians, because the central characteristic of food intake is dietary variability, which can generate large between- and within-person variability. To overcome this problem, appropriate statistical methods based on the measurement-error regression model have been developed to obtain the estimated distribution of usual intake. Among the intake evaluation methods are the Iowa State University (ISU) method, the Iowa State University for Foods (ISUF) method, and the National Cancer Institute (NCI) method. All are based on the measurement-error model, incorporating episodic intake (ISUF method) and the possibility of including covariates that can affect the estimated intake distribution (NCI method). For the ISU method, a program called PC-SIDE (Software for Intake Distribution Estimate) was developed; it provides the usual intake distribution as well as the probability of inadequacy for given nutrients according to nutritional recommendations. The same program can estimate the distribution of episodic usual intake given by the ISUF method. For the NCI method, macros were developed in SAS (Statistical Analysis System) that allow covariates to be included and the usual intake distribution to be estimated from the measurement-error model. Hence, this study aimed to evaluate these statistical methodologies for the analysis of food intake data and to apply them to a data set from a nutritional survey of elderly individuals. The fitting methodologies of the proposed models for obtaining the estimated intake distribution based on the ISU, ISUF and NCI methods were studied. The ISU and NCI methods were applied to data from three 24-hour recalls obtained from a study... (Complete abstract: click electronic access below)
4

Avaliação de métodos estatísticos na análise de dados de consumo alimentar

Paschoalinotte, Eloisa Elena. January 2009 (has links)
Advisor: José Eduardo Corrente / Committee: Dirce Maria Lobo Marchioni / Committee: Lídia Raquel de Carvalho / Committee: Liciana Vaz de Arruda Silveira / Committee: Regina Mara Fisberg / Abstract: Evaluating an individual's or a population's food intake has been a challenge for both nutrition professionals and statisticians, because the central characteristic of food intake is dietary variability, which can generate large between- and within-person variability. To overcome this problem, appropriate statistical methods based on the measurement-error regression model have been developed to obtain the estimated distribution of usual intake. Among the intake evaluation methods are the Iowa State University (ISU) method, the Iowa State University for Foods (ISUF) method, and the National Cancer Institute (NCI) method. All are based on the measurement-error model, incorporating episodic intake (ISUF method) and the possibility of including covariates that can affect the estimated intake distribution (NCI method). For the ISU method, a program called PC-SIDE (Software for Intake Distribution Estimate) was developed; it provides the usual intake distribution as well as the probability of inadequacy for given nutrients according to nutritional recommendations. The same program can estimate the distribution of episodic usual intake given by the ISUF method. For the NCI method, macros were developed in SAS (Statistical Analysis System) that allow covariates to be included and the usual intake distribution to be estimated from the measurement-error model. Hence, this study aimed to evaluate these statistical methodologies for the analysis of food intake data and to apply them to a data set from a nutritional survey of elderly individuals. The fitting methodologies of the proposed models for obtaining the estimated intake distribution based on the ISU, ISUF and NCI methods were studied. The ISU and NCI methods were applied to data from three 24-hour recalls obtained from a study... (Complete abstract: click electronic access below) / Master's
5

An error methodology based on surface observations to compute the top of the atmosphere, clear-sky shortwave flux model errors

Anantharaj, Valentine (Valentine Gunasekaran) 01 May 2010 (has links)
Global Climate Models (GCMs) are indispensable tools for modeling climate change projections. Due to approximations, errors are introduced in GCM computations of atmospheric radiation. Existing methodologies for comparing GCM-computed shortwave fluxes (SWF) exiting the top of the atmosphere (TOA) against satellite observations do not separate the model errors into atmospheric and surface components. A new methodology has been developed for estimating the GCM systematic errors in the SWF at the TOA under clear-sky (CS) conditions. The new methodology is based on physical principles and utilizes in-situ measurements of SWF at the surface. This error adjustment methodology (EAM) has been validated by comparing GCM results against satellite measurements from the Clouds and the Earth's Radiant Energy System (CERES) mission. The EAM was implemented in an error estimation model for solar radiation (EEMSR) and then applied to examine the hypothesis that the Community Climate System Model (CCSM), one of the most widely used GCMs, was deficient in representing the annual phenology of vegetation in many areas, and that satellite measurements of vegetation characteristics offered a means to rectify the problem. The CCSM-computed monthly climatologies of TOA CS-SWF were compared to the CERES climatology. Incorporating satellite-derived land surface parameters improved the TOA SWF in many regions. However, for more meaningful interpretation of the comparisons, it was necessary to account for the uncertainties arising from the radiation calculations of CCSM. In-situ measurements from two sites were used by the EAM to relate the observations and model estimates via a predictive equation and derive the errors in TOA CS-SWF for monthly climatologies. The model climatologies were adjusted using the computed error and then compared to the CERES climatology at the two sites. 
The results showed that at one of the sites CCSM consistently overestimated atmospheric transmissivity, whereas at the other site it overestimated during spring, summer and early fall and underestimated during late fall and winter. The bias adjustment using the EAM showed more clearly that utilizing satellite-derived land surface parameters improved the TOA CS-SWF at the two sites.
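The idea of fitting a predictive error equation to observations and then adjusting a model climatology can be sketched in a toy example (this is not the EAM itself; the CERES-like "truth", the model's seasonal bias, and the harmonic predictors are all synthetic assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
months = np.arange(12)
# Hypothetical monthly TOA clear-sky SWF climatologies (W m^-2):
# a satellite reference and a model with a seasonally varying bias plus noise.
ceres = 220 + 60 * np.sin(2 * np.pi * months / 12)
model = ceres + 15 + 8 * np.cos(2 * np.pi * months / 12) + rng.normal(0, 2, 12)

# Fit a predictive equation for the model error using annual harmonics,
# then subtract the predicted error from the model climatology.
X = np.column_stack([np.ones(12),
                     np.sin(2 * np.pi * months / 12),
                     np.cos(2 * np.pi * months / 12)])
coef, *_ = np.linalg.lstsq(X, model - ceres, rcond=None)
adjusted = model - X @ coef

print(f"RMS error before adjustment: {np.sqrt(np.mean((model - ceres) ** 2)):.1f} W/m^2")
print(f"RMS error after adjustment:  {np.sqrt(np.mean((adjusted - ceres) ** 2)):.1f} W/m^2")
```

Removing the predictable (systematic) part of the error leaves only the noise floor, which is what makes the remaining differences interpretable as genuine model-observation disagreement.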
6

Structural adaptive models in financial econometrics

Mihoci, Andrija 05 October 2012 (has links)
Modern statistical and econometric methods successfully deal with the stylized facts observed on financial markets. The techniques presented here aim to capture the dynamics of financial market data more accurately than traditional approaches; economic and financial benefits are achievable. The results are evaluated in practical examples that focus mainly on forecasting financial data. Our applications include: (i) modelling and forecasting liquidity supply, (ii) localizing multiplicative error models, and (iii) providing evidence for the empirical pricing kernel paradox across countries.
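The multiplicative error model (MEM) of application (ii) can be sketched in a few lines: a non-negative series is the product of a conditional mean that evolves autoregressively and a unit-mean innovation. The parameters and exponential innovation below are illustrative assumptions, not estimates from the thesis:

```python
import numpy as np

rng = np.random.default_rng(3)
omega, alpha, beta = 0.1, 0.2, 0.7   # assumed MEM(1,1) parameters
T = 100_000

x = np.empty(T)
mu = omega / (1 - alpha - beta)      # start at the unconditional mean
for t in range(T):
    eps = rng.exponential(1.0)       # unit-mean multiplicative innovation
    x[t] = mu * eps                  # observed non-negative series (e.g. volume)
    mu = omega + alpha * x[t] + beta * mu  # GARCH-like mean recursion

print(f"sample mean {x.mean():.3f} vs implied mean {omega / (1 - alpha - beta):.3f}")
```

Because the innovation has unit mean, the unconditional mean of the series is omega/(1-alpha-beta), mirroring the familiar GARCH variance formula; "localizing" such a model, as in the thesis, amounts to re-estimating these parameters over adaptively chosen windows.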
7

Investigating methods to improve sensitivity of the Apparent Diffusion Coefficient, a potential imaging biomarker of treatment response, for patients with colorectal liver metastasis

Pathak, Ryan January 2018 (has links)
Radiological imaging already has a key role in the detection and management of patients with metastatic colorectal cancer (mCRC). With the evolution of personalised medicine there is a need for non-invasive imaging biomarkers that can detect early tumour response to targeted therapies. Translation from bench to bedside requires a multicentre approach that follows an agreed development roadmap to ensure that the proposed biomarker is precise (reproducible/repeatable) and accurate in its characterisation of a meaningful physiological, pathological or post-treatment response. This thesis (organised in the alternative format, with experimental studies written as individual complete manuscripts) investigates methods to improve the precision and accuracy of the Apparent Diffusion Coefficient (ADC), a proposed quantitative imaging biomarker with a potential role in characterising post-treatment response in mCRC. The first objective was to establish baseline multicentre reproducibility (n=20) for ADC: a change in ADC greater than 21.1% was required to declare a post-treatment response. Using a statistical error model, the dominant factors influencing reproducibility were motion artefact and tumour volume. In the second study these factors were addressed using a single-centre cohort with pre- and post-treatment data. Correcting for errors due to motion and tumour volume improved sensitivity from 30.3% to 1.7%, so that a post-treatment response was detected in 6/12 tumours, compared with 0/12 using the baseline approach. In the third study, motion correction was implemented and the statistical error model was applied successfully to a multicentre cohort of 15 patients (1.9% sensitivity). The results of this thesis highlight that, with careful consideration and correction of the factors that negatively influence sensitivity, ADC is a potential imaging biomarker of post-treatment response for patients with mCRC.
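Thresholds like the 21.1% quoted above are conventionally derived from test-retest data via the within-subject coefficient of variation (wCV), with the 95% repeatability threshold being roughly 2.77 × wCV. A sketch on simulated data (the 5% measurement CV and tumour values are assumptions for illustration, not the thesis cohort):

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical test-retest ADC measurements (x1e-3 mm^2/s) for 20 tumours.
true_adc = rng.normal(1.1, 0.15, 20)
scan1 = true_adc * (1 + rng.normal(0, 0.05, 20))  # assumed 5% measurement CV
scan2 = true_adc * (1 + rng.normal(0, 0.05, 20))

# Within-subject coefficient of variation from the paired scans.
pair_mean = (scan1 + scan2) / 2
wcv = np.sqrt(np.mean(((scan1 - scan2) / pair_mean) ** 2) / 2)

# A measured change must exceed this to indicate real change at ~95% confidence.
threshold = 1.96 * np.sqrt(2) * wcv
print(f"wCV = {100 * wcv:.1f}%, change threshold = {100 * threshold:.1f}%")
```

Reducing the wCV (for example by correcting motion artefact, as in the second study) shrinks the threshold directly, which is exactly the "improved sensitivity" the abstract reports.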
8

A Comparison of the Stability of Measures of Personality Traits, Self-esteem, Affective Well-being, and Cognitive Well-being

Anusic, Ivana 24 February 2009 (has links)
A variety of statistical models have been developed to examine longitudinal stability and change of individual differences. Kenny and Zautra's (1995) trait-state-error (TSE) model decomposes stability into stable variance that does not change (trait), moderately stable variance that changes over time (state), and error variance. Applications of this model have been limited to panel studies with repeated observations of the same individuals. The present study developed a non-linear regression model to apply the TSE model to retest correlations from different samples. This model was used to compare the stability of measures of personality traits, affective well-being (AWB), cognitive well-being (CWB), and self-esteem. After correcting for differences in reliability, age, gender, and scale length, the amount of trait (vs. state) variance was similar across constructs. Stability of the state component was highest for AWB and CWB, suggesting that situational influences have the most enduring effects on these two constructs.
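The non-linear regression idea, fitting a TSE-style curve to retest correlations collected from different samples, can be sketched with one common parameterization: a constant trait component plus a state component whose contribution decays geometrically with retest interval. The correlations below are illustrative, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def retest_corr(lag, trait, state, phi):
    """Predicted retest correlation: stable trait part plus decaying state part."""
    return trait + state * phi ** lag

lags = np.array([1, 2, 4, 6, 8, 10], dtype=float)       # years between tests
obs_r = np.array([0.72, 0.66, 0.58, 0.54, 0.52, 0.51])  # illustrative values

params, _ = curve_fit(retest_corr, lags, obs_r,
                      p0=[0.4, 0.4, 0.7], bounds=(0, 1))
trait, state, phi = params
print(f"trait={trait:.2f} state={state:.2f} phi={phi:.2f}")
```

The fitted asymptote (trait) is the proportion of variance that never decays: retest correlations that level off above zero at long lags are the signature of a stable trait component.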
10

Deconvolution in Random Effects Models via Normal Mixtures

Litton, Nathaniel A. 2009 August 1900 (has links)
This dissertation describes a minimum distance method for density estimation when the variable of interest is not directly observed. It is assumed that the underlying target density can be well approximated by a mixture of normals. The method compares a density estimate of observable data with a density of the observable data induced from assuming the target density can be written as a mixture of normals. The goal is to choose the parameters in the normal mixture that minimize the distance between the density estimate of the observable data and the induced density from the model. The method is applied to the deconvolution problem to estimate the density of $X_{i}$ when the variable $Y_{i}=X_{i}+Z_{i}$, $i=1,\ldots,n$, is observed and the density of $Z_{i}$ is known. Additionally, it is applied to a location random effects model to estimate the density of $Z_{ij}$ when the observable quantities are $p$ data sets of size $n$ given by $X_{ij}=\alpha_{i}+\gamma Z_{ij}$, $i=1,\ldots,p$, $j=1,\ldots,n$, where the densities of $\alpha_{i}$ and $Z_{ij}$ are both unknown. The performance of the minimum distance approach in the measurement error model is compared with the deconvoluting kernel density estimator of Stefanski and Carroll (1990). In the location random effects model, the minimum distance estimator is compared with the explicit characteristic function inversion method from Hall and Yao (2003). In both models, the methods are compared using simulated and real data sets. In the simulations, performance is evaluated using an integrated squared error criterion. Results indicate that the minimum distance methodology is comparable to the deconvoluting kernel density estimator and outperforms the explicit characteristic function inversion method.
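The minimum distance idea can be sketched for the deconvolution case $Y_i = X_i + Z_i$ with $Z_i \sim N(0, \sigma_Z^2)$ known: a normal mixture for $X$ induces a normal mixture for $Y$ with variances inflated by $\sigma_Z^2$, and the mixture parameters are chosen to minimize the integrated squared distance to a kernel density estimate of the observed $Y$. This is a simplified two-component sketch on simulated data, not the dissertation's implementation:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(5)
# Target X: two-component normal mixture; Z: known N(0, 0.5^2) noise.
n, sigma_z = 2000, 0.5
comp = rng.random(n) < 0.5
x = np.where(comp, rng.normal(-2, 0.7, n), rng.normal(2, 0.7, n))
y = x + rng.normal(0, sigma_z, n)

grid = np.linspace(-6, 6, 200)
dx = grid[1] - grid[0]
kde_y = gaussian_kde(y)(grid)          # density estimate of the observable Y

def induced_y_density(params):
    # Mixture for X convolved with the known noise: variances add.
    w, m1, s1, m2, s2 = params
    return (w * norm.pdf(grid, m1, np.hypot(s1, sigma_z))
            + (1 - w) * norm.pdf(grid, m2, np.hypot(s2, sigma_z)))

def ise(params):
    # Integrated squared distance between KDE and model-induced density.
    return np.sum((kde_y - induced_y_density(params)) ** 2) * dx

res = minimize(ise, x0=[0.5, -1.0, 1.0, 1.0, 1.0],
               bounds=[(0.05, 0.95), (-6, 6), (0.1, 3), (-6, 6), (0.1, 3)])
w, m1, s1, m2, s2 = res.x
print(f"fitted means {m1:.2f}, {m2:.2f}; sds {s1:.2f}, {s2:.2f}; weight {w:.2f}")
```

The recovered component means sit near the true values of ±2 even though only the noisy $Y$ was observed; the fitted component standard deviations are somewhat inflated by the KDE's own smoothing, one of the trade-offs the dissertation's simulations quantify with the integrated squared error criterion.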
