  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Interval Estimation for Binomial Proportion, Poisson Mean, and Negative-Binomial Mean

Liu, Luchen January 2012 (has links)
This paper studies the interval estimation of three discrete distributions: the binomial distribution, the Poisson distribution and the negative-binomial distribution. The problem is the chaotic behavior of the coverage probability for the Wald interval. To solve this problem, alternative confidence intervals are introduced. Coverage probability and expected length are chosen as the criteria for evaluating the intervals. In this paper, I first tested the chaotic behavior of the coverage probability for the Wald interval, and introduced the alternative confidence intervals. Then I calculated the coverage probability and expected length for those intervals, made comparisons, and recommended confidence intervals for the three cases. This paper also discussed the relationship among the three discrete distributions, and in the end illustrated the applications on binomial and Poisson data with brief examples.
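The chaotic coverage behavior described above can be reproduced exactly, since for a binomial proportion the coverage probability is a finite sum of binomial probabilities. The sketch below (function names mine) compares the Wald interval with the Wilson interval, one common alternative; the abstract does not say which alternatives it ultimately recommends.

```python
import math

Z = 1.959964  # standard normal 97.5% quantile

def binom_pmf(x, n, p):
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

def wald(x, n):
    # classic Wald interval: phat +/- z * sqrt(phat(1-phat)/n)
    ph = x / n
    h = Z * math.sqrt(ph * (1 - ph) / n)
    return ph - h, ph + h

def wilson(x, n):
    # Wilson (score) interval: recentred and widened relative to Wald
    ph = x / n
    denom = 1 + Z**2 / n
    centre = (ph + Z**2 / (2 * n)) / denom
    h = Z * math.sqrt((ph * (1 - ph) + Z**2 / (4 * n)) / n) / denom
    return centre - h, centre + h

def coverage(n, p, interval):
    # exact coverage: sum the pmf over outcomes whose interval covers p
    return sum(binom_pmf(x, n, p)
               for x in range(n + 1)
               if interval(x, n)[0] <= p <= interval(x, n)[1])

cov_wald = coverage(30, 0.1, wald)      # dips well below the nominal 0.95
cov_wilson = coverage(30, 0.1, wilson)  # stays close to 0.95
```

Sweeping `p` over a fine grid at fixed `n` exposes the oscillating ("chaotic") Wald coverage the abstract refers to.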
2

Obtaining Accurate Estimates of the Mediated Effect with and without Prior Information

January 2014 (has links)
abstract: Research methods based on the frequentist philosophy use prior information in a priori power calculations and when determining the necessary sample size for the detection of an effect, but not in statistical analyses. Bayesian methods incorporate prior knowledge into the statistical analysis in the form of a prior distribution. When prior information about a relationship is available, the estimates obtained could differ drastically depending on the choice of Bayesian or frequentist method. Study 1 in this project compared the performance of five methods for obtaining interval estimates of the mediated effect in terms of coverage, Type I error rate, empirical power, interval imbalance, and interval width at N = 20, 40, 60, 100 and 500. In Study 1, Bayesian methods with informative prior distributions performed almost identically to Bayesian methods with diffuse prior distributions, and had more power than normal theory confidence limits, lower Type I error rates than the percentile bootstrap, and coverage, interval width, and imbalance comparable to normal theory, percentile bootstrap, and the bias-corrected bootstrap confidence limits. Study 2 evaluated if a Bayesian method with true parameter values as prior information outperforms the other methods. The findings indicate that with true values of parameters as the prior information, Bayesian credibility intervals with informative prior distributions have more power, less imbalance, and narrower intervals than Bayesian credibility intervals with diffuse prior distributions, normal theory, percentile bootstrap, and bias-corrected bootstrap confidence limits. Study 3 examined how much power increases when increasing the precision of the prior distribution by a factor of ten for either the action or the conceptual path in mediation analysis. 
Power generally increases with precision, but there are many sample-size and parameter-value combinations where a tenfold increase in precision does not lead to a substantial increase in power. / Dissertation/Thesis / Masters Thesis Psychology 2014
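One of the frequentist methods compared in Study 1, the percentile bootstrap for the mediated effect a*b, can be sketched as follows. The data-generating values (a = b = 0.5, n = 100) and all variable names are illustrative assumptions, not the study's design.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
a_true, b_true = 0.5, 0.5          # hypothetical path coefficients
x = rng.normal(size=n)
m = a_true * x + rng.normal(size=n)  # mediator: action path X -> M
y = b_true * m + rng.normal(size=n)  # outcome: conceptual path M -> Y

def ab_hat(x, m, y):
    # a: slope of M on X; b: partial slope of Y on M controlling for X
    a = np.linalg.lstsq(np.c_[np.ones_like(x), x], m, rcond=None)[0][1]
    b = np.linalg.lstsq(np.c_[np.ones_like(x), x, m], y, rcond=None)[0][2]
    return a * b

# resample cases with replacement and take percentiles of a*b estimates
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(ab_hat(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
```

A Bayesian credibility interval with an informative prior would replace the resampling step with posterior draws for a and b; the contrast between the two is exactly what the studies above quantify.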
3

Exploration of robust software sensor techniques with applications in vehicle positioning and bioprocess state estimation

Goffaux, Guillaume 05 February 2010 (has links)
Résumé (translated): The work carried out in this thesis concerns the development of robust state estimation methods, with two application domains in view. The first is safe positioning in transport; the objective is to provide the vehicle's position and velocity in the form of intervals with a high degree of confidence. The second is the design of software sensors for bioprocesses, in particular the reconstruction of the concentrations of reaction components from a limited number of measurements and a mathematical model describing the dynamic behavior of these components. The main objective is to design algorithms that can provide acceptable estimates despite uncertainties arising from imperfect knowledge of the system, such as uncertainties in the model parameters or in the measurements. In this context, several algorithms were studied and developed. For vehicle positioning, the research focused on robust Hinfinity methods and interval methods. Hinfinity methods are linear methods that account for modeling uncertainty and perform a min-max optimization, that is, they minimize a cost function representing the worst-case situation given the parametric uncertainties. The contribution of this work is the extension to weakly nonlinear models and the use of a sliding window to cope with asynchronous measurements. The interval methods developed aim to compute confidence corridors for the position and velocity variables, based on combining intervals obtained from the sensors on the one hand and on the joint use of a dynamic and a kinematic model of the vehicle on the other.
For software sensors for bioprocesses, three families of methods were studied: particle filtering, interval methods, and moving-horizon filtering. Particle filtering relies on Monte Carlo methods to estimate the conditional probability density of the state given the measurements; one of its main drawbacks is its sensitivity to parameter errors. The method developed applies to bioprocesses and exploits the particular structure of the models to propose a version of the particle filter that is robust to uncertainties in the kinetic parameters. Interval estimation methods are adapted to the situation where measurements are available at discrete times with a low sampling frequency, by developing appropriate predictors. Using a bundle of predictors obtained through state transformations, together with coupling between the predictors and frequent reinitializations, improves the estimation results. Finally, a method based on the moving-horizon filter is studied, performing a min-max optimization: the best initial condition is reconstructed for the worst model. Solutions are also proposed to reduce the computational load. In conclusion, the methods and results obtained constitute a set of improvements in the design of algorithms that are robust to uncertainties. Depending on the application and the objectives, one family of methods or another will be preferred. However, for the sake of robustness, it is often useful to provide estimates in the form of intervals with an associated confidence level tied to the conditions of the estimation. This is why interval methods with confidence levels are among the approaches best suited to robustness objectives, and their development will be a topic of future research.
__________________________________________ Abstract : This thesis work is about the synthesis of robust state estimation methods applied to two different domains. The first area is dedicated to safe positioning in transport. The objective is to compute the vehicle position and velocity by intervals with a high confidence level. The second area is devoted to software sensor design in bioprocess applications. The component concentrations are estimated from a limited number of measurements and a mathematical model describing the dynamical behavior of the system. The main interest is to design algorithms which achieve good estimation performance while taking into account uncertainties coming from the model parameters and the measurement errors. In this context, several algorithms have been studied and designed. Concerning vehicle positioning, the research activities have led to robust Hinfinity methods and interval estimation methods. The robust Hinfinity methods use a linear model taking model uncertainty into account and perform a min-max optimization, minimizing a cost function which describes the worst-case configuration. The contribution in this domain is an extension to some systems with a nonlinear model and the use of a receding time window to cope with asynchronous data. The developed interval algorithms compute confidence intervals of the vehicle velocity and position. They use interval combinations, by union and intersection operations, obtained from sensors along with kinematic and dynamic models. In the context of bioprocesses, three families of state estimation methods have been investigated: particle filtering, interval methods and moving-horizon filtering. The particle filtering is based on Monte-Carlo drawings to estimate the posterior probability density function of the state variables knowing the measurements. A major drawback is its sensitivity to model uncertainties.
The proposed algorithm is dedicated to bioprocess applications and takes advantage of the characteristic structure of the models to design an alternative version of the particle filter which is robust to uncertainties in the kinetic terms. Moreover, interval observers are designed in the context of bioprocesses. The objective is to extend the existing methods to discrete-time measurements by developing interval predictors. The use of a bundle of interval predictors, obtained thanks to state transformations, and the use of predictor coupling with reinitializations significantly improve the estimation performance. Finally, a moving-horizon filter is designed, based on a min-max optimization problem: the best initial conditions are generated from the model using the worst parameter configuration. Furthermore, additional solutions have been provided to reduce the computational cost. To conclude, the developed algorithms and related results can be seen as improvements in the design of estimation methods which are robust to uncertainties. According to the application and the objectives, one family of methods may be favored. However, in order to satisfy robustness criteria, an interval estimate is preferred, along with a measure of the confidence level describing the conditions of the estimation. That is why the development of confidence interval observers represents an important topic for future investigation.
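The interval-predictor idea from the bioprocess part can be illustrated on a deliberately simplified first-order decay model; everything below (rates, bounds, sampling schedule) is my own toy example, not the thesis model. Between low-frequency measurements the state bounds are propagated with the extreme values of the uncertain kinetic rate, and each measurement reinitializes them by intersection.

```python
import math

k_lo, k_hi = 0.08, 0.12      # assumed bounds on the uncertain kinetic rate
true_k = 0.10                # "real" rate, unknown to the estimator
dt = 0.5
x_true = 1.0                 # true concentration
x_min, x_max = 0.8, 1.2      # guaranteed initial bounds on the state
meas_err = 0.05              # bounded measurement error

for step in range(1, 41):
    # Predictor: exponential decay is monotone in k, so the bounds are
    # propagated with the extreme rates (fastest decay -> lower bound).
    x_min *= math.exp(-k_hi * dt)
    x_max *= math.exp(-k_lo * dt)
    x_true *= math.exp(-true_k * dt)
    if step % 10 == 0:
        # Low-frequency discrete-time measurement: reinitialize the bounds
        # by intersecting them with the measurement's own interval.
        y = x_true  # noise-free draw for simplicity; error bound still used
        x_min = max(x_min, y - meas_err)
        x_max = min(x_max, y + meas_err)
```

Without the reinitializations, the `[x_min, x_max]` corridor keeps widening relative to the true trajectory; the intersections are what keep it tight, which is the role the coupling and reinitialization play in the thesis's predictors.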
4

Confidence Intervals for Mean Absolute Deviations

Bonett, Douglas G., Seier, Edith 01 January 2003 (has links)
The mean absolute deviation is a simple and informative measure of variability. Approximate confidence intervals for mean absolute deviations in one-group and two-group designs are derived and are shown to have excellent small-sample properties under moderate nonnormality. Sample size planning formulas are derived.
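The paper derives analytic approximate intervals; as a rough numerical stand-in (not the paper's method), a percentile-bootstrap interval for the one-group mean absolute deviation can be computed like this, with invented data:

```python
import random
import statistics

def mean_abs_dev(xs):
    # mean absolute deviation about the sample mean
    m = statistics.fmean(xs)
    return statistics.fmean(abs(x - m) for x in xs)

random.seed(0)
data = [random.gauss(10, 2) for _ in range(80)]  # hypothetical sample

# percentile bootstrap: resample with replacement, recompute the MAD
boot = []
for _ in range(2000):
    sample = random.choices(data, k=len(data))
    boot.append(mean_abs_dev(sample))
boot.sort()
lo, hi = boot[49], boot[1949]  # approx. 2.5% and 97.5% percentiles
```

For normal data the population mean absolute deviation is sigma * sqrt(2/pi), about 1.60 here, which the interval should bracket.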
5

Numerická studie simultánních rovnic / Numerical study on simultaneous equations

Šaroch, Vojtěch January 2014 (has links)
Title: Numerical study on simultaneous equations Author: Vojtěch Šaroch Department: Department of Probability and Mathematical Statistics Supervisor: doc. RNDr. Petr Lachout, CSc. Abstract: In this thesis we deal with the simultaneous equation model. In the first chapter we introduce the theoretical aspects of this problem, especially estimation procedures and their properties. We mention the issues of identification and the inconsistency of OLS estimates in simultaneous modeling. In the second chapter we introduce the theory of estimation, focusing especially on interval estimation and precision. We also mention an empirical approach. In the third chapter we perform a numerical study on a simple macroeconomic model with generated data. We are interested in the properties of interval estimates of the parameters, the convergence rate, the difference between the empirical and theoretical estimation, etc. Keywords: simultaneous equations model, interval estimation, empirical estimation
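The OLS inconsistency mentioned in the first chapter can be demonstrated on a minimal Keynesian-style simultaneous model (my own example, not necessarily the thesis's macroeconomic model): because income Y = C + I feeds consumption back into itself, the regressor Y is correlated with the structural error, while using exogenous investment as an instrument recovers the true slope.

```python
import numpy as np

rng = np.random.default_rng(0)
n, b = 5000, 0.6               # b: true marginal propensity to consume
inv = rng.normal(10, 2, n)     # exogenous investment (the instrument)
u = rng.normal(0, 1, n)        # structural shock in the consumption equation
y = (inv + u) / (1 - b)        # reduced form of the identity Y = C + I
c = b * y + u                  # consumption equation C = b*Y + u

# OLS slope of C on Y is inconsistent: Y is correlated with u
ols = np.cov(c, y)[0, 1] / np.var(y, ddof=1)

# instrumental-variable slope: cov(C, I) / cov(Y, I) is consistent for b
iv = np.cov(c, inv)[0, 1] / np.cov(y, inv)[0, 1]
```

With these values the OLS slope converges to roughly b + 0.08 rather than b, while the IV estimate centers on b; repeating the simulation many times gives the empirical interval estimates the thesis studies.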
6

Numerická studie simultánních rovnic / Numerical study on simultaneous equations

Šaroch, Vojtěch January 2014 (has links)
Title: Numerical study on simultaneous equations Author: Vojtěch Šaroch Department: Department of Probability and Mathematical Statistics Supervisor: doc. RNDr. Petr Lachout, CSc. Abstract: In this thesis we deal with the simultaneous equation model. In the first chapter we introduce the theoretical aspects of this problem, especially estimation procedures and their properties. We mention the issues of identification and the inconsistency of OLS estimates in simultaneous modeling. In the second chapter we introduce the theory of estimation, focusing especially on interval estimation and precision. We also mention an empirical approach. In the third chapter we perform a numerical study on a simple macroeconomic model with generated data. We are interested in the properties of interval estimates of the parameters, the convergence rate, the difference between the empirical and theoretical estimation, etc. Keywords: simultaneous equations model, interval estimation, empirical estimation
7

Learning Curves in Emergency Ultrasonography

Brady, Kaitlyn 29 December 2012 (has links)
This project utilized generalized estimating equations and general linear modeling to model learning curves for sonographer performance in emergency ultrasonography. Performance was measured in two ways: image quality (interpretable vs. possible hindrance in interpretation) and agreement of findings between the sonographer and an expert reviewing sonographer. Records from 109 sonographers were split into two data sets, training (n=50) and testing (n=59), to conduct exploratory analysis and fit the final models for analysis, respectively. We determined that the number of scans of a particular exam type required for a sonographer to obtain quality images on that exam type with a predicted probability of 0.9 is highly dependent upon the person conducting the review, the indication of the scan (educational or medical), and the outcome of the scan (whether there is a pathology-positive finding). Constructing family-wise 95% confidence intervals for each exam type demonstrated a large amount of variation in the number of scans required, both between exam types and within exam types. It was determined that a sonographer's experience with a particular exam type is not a significant predictor of future agreement on that exam type, and thus no estimates were made based on the agreement learning curves. In addition, we concluded based on a type III analysis that, when already considering exam-type-related experience, the consideration of experience on other exam types does not significantly impact the learning curve for quality. However, the learning curve for agreement is significantly impacted by the additional consideration of experience on other exam types.
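The "predicted probability of 0.9" question above amounts to inverting a fitted learning curve. As a hedged sketch, using a plain logistic curve (ignoring the GEE correlation structure the project actually used) and invented coefficients:

```python
import math

# hypothetical fitted intercept and per-scan slope, NOT the study's estimates
beta0, beta1 = -1.2, 0.15

def p_quality(n_scans):
    # logistic learning curve: probability of an interpretable image
    return 1 / (1 + math.exp(-(beta0 + beta1 * n_scans)))

# invert the logit: p = 0.9  =>  log(0.9/0.1) = beta0 + beta1 * n
n_req = (math.log(0.9 / 0.1) - beta0) / beta1
```

In the study this required scan count shifts with reviewer, indication, and outcome, which in a logistic formulation corresponds to those covariates moving `beta0` (and hence `n_req`) up or down.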
8

The Development and Evaluation of a Model of Time-of-arrival Uncertainty

Hooey, Becky 13 April 2010 (has links)
Uncertainty is inherent in complex socio-technical systems such as in aviation, military, and surface transportation domains. An improved understanding of how operators comprehend this uncertainty is critical to the development of operations and technology. Towards the development of a model of time of arrival (TOA) uncertainty, Experiment 1 was conducted to determine how air traffic controllers estimate TOA uncertainty and to identify sources of TOA uncertainty. The resulting model proposed that operators first develop a library of speed and TOA profiles through experience. As they encounter subsequent aircraft, they compare each vehicle’s speed profile to their personal library and apply the associated estimate of TOA uncertainty. To test this model, a normative model was adopted to compare inferences made by human observers to the corresponding inferences that would be made by an optimal observer who had knowledge of the underlying distribution. An experimental platform was developed and implemented in which subjects observed vehicles with variable speeds and then estimated the mean and interval that captured 95% of the speeds and TOAs. Experiments 2 and 3 were then conducted and revealed that subjects overestimated TOA intervals for fast stimuli and underestimated TOA intervals for slow stimuli, particularly when speed variability was high. Subjects underestimated the amount of positive skew of the TOA distribution, particularly in slow/high variability conditions. Experiment 3 also demonstrated that subjects overestimated TOA uncertainty for short distances and underestimated TOA uncertainty for long distances. It was shown that subjects applied a representative heuristic by selecting the trained speed profile that was most similar to the observed vehicle’s profile, and applying the TOA uncertainty estimate of that trained profile. 
Multiple regression analyses revealed that the task of TOA uncertainty estimation contributed the most to TOA uncertainty estimation error as compared to the tasks of building accurate speed models and identifying the appropriate speed model to apply to a stimulus. Two systematic biases that account for the observed TOA uncertainty estimation errors were revealed: Assumption of symmetry and aversion to extremes. Operational implications in terms of safety and efficiency for the aviation domain are discussed.
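The positive skew of the TOA distribution that subjects underestimated falls directly out of the arithmetic: with distance fixed and speed roughly normal, TOA = distance / speed has a longer right tail, so a symmetric interval misstates the true 2.5% and 97.5% quantiles. A toy simulation with invented numbers:

```python
import random
import statistics

random.seed(42)
d = 1000.0  # fixed distance, metres (illustrative value)

# roughly normal speeds; guard against near-zero draws before dividing
speeds = [random.gauss(20, 4) for _ in range(100_000)]
toas = sorted(d / v for v in speeds if v > 1)

lo = toas[int(0.025 * len(toas))]   # empirical 2.5% quantile of TOA
hi = toas[int(0.975 * len(toas))]   # empirical 97.5% quantile of TOA
mean_toa = statistics.fmean(toas)
# positive skew: the right tail (hi - mean) is longer than the left
```

Assuming a symmetric interval around the mean, as the "assumption of symmetry" bias implies, would place both endpoints at equal distance and therefore clip the long slow-arrival tail.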
10

Addressing Pre-Service Teachers' Misconceptions About Confidence Intervals

Eliason, Kiya Lynn 01 June 2018 (has links)
Increased attention to statistical concepts has been a prevalent trend in revised mathematics curricula across grade levels. However, the preparation of secondary school mathematics educators has not received similar attention, and the learning opportunities provided to these educators are often insufficient for teaching statistics well. The purpose of this study is to analyze pre-service teachers' conceptions about confidence intervals. This research inquired about statistical reasoning from the perspective of students majoring in mathematics education who were enrolled in an undergraduate statistics education course and had previously completed an introductory course in statistics. We found common misconceptions among the pre-service teachers participating in this study. An unanticipated finding is that all of the pre-service teachers believed that the construction of a confidence interval relies on a sampling distribution that does not contain every possible sample. Instead, they believed it was necessary to take multiple samples and build a distribution of their means. I called this distribution the Multi-Sample Distribution (MSD).
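The misconception can be confronted with a small simulation: each interval below is built from a single sample, with no distribution of many sample means ever constructed, yet close to 95% of such intervals cover the true mean (values and names here are my own illustration).

```python
import random
import statistics

random.seed(7)
mu, sigma, n = 50.0, 10.0, 40   # hypothetical population and sample size
trials = 2000
covered = 0
for _ in range(trials):
    # ONE sample per interval: no "multi-sample distribution" is built
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.fmean(sample)
    se = statistics.stdev(sample) / n**0.5
    if xbar - 1.96 * se <= mu <= xbar + 1.96 * se:
        covered += 1
rate = covered / trials  # long-run coverage, close to the nominal 0.95
```

The repetition over trials is only there to estimate the coverage rate; each individual interval is computed exactly the way a student would compute it from a single data set.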
