About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

On maximum likelihood estimation and its relevance to time delay estimation

Kuo, Jen-Wei January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
2

Exploring complex loss functions for point estimation

Chaisee, Kuntalee January 2015 (has links)
This thesis presents several aspects of simulation-based point estimation in the context of Bayesian decision theory. The first part of the thesis (Chapters 4-5) concerns the estimation-then-minimisation (ETM) method as an efficient computational approach to simulation-based Bayes estimates. We are interested in applying the ETM method to compute Bayes estimates under some non-standard loss functions; however, for some loss functions the ETM method cannot be implemented straightforwardly. We examine the ETM method via Taylor approximations and cubic spline interpolations for Bayes estimates in one dimension, and in two dimensions we implement it via bicubic interpolation. The second part of the thesis (Chapter 6) concentrates on the analysis of a mixture posterior distribution with a known number of components using Markov chain Monte Carlo (MCMC) output. We aim for Bayesian point estimation under a label-invariant loss function, which allows us to estimate the parameters of the mixture posterior distribution without dealing with label switching. We also investigate the uncertainty of the point estimates, presented via the uncertainty bound and the crude uncertainty bound of the expected loss evaluated at the point estimates based on MCMC samples. The crude uncertainty bound is relatively cheap to compute but appears unreliable; the uncertainty bound, which approximates a 95% confidence interval, appears reliable but is computationally expensive. In the third part of the thesis (Chapter 7), we propose a possible alternative way to present the uncertainty of Bayesian point estimates. We adopt the leave-one-out idea of the jackknife method to compute jackknife-Bayes estimates, which we then use to visualise the uncertainty of the Bayes estimates. Further investigation is required to improve the method, and some suggestions are made to maximise the efficiency of this approach.
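The estimation-then-minimisation idea is easy to sketch: use MCMC draws to form a Monte Carlo estimate of the posterior expected loss, then minimise that estimated surface. The sketch below uses a hypothetical asymmetric loss, synthetic "posterior" draws and a plain grid search in place of the thesis's Taylor/spline machinery:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.normal(loc=2.0, scale=0.5, size=5000)  # stand-in posterior draws

def loss(d, t):
    # hypothetical non-standard loss: overestimation penalised 1.5x
    e = d - t
    return np.where(e > 0, 1.5 * e**2, e**2)

# Step 1 (estimation): Monte Carlo estimate of the expected loss on a grid
grid = np.linspace(0.0, 4.0, 401)
est = np.array([loss(d, theta).mean() for d in grid])

# Step 2 (minimisation): minimise the estimated expected-loss surface
bayes_est = grid[est.argmin()]
```

Because overestimation is penalised more heavily here, the resulting Bayes estimate sits slightly below the posterior mean; the spline and interpolation variants studied in the thesis refine exactly this estimation step.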
3

Estimation theory of selective reporting, ordered parameters, and selection

Dhariyal, Ishwari Dutt January 1977 (has links)
No description available.
4

Reconceptualizing divided government

Svensen, Eric Paul 02 July 2014 (has links)
In this dissertation, I explain why scholars are unable to conclusively find evidence that divided government is the main determinant of legislative gridlock. I argue this unsettled debate is largely attributable to an imprecise conceptual view of inter-branch tensions, and that these conceptual limitations are exacerbated by unrefined measurement practices. I argue refined measures such as party polarization and gridlock intervals better explain institutional behavior than divided government. Using unique datasets estimating legislator preferences on domestic and foreign policy, findings show that when compared to more refined measures, split-party government is not the sole or even the most important source of partisan conflict. In addition, compared to other studies on divided government, I argue the reason the distinction between unified and divided government is often blurred is that a number of underlying political and institutional pressures make sweeping policy change difficult even for most unified governments. These factors contribute to the public’s growing dissatisfaction with government’s inability to solve many economic and social problems. / text
5

Estimation methods in adaptive treatment-selection designs

Pickard, Michael 08 April 2016 (has links)
Adaptive designs can improve the efficiency of drug development, but further research is needed before some are more widely implemented. One such design is a treatment-selection design, which begins with k treatment arms, but only a subset is carried forward after an interim analysis. The final analysis of the selected arm(s) is then performed using the data from both stages of the study. One issue with this design is ensuring the Type I error rate is controlled, but there have been a number of proposals that largely address this. A second drawback that has not yet been fully addressed is that the maximum likelihood estimate of the selected arm at the final analysis is often biased upward due to the selection method. Unbiased estimators already exist for this design, but methods with an acceptable balance between bias and mean squared error (MSE) are lacking. In this dissertation, two estimation approaches are proposed. The first is a parametric bootstrap resampling method in which the level of bias adjustment applied is driven by a comparison of the observed results to those expected when all arms have equal true means. The second approach is an empirical Bayes estimator that implements a novel limited translation function. These methods are compared to previously proposed approaches with respect to bias and MSE for studies that have either a normal or binomial endpoint. Both proposed methods are shown to exhibit reduced bias with reasonable MSE in some simulated scenarios, but the resampling method consistently shows similar, or improved, performance compared to previous approaches across the examined scenarios. The utility of this resampling method is further demonstrated by showing that it can be implemented when the arm with the second largest mean is selected for stage 2. 
It is also shown that the resampling method can be extended to settings in which more than one arm is selected in stage 1, a futility analysis is included, or the study has a time-to-event endpoint. Recommendations on confidence intervals are also provided. The results demonstrate that the parametric bootstrap resampling method is a viable estimation approach for treatment-selection designs.
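The selection bias and the parametric-bootstrap correction can be sketched in a few lines. The toy below uses a normal endpoint with four equal-mean arms; the comparison-driven choice of adjustment level described in the dissertation is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
k, n, sigma = 4, 50, 1.0               # hypothetical: 4 arms, n = 50 per arm
true_means = np.full(k, 0.2)           # equal-effects scenario

# Stage 1: observe arm means and carry forward the best-looking arm;
# taking the maximum is what biases the naive MLE upward
xbar = rng.normal(true_means, sigma / np.sqrt(n))
naive = xbar.max()

# Parametric bootstrap: replay the selection under the fitted model and
# subtract the average bias that selection induces
B = 2000
boot = rng.normal(xbar, sigma / np.sqrt(n), size=(B, k))
bias_hat = boot.max(axis=1).mean() - xbar.max()
adjusted = naive - bias_hat
```

Under equal true means the estimated bias is positive, so the adjusted estimate is pulled back below the naive maximum.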
6

Single and Twin-Heaps as Natural Data Structures for Percentile Point Simulation Algorithms

Hatzinger, Reinhold, Panny, Wolfgang January 1993 (has links) (PDF)
Sometimes percentile points cannot be determined analytically. In such cases one has to resort to Monte Carlo techniques. In order to provide reliable and accurate results it is usually necessary to generate rather large samples. Thus the proper organization of the relevant data is of crucial importance. In this paper we investigate the appropriateness of heap-based data structures for the percentile point estimation problem. Theoretical considerations and empirical results give evidence of the good performance of these structures regarding their time and space complexity. (author's abstract) / Series: Forschungsberichte / Institut für Statistik
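The basic idea — keep only the m smallest of N simulated values in a heap so that the m-th order statistic, i.e. the percentile point, is always at the root — can be sketched with Python's `heapq` (a single-heap sketch only; the paper's twin-heap variant and its complexity analysis are not reproduced):

```python
import heapq
import random

random.seed(42)
N, p = 100_000, 0.05           # estimate the 5th percentile of N draws
m = max(1, int(p * N))         # rank of the percentile point

heap = []                      # max-heap (via negation) of the m smallest values
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    if len(heap) < m:
        heapq.heappush(heap, -x)
    elif x < -heap[0]:
        heapq.heapreplace(heap, -x)   # evict the largest of the m smallest

percentile_point = -heap[0]    # empirical 5th percentile of the sample
```

Each simulated value costs O(log m), so the percentile point of a very large Monte Carlo sample is maintained in O(N log m) time and only O(m) space.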
7

Inferential Methods for the Tetrachoric Correlation Coefficient

Bonett, Douglas G., Price, Robert M. 01 January 2005 (has links)
The tetrachoric correlation describes the linear relation between two continuous variables that have each been measured on a dichotomous scale. The treatment of the point estimate, standard error, interval estimate, and sample size requirement for the tetrachoric correlation is cursory and incomplete in modern psychometric and behavioral statistics texts. A new and simple method of accurately approximating the tetrachoric correlation is introduced. The tetrachoric approximation is then used to derive a simple standard error, confidence interval, and sample size planning formula. The new confidence interval is shown to perform far better than the confidence interval computed by SAS. A method to improve the SAS confidence interval is proposed. All of the new results are computationally simple and are ideally suited for textbook and classroom presentations.
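For orientation, the classical cosine ("cos-pi") approximation to the tetrachoric correlation from a 2x2 table is the kind of simple approximation this line of work builds on; the paper's own refined approximation and its standard error, interval and sample size formulas are not reproduced here:

```python
import math

def tetrachoric_cos_pi(a, b, c, d):
    """Classical cos-pi approximation to the tetrachoric correlation
    for a 2x2 table [[a, b], [c, d]], based on the odds ratio."""
    odds_ratio = (a * d) / (b * c)
    return math.cos(math.pi / (1.0 + math.sqrt(odds_ratio)))

r = tetrachoric_cos_pi(40, 10, 10, 40)  # strong positive association
```

Under independence the odds ratio is 1 and the approximation returns cos(pi/2) = 0, as it should.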
8

A differential equation for a class of discrete lifetime distributions with an application in reliability: A demonstration of the utility of computer algebra

Csenki, Attila 13 October 2013 (has links)
It is shown that the probability generating function of a lifetime random variable T on a finite lattice with polynomial failure rate satisfies a certain differential equation. The interrelationship with Markov chain theory is highlighted. The differential equation gives rise to a system of differential equations which, when inverted, can be used in the limit to express the polynomial coefficients in terms of the factorial moments of T. This can then be used to estimate the polynomial coefficients. Some special cases are worked through symbolically using computer algebra. A simulation study is used to validate the approach and to explore its potential in the reliability context.
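The link the abstract relies on — the r-th factorial moment of T equals the r-th derivative of the probability generating function at s = 1 — is easy to evaluate directly for a lifetime on a finite lattice. The pmf below is a made-up example; the paper's differential-equation machinery is not reproduced:

```python
import math

# Hypothetical lifetime pmf on the finite lattice {0, 1, ..., 5}
pmf = {0: 0.05, 1: 0.10, 2: 0.20, 3: 0.30, 4: 0.25, 5: 0.10}

def factorial_moment(r):
    # E[T (T-1) ... (T-r+1)], i.e. the r-th derivative of the pgf at s = 1
    return sum(prob * math.prod(t - j for j in range(r))
               for t, prob in pmf.items())

mean = factorial_moment(1)   # first factorial moment = E[T]
```

Given estimates of these factorial moments from data, the inverted system in the paper recovers the polynomial failure-rate coefficients.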
9

Harnessing the Power of Self-Training for Gaze Point Estimation in Dual Camera Transportation Datasets

Bhagat, Hirva Alpesh 14 June 2023 (has links)
This thesis proposes a novel approach for efficiently estimating gaze points in dual camera transportation datasets. Traditional methods for gaze point estimation are dependent on large amounts of labeled data, which can be both expensive and time-consuming to collect. Additionally, alignment and calibration of the two camera views present significant challenges. To overcome these limitations, this thesis investigates the use of self-learning techniques such as semi-supervised learning and self-training, which can reduce the need for labeled data while maintaining high accuracy. The proposed method is evaluated on the DGAZE dataset and achieves a 57.2% improvement in performance compared to previous methods. This approach can prove to be a valuable tool for studying visual attention in transportation research, leading to more cost-effective and efficient research in this field. / Master of Science / This thesis presents a new method for efficiently estimating the gaze point of drivers while driving, which is crucial for understanding driver behavior and improving transportation safety. Traditional methods require a lot of labeled data, which can be time-consuming and expensive to obtain. This thesis proposes a self-learning approach that can learn from both labeled and unlabeled data, reducing the need for labeled data while maintaining high accuracy. By training the model on labeled data and using its own estimations on unlabeled data to improve its performance, the proposed approach can adapt to new scenarios and improve its accuracy over time. The proposed method is evaluated on the DGAZE dataset and achieves a 57.2% improvement in performance compared to previous methods. Overall, this approach offers a more efficient and cost-effective solution that can potentially help improve transportation safety by providing a better understanding of driver behavior.
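The self-training loop itself is simple to sketch: fit on the labeled data, pseudo-label the unlabeled pool with the current model, and refit on the union with down-weighted pseudo-labels. The one-parameter regression below is a toy stand-in for the gaze model; the DGAZE pipeline and the dual-camera alignment are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy 1-D "gaze" regression: a few labeled pairs, many unlabeled inputs
x_lab = rng.uniform(0, 1, 20)
y_lab = 3.0 * x_lab + rng.normal(0, 0.05, 20)
x_unl = rng.uniform(0, 1, 500)

def fit(x, y, w):
    # weighted least squares for y ~ slope * x (stand-in for the gaze model)
    return (w * x * y).sum() / (w * x * x).sum()

slope = fit(x_lab, y_lab, np.ones_like(x_lab))
for _ in range(3):
    # self-training round: pseudo-label the unlabeled pool, then refit
    pseudo = slope * x_unl
    x_all = np.concatenate([x_lab, x_unl])
    y_all = np.concatenate([y_lab, pseudo])
    w = np.concatenate([np.ones_like(x_lab), 0.5 * np.ones_like(x_unl)])
    slope = fit(x_all, y_all, w)
```

Down-weighting the pseudo-labeled points (here by an assumed factor 0.5) is one common way to keep early model errors from dominating the refit.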
10

Dynamic HIV/AIDS parameter estimation with Applications

Filter, Ruben Arnold 13 June 2005 (has links)
This dissertation is primarily concerned with dynamic HIV/AIDS parameter estimation, set against the background of engineering, biology and medical science. The marriage of these seemingly divergent fields creates a dynamic research environment that is the source of many novel results and practical applications for people living with HIV/AIDS. A method is presented to extract model parameters for the three-dimensional HIV/AIDS model in situations where an orthodox LSQ method would fail. This method allows information from outside the dataset to be added to the cost functional so that parameters can be estimated even from sparse data. Estimates in the literature were for at most two parameters per dataset, whereas the procedures described herein can estimate all six parameters. A standard table for data acquisition in hospitals and clinics is analyzed to show that the table would contain enough information to extract a suitable parameter estimate for the model. Comparison with a published experiment validates the method, and shows that it becomes increasingly hard to coordinate assumptions and implicit information when analyzing real data. Parameter variations during the course of HIV/AIDS are not well understood. The results show that parameters vary over time. The analysis of parameter variation is augmented with a novel two-stage approach to model identification for the six-dimensional model. In this context, the higher-dimensional models allow an explanation for the onset of AIDS from HIV without any variation in the model parameters. The developed estimation procedure was successfully used to analyze the data from forty-four patients from southern Africa in the HIVNET 28 vaccine readiness trial. The results are important as a benchmark for the study of vaccination.
The results show that after approximately 17 months from seroconversion, oscillations in viremia flattened to a log10-based median set point of 4.08, appearing no different from reported studies in subtype B HIV-1 infected male cohorts. Together with these main outcomes, an analysis of confidence intervals for the set point, days to set point and the individual parameters is presented. When estimates for the HIVNET 28 cohort are combined, the data allow a meaningful first estimate of the parameters of the three-dimensional HIV/AIDS model for patients from southern Africa. The theoretical basis is used to develop an application that allows medical practitioners to estimate the three-dimensional model parameters for HIV/AIDS patients. The program demands little background knowledge from the user, but for practitioners with experience in mathematical modeling, there is ample opportunity to fine-tune the procedures for special needs. / Dissertation (MEng)--University of Pretoria, 2006. / Electrical, Electronic and Computer Engineering / Unrestricted
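The "three-dimensional HIV/AIDS model" referred to here is commonly written with target cells T, infected cells I and free virus V as dT/dt = s - dT - bTV, dI/dt = bTV - delta I, dV/dt = pI - cV. A forward-Euler sketch with illustrative (not fitted) parameter values:

```python
# Standard three-compartment HIV model: target cells T, infected cells I,
# free virus V. Parameter values are illustrative only, not fitted estimates.
s, d = 1e4, 0.01          # target-cell production and natural death
beta = 5e-7               # infection rate
delta = 0.5               # infected-cell death rate
p, c = 300.0, 3.0         # virion production and clearance

def simulate(days, dt=0.01, T=1e6, I=0.0, V=1e-3):
    """Forward-Euler integration; returns (T, I, V) after `days` days."""
    for _ in range(int(days / dt)):
        dT = s - d * T - beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
    return T, I, V
```

With V(0) = 0 the system sits at the uninfected steady state T = s/d; with a small inoculum the viral load grows toward a set point, the quantity whose log10 median the abstract reports. Parameter estimation then amounts to fitting s, d, beta, delta, p and c to measured T and V trajectories.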
