About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

A comparison of estimators in hierarchical linear modeling: restricted maximum likelihood versus bootstrap via minimum norm quadratic unbiased estimators

Delpish, Ayesha Nneka. Niu, Xu-Feng. January 2006 (has links)
Thesis (Ph. D.)--Florida State University, 2006. / Advisor: Xu-Feng Niu, Florida State University, College of Arts and Sciences, Dept. of Statistics. Title and description from dissertation home page (viewed Sept. 18, 2006). Document formatted into pages; contains ix, 116 pages. Includes bibliographical references.
52

Estimating the slope in the simple linear errors-in-variables model

Musekiwa, Alfred. 13 August 2012 (has links)
M.Sc. / In this study we consider the problem of estimating the slope in the simple linear errors-in-variables model. There are two different types of relationship that can be specified in the errors-in-variables model: one that specifies a functional linear relationship and one describing a structural linear relationship. The different relationship specifications can lead to different estimators with different properties. These two specifications are highlighted in this study. A least squares solution to the estimation of the slope is given. The problem of finding the maximum likelihood solution under these two specifications is addressed. It is noted that an unidentifiability problem arises in this attempt; the solution is seen to lie in making assumptions about the error variances. Interval estimation for the slope parameter is discussed. It is noted that any interval estimator of the slope whose length is always finite will have a confidence coefficient of zero. Various interval estimation methods are reviewed, but emphasis is mainly on the investigation of a bootstrap procedure for estimating the confidence interval for the slope parameter β. More specifically, the Linder and Babu (1994) bootstrap method for the structural relationship model with known variance ratio is investigated here. The error distributions were assumed normal. A simulation study based on that paper is carried out. The results of the simulation study show that this bootstrap procedure performs well in comparison with the normal-theory estimates for normally distributed data; that is, it has better coverage accuracy than the normal approximation.
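The closed-form slope estimate under a known error-variance ratio, and the resampling loop around it, can be sketched briefly. The sketch below is illustrative only: it uses a plain pairs bootstrap with a percentile interval rather than the residual-based Linder and Babu (1994) scheme the thesis investigates, and the data and parameter values are invented.

```python
import numpy as np

def eiv_slope(x, y, lam=1.0):
    """Slope in the structural errors-in-variables model when the
    error-variance ratio lam = var(eps) / var(delta) is known
    (the classical method-of-moments / Deming solution)."""
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y)[0, 1]
    d = syy - lam * sxx
    return (d + np.sqrt(d**2 + 4.0 * lam * sxy**2)) / (2.0 * sxy)

def pairs_bootstrap_ci(x, y, lam=1.0, B=2000, alpha=0.05, seed=None):
    """Percentile bootstrap CI for the slope via resampling (x, y) pairs
    (a simpler stand-in for the Linder-Babu residual bootstrap)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    reps = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, size=n)   # resample cases with replacement
        reps[b] = eiv_slope(x[idx], y[idx], lam)
    return tuple(np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

# Toy data: true slope 2, normal error in both the regressor and the response.
rng = np.random.default_rng(0)
xi = rng.normal(0.0, 2.0, size=200)            # latent regressor
x = xi + rng.normal(0.0, 1.0, size=200)        # observed with error
y = 1.0 + 2.0 * xi + rng.normal(0.0, 1.0, size=200)
print(eiv_slope(x, y), pairs_bootstrap_ci(x, y, seed=1))
```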
53

Bootstrap standard error and confidence intervals for the correlation corrected for indirect range restriction: a Monte Carlo study

January 2006 (has links)
Li Johnson Ching Hong. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2006. / Includes bibliographical references (leaves 40-42). / Abstracts in English and Chinese. / ACKNOWLEDGEMENT --- p.2 / ABSTRACT --- p.3 / CHINESE ABSTRACT --- p.4 / TABLE OF CONTENTS --- p.5 / CHAPTER 1 --- INTRODUCTION --- p.7 / Thorndike's Three Formulae to Correct Correlation for Range Restriction --- p.8 / Significance of Case 3 --- p.9 / Importance of Standard Error and Confidence Intervals --- p.10 / Research Gap in the Estimation of Standard Error of rc and the Construction of the Confidence Intervals for ρxy --- p.10 / Objectives of the Present Study --- p.12 / CHAPTER 2 --- BOOTSTRAP METHOD --- p.13 / Different Confidence Intervals Constructed for the Present Study --- p.14 / CHAPTER 3 --- A PROPOSED PROCEDURE FOR THE ESTIMATION OF STANDARD ERROR OF rc AND THE CONSTRUCTION OF CONFIDENCE INTERVALS --- p.16 / CHAPTER 4 --- METHODS --- p.20 / Model Specifications --- p.20 / Procedure --- p.21 / CHAPTER 5 --- ASSESSMENT --- p.23 / CHAPTER 6 --- RESULTS --- p.25 / Accuracy of Average Correlation Corrected for IRR (rc) --- p.25 / Empirical Standard Deviation (SDE) of rc --- p.29 / Accuracy of Standard Error Estimate --- p.29 / Accuracy of Confidence Intervals --- p.33 / CHAPTER 7 --- DISCUSSION AND CONCLUSION --- p.36 / CHAPTER 8 --- LIMITATIONS AND FURTHER DIRECTIONS --- p.38 / REFERENCES --- p.40 / APPENDIX A --- p.43 / FIGURE CAPTION --- p.53 / LIST OF TABLES --- p.55
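Chapter 3's proposed procedure pairs Thorndike's Case III correction with a bootstrap over the restricted sample. A minimal sketch of that pairing follows; the Case III formula is given in one of its common statements, and the selection mechanism and all numbers below are invented for illustration.

```python
import numpy as np

def case3_correction(r_xy, r_xz, r_yz, u):
    """Thorndike Case III correction for indirect range restriction,
    where u = S_Z / s_z is the unrestricted-to-restricted SD ratio of
    the selection variable Z (one common statement of the formula)."""
    k = u**2 - 1.0
    num = r_xy + r_xz * r_yz * k
    den = np.sqrt((1.0 + r_xz**2 * k) * (1.0 + r_yz**2 * k))
    return num / den

def bootstrap_se_ci(data, u, B=2000, alpha=0.05, seed=None):
    """Bootstrap SE and percentile CI for the corrected correlation;
    `data` is an (n, 3) array of (X, Y, Z) from the restricted sample."""
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    reps = np.empty(B)
    for b in range(B):
        s = data[rng.integers(0, n, size=n)]   # resample cases
        c = np.corrcoef(s, rowvar=False)       # 3 x 3 correlation matrix
        reps[b] = case3_correction(c[0, 1], c[0, 2], c[1, 2], u)
    se = reps.std(ddof=1)
    ci = tuple(np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)]))
    return se, ci

# Invented example: selection on Z (keep only z > 0) indirectly restricts X and Y.
rng = np.random.default_rng(1)
z = rng.normal(size=500)
x = 0.6 * z + 0.8 * rng.normal(size=500)
y = 0.5 * z + 0.9 * rng.normal(size=500)
keep = z > 0
data = np.column_stack([x, y, z])[keep]
u = z.std(ddof=1) / z[keep].std(ddof=1)        # unrestricted over restricted SD
print(bootstrap_se_ci(data, u, seed=2))
```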
54

Threshold autoregressive model with multiple threshold variables.

January 2005 (has links)
Chen Haiqiang. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2005. / Includes bibliographical references (leaves 33-35). / Abstracts in English and Chinese. / Chapter 1. --- Introduction --- p.1 / Chapter 2. --- The Model --- p.4 / Chapter 3. --- Least Squares Estimations --- p.6 / Chapter 4. --- Inference --- p.7 / Chapter 4.1 --- Asymptotic Joint Distribution of the Threshold Estimators --- p.7 / Chapter 4.2 --- Testing Threshold Effect: Model Selection Followed by Testing --- p.13 / Chapter 5. --- Modeling --- p.16 / Chapter 5.1 --- Generic Consistency of the Threshold Estimators under Specification Errors --- p.17 / Chapter 5.2 --- Modeling Procedure --- p.20 / Chapter 6. --- Monte Carlo Simulations --- p.21 / Chapter 7. --- Empirical Application in the Financial Market --- p.24 / Chapter 7.1 --- Data Description --- p.26 / Chapter 7.2 --- Estimated Results --- p.26 / Chapter 8. --- Conclusion --- p.30 / References --- p.33 / Appendix 1: Proof of Theorem 1 --- p.36 / Appendix 2: Proof of Theorem 2 --- p.39 / Appendix 3: Proof of Theorem 3 --- p.43 / List of Graphs --- p.49
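The least squares estimation outlined in Chapters 3 and 5 amounts to a grid search over candidate threshold pairs. The sketch below shows one plausible two-regime specification in which the regime is the joint event {q1 <= g1, q2 <= g2}; the exact model in the thesis may differ, and the data, grid, and trimming choices here are illustrative.

```python
import numpy as np

def fit_tar2(y, x, q1, q2, trim=0.15, gridsize=30):
    """Least-squares grid search for a two-regime threshold model whose
    regime indicator is 1{q1 <= g1, q2 <= g2}. A real implementation
    would also enforce a minimum number of observations per regime."""
    grid1 = np.quantile(q1, np.linspace(trim, 1 - trim, gridsize))
    grid2 = np.quantile(q2, np.linspace(trim, 1 - trim, gridsize))
    best_ssr, best = np.inf, None
    for g1 in grid1:
        for g2 in grid2:
            d = ((q1 <= g1) & (q2 <= g2)).astype(float)
            # Stacked design = separate regressions within each regime
            X = np.column_stack([x * d[:, None], x * (1 - d)[:, None]])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            ssr = np.sum((y - X @ beta) ** 2)
            if ssr < best_ssr:
                best_ssr, best = ssr, (g1, g2, beta)
    return best

# Toy data: the AR coefficient switches when both threshold variables are low.
rng = np.random.default_rng(2)
n = 400
q1, q2 = rng.normal(size=n), rng.normal(size=n)
ylag = rng.normal(size=n)
regime = (q1 <= 0.0) & (q2 <= 0.5)
y = np.where(regime, 0.8 * ylag, -0.3 * ylag) + rng.normal(size=n)
x = np.column_stack([np.ones(n), ylag])
print(fit_tar2(y, x, q1, q2))
```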
55

Next Generation Ultrashort-Pulse Retrieval Algorithm for Frequency-Resolved Optical Gating: The Inclusion of Random (Noise) and Nonrandom (Spatio-Temporal Pulse Distortions) Error

Wang, Ziyang 14 April 2005 (has links)
New pulse-retrieval software for the Frequency-Resolved Optical Gating (FROG) technique has been developed. The new software extends the capability of the original FROG algorithm in two major ways. The first is a new method to determine the uncertainty of the retrieved pulse field in the FROG technique. I proposed a simple, robust, and general technique, the bootstrap method, which places error bars on the intensity and phase of the retrieved pulse field. The bootstrap method was also extended to automatically detect ambiguities in FROG pulse retrieval. The second improvement deals with the spatiotemporal effect of the input laser beam on the measured GRENOUILLE trace. I developed a new algorithm to retrieve the pulse information, including both the pulse temporal field and the spatiotemporal parameters, from the spatiotemporally distorted GRENOUILLE trace. It is now possible to have a more complete view of an ultrashort pulse. I also proposed a simple method to remove the influence of the input laser beam's spatial profile on the GRENOUILLE trace. The new method extends the capability of the GRENOUILLE technique to measure beams with irregular spatial profiles.
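The bootstrap idea described above can be illustrated without a full FROG retrieval engine: resample the measured points, rerun the retrieval on each resample, and read error bars off the spread of the retrieved profiles. In the sketch below the "retrieval" is only a hypothetical stand-in, a Gaussian envelope fit, not the actual FROG algorithm, and the measured trace is synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, a, t0, w):
    return a * np.exp(-((t - t0) / w) ** 2)

def bootstrap_error_bars(t, intensity, B=500, seed=None):
    """Resample measured points with replacement, rerun the (stand-in)
    retrieval on each resample, and return per-point 95% percentile bands
    on the retrieved intensity profile."""
    rng = np.random.default_rng(seed)
    n = len(t)
    profiles = np.empty((B, n))
    for b in range(B):
        idx = rng.integers(0, n, size=n)
        popt, _ = curve_fit(gaussian, t[idx], intensity[idx], p0=[1.0, 0.0, 1.0])
        profiles[b] = gaussian(t, *popt)    # evaluate on the full grid
    return np.percentile(profiles, [2.5, 97.5], axis=0)

# Synthetic "measured" intensity with additive noise.
rng = np.random.default_rng(6)
t = np.linspace(-5.0, 5.0, 200)
measured = gaussian(t, 1.0, 0.2, 1.5) + rng.normal(0.0, 0.03, size=200)
low, high = bootstrap_error_bars(t, measured, seed=7)
```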
56

Modelling and resampling based multiple testing with applications to genetics

Huang, Yifan. January 2005 (has links)
Thesis (Ph. D.)--Ohio State University, 2005. / Title from first page of PDF file. Document formatted into pages; contains xii, 97 p.; also includes graphics. Includes bibliographical references (p. 94-97). Available online via OhioLINK's ETD Center
57

Intervalos de previsão bootstrap em modelos de volatilidade univariados / Bootstrap prediction in univariate volatility models

Trucíos Maza, Carlos César, 1985- 07 November 2012 (has links)
Orientador: Luiz Koodi Hotta / Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Abstract: Financial markets have shown great interest in prediction intervals as a measure of uncertainty. Besides prediction of the level, prediction of the volatility is important in many financial applications. The GARCH model has been widely used in volatility modeling. Building on this model, others have been proposed to incorporate further stylized facts, such as the leverage effect; in this sense we have the EGARCH and GJR-GARCH models. Traditional methods for constructing prediction intervals for time series generally assume that the model parameters are known and the errors are normal. When these assumptions do not hold, which is common in practice, the resulting prediction interval will tend to have coverage below the nominal level. In this dissertation we propose an adaptation of the PRR (Pascual, Romo and Ruiz) algorithm, developed to obtain prediction intervals in GARCH models, to obtain prediction intervals in EGARCH and GJR-GARCH models. The adaptations are analyzed through Monte Carlo experiments and show good performance, with estimated coverage close to the nominal coverage. The proposed adaptations, as well as the PRR algorithm, are applied to obtain prediction intervals for the returns and volatilities of the Ibovespa return series and of the NYSE COMPOSITE (DJ) series of the New York Stock Exchange, with satisfactory results in both cases / Mestrado / Estatística / Mestre em Estatística
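For the GARCH case, the PRR idea can be sketched in a simplified form: filter the series to get standardized residuals, then resample them to simulate future return and volatility paths and take percentile intervals. The sketch below conditions on assumed (not estimated) GARCH(1,1) parameters; the full Pascual, Romo and Ruiz algorithm also re-estimates the parameters on each bootstrap replicate to capture estimation uncertainty, and the EGARCH/GJR-GARCH adaptations would replace the variance recursion accordingly.

```python
import numpy as np

def garch_variances(r, omega, a, b):
    """Conditional variances of a GARCH(1,1) given returns and parameters."""
    h = np.empty(len(r))
    h[0] = r.var()                            # a common initialization choice
    for t in range(1, len(r)):
        h[t] = omega + a * r[t - 1]**2 + b * h[t - 1]
    return h

def bootstrap_prediction_intervals(r, params, H=10, B=2000, level=0.95, seed=None):
    """Simplified PRR-style bootstrap: resample standardized residuals to
    simulate H-step-ahead return and volatility paths, then take percentile
    intervals. The full algorithm would also re-estimate the parameters on
    each bootstrap replicate."""
    rng = np.random.default_rng(seed)
    omega, a, b = params
    h = garch_variances(r, omega, a, b)
    z = r / np.sqrt(h)                        # standardized residuals
    ret = np.empty((B, H))
    vol = np.empty((B, H))
    for i in range(B):
        h_t = omega + a * r[-1]**2 + b * h[-1]
        for s in range(H):
            ret[i, s] = np.sqrt(h_t) * z[rng.integers(0, len(z))]
            vol[i, s] = np.sqrt(h_t)
            h_t = omega + a * ret[i, s]**2 + b * h_t
    q = [100 * (1 - level) / 2, 100 * (1 + level) / 2]
    return np.percentile(ret, q, axis=0), np.percentile(vol, q, axis=0)

# Toy use with assumed (not estimated) GARCH(1,1) parameters.
rng = np.random.default_rng(3)
r = 0.01 * rng.standard_t(df=7, size=1000)
ret_pi, vol_pi = bootstrap_prediction_intervals(r, params=(1e-6, 0.08, 0.90), seed=4)
print(ret_pi[:, :3])
print(vol_pi[:, :3])
```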
58

Evaluation of bootstrapping as a validation technique for population pharmacokinetic models

Al-otoum, Mohammed Fawzi 01 January 2004 (has links)
It has been recommended by the FDA and others that population pharmacokinetic models (PPKM) need to be validated. This is particularly true when the model plays a key role in the construction of dosing strategies. The objective of the current study was to evaluate the ability of bootstrapping to distinguish PPKMs estimated from data without influence observations from PPKMs estimated from data containing influence observations. The evaluation was performed in four phases. In phase I, ten no-influence index datasets and ten influence index datasets were created. A model parameter (theta 1) was estimated for the index datasets. It was found that influence observations caused an over-estimation of the model parameter. In phase II, 200 bootstrap datasets were resampled with replacement from each of the twenty index datasets (4,000 datasets total). In phase III, the bootstrapping validation method was executed using NONMEM for model estimation, and the resulting statistics were used to detect models developed from influence data. The metrics of choice were mean absolute prediction error (MAPE) and mean squared prediction error (MSPE). In phase IV, the impact of achieving a global minimum in the NONMEM program on the non-parametric bootstrap validation process was investigated. This study showed that the current and widely followed procedure for applying the bootstrap to PPKM validation has significant deficiencies. The achievement of a global minimum in the NONMEM program proved to be an important and pivotal factor when applying bootstrapping to the PPKM validation process. Therefore, we concluded that each bootstrap dataset should be evaluated with several model control streams. Further, the suggested value for an acceptable difference between the NONMEM minimum objective function values for a global and a near-global minimum is 2.5 units.
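The two metrics of choice are straightforward to compute; note that here MAPE denotes the mean absolute prediction error, not the percentage variant the acronym often stands for. A minimal sketch with invented concentration values:

```python
import numpy as np

def mape(observed, predicted):
    """Mean absolute prediction error (not the percentage variant)."""
    return np.mean(np.abs(observed - predicted))

def mspe(observed, predicted):
    """Mean squared prediction error."""
    return np.mean((observed - predicted) ** 2)

# Invented concentrations: observed values versus a fitted model's
# predictions for one bootstrap dataset.
obs = np.array([12.1, 8.4, 5.9, 4.1, 2.8])
pred = np.array([11.5, 8.9, 6.2, 3.8, 3.1])
print(mape(obs, pred), mspe(obs, pred))
```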
59

NMR Studies of the GCN4 Transcription Factor and Hox DNA Consensus Sequences

Crawley, Timothy January 2023 (has links)
The conversion of genetic information into functional RNA and protein is of fundamental importance to all known life forms. In cellular organisms, this hinges on the interaction of double-stranded DNA and the transcription factor class of proteins. Substantial progress in the fields of biochemistry and genomics has made the identification of transcription factor binding sites and the resultant change in transcriptional output relatively routine. However, fully understanding this central life process requires knowing not only where transcription factors bind DNA, but why and how. These questions are approached here using solution-state NMR spectroscopy and the statistical technique of bootstrap aggregation in order to: i) glean biologically relevant insights into the dynamics of the GCN4 transcription factor from NMR relaxation experiments; ii) examine the influence of electrostatics on the structure of GCN4 in the absence of DNA; iii) analyze the conformational state of several Hox transcription factor DNA binding sites. NMR spectroscopy capitalizes on connections between electromagnetism and the quantum mechanical property of nuclear spin angular momentum to study the structure of molecules. Application of NMR relaxation experiments provides further information on molecular structure and dynamics. When performed in solution, this technique generates data under conditions more similar to those found within a cell than the conditions of other approaches used in structural biology. However, the biological relevance of any insights derived from solution-state NMR relaxation experiments depends on the application of an appropriate model for nuclear spin relaxation. Typically, this involves applying a statistical test to select the best model from among several candidates in the model-free formalism. Chapter 3 uses 15N relaxation data collected on the basic leucine zipper (bZip) domain of the GCN4 transcription factor to detail the potential problems and model selection errors that arise from this approach, and presents the alternative method of bootstrap aggregation. Applying this statistical technique allowed for the generation of multimodel inferences about the internal motions and rigidity of the basic region of GCN4, enhancing the likelihood of their biological relevance. The results presented in Chapter 3 further confirmed the presence of nascent helices in the generally disordered basic region of the GCN4 bZip domain. Interestingly, when complexed with an appropriate DNA substrate, this region assumes a fully α-helical conformation. A long-standing hypothesis holds that the inability of the basic region to form an α-helix in the absence of DNA arises, in part, from repulsion between its charged amino acids. This hypothesis is tested in Chapter 4 using NMR relaxation experiments performed in solutions containing either increased or decreased concentrations of salt. Surprisingly, screening the electrostatic repulsion between charged residues using higher levels of salt had no discernible effect on the structure or dynamics of the basic region. Chapter 5 examines the other side of the interaction between DNA and transcription factors. Here, previous work performed with the Hox family of transcription factors indicated the conformational state of DNA has an important role in enhancing the specificity with which Hox proteins bind certain sequences. In particular, the geometry of the DNA minor groove strongly influences the recruitment of appropriate Hox transcription factors.
This relationship is examined using solution-state NMR to study four Hox DNA binding sequences. The binding affinity between each of these sequences and the Hox protein AbdB was previously shown to correlate with the native unbound state of the DNA. The two sequences predicted to have native minor groove widths similar to those of the bound DNA had higher affinity for AbdB than those that deformed upon binding. Though mixed, the results of NMR experiments generally support the predicted structures, particularly for the high-affinity sequences, indicating a single pronounced narrowing of the minor groove. Taken together, the results presented here illustrate the complex interactions underpinning the appropriate binding of DNA and transcription factors. They further highlight the need to study the structure and dynamics of both DNA and protein, as well as that of the bound complex, in order to fully understand how and why specific sequences are bound in response to stimuli.
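The bootstrap-aggregation step described for Chapter 3 can be illustrated generically: refit the candidate models on bootstrap resamples, record which model each resample selects, and average over the per-resample winners rather than committing to a single model. The sketch below uses polynomial candidates and AIC purely as stand-ins; it is not the model-free formalism or the actual relaxation models.

```python
import numpy as np

def fit_and_aic(x, y, degree):
    """Fit a polynomial candidate and return (coefficients, Gaussian AIC)."""
    coef = np.polyfit(x, y, degree)
    resid = y - np.polyval(coef, x)
    n, k = len(y), degree + 1
    aic = n * np.log(resid @ resid / n) + 2 * k   # AIC up to an additive constant
    return coef, aic

def bagged_selection(x, y, degrees=(0, 1, 2), B=500, seed=None):
    """Bootstrap aggregation over model selection: on each resample, keep the
    AIC-best candidate; report selection frequencies and a prediction averaged
    over the per-resample winners (a multimodel inference)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    freq = {d: 0 for d in degrees}
    preds = np.zeros((B, n))
    for b in range(B):
        idx = rng.integers(0, n, size=n)
        fits = [fit_and_aic(x[idx], y[idx], d) for d in degrees]
        best = int(np.argmin([aic for _, aic in fits]))
        freq[degrees[best]] += 1
        preds[b] = np.polyval(fits[best][0], x)
    return freq, preds.mean(axis=0)

# Invented data: a linear trend plus noise; the bagged answer should favor degree 1.
rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 40)
y = 0.3 + 0.9 * x + rng.normal(0.0, 0.15, size=40)
freq, bagged_fit = bagged_selection(x, y, seed=5)
print(freq)
```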
60

Statistically Efficient Methods for Computation-Aware Uncertainty Quantification and Rare-Event Optimization

He, Shengyi January 2024 (has links)
The thesis covers two fundamental topics that are important across operations research, statistics, and beyond, namely stochastic optimization and uncertainty quantification, with the common theme of addressing both statistical accuracy and computational constraints. Here, statistical accuracy encompasses the precision of estimated solutions in stochastic optimization, as well as the tightness or reliability of confidence intervals. Computational concerns arise from rare events or expensive models, necessitating efficient sampling methods or computation procedures. In the first half of this thesis, we study stochastic optimization that involves rare events, which arises in various contexts including risk-averse decision-making and the training of machine learning models. Because of the presence of rare events, crude Monte Carlo methods can be prohibitively inefficient, as they require a sample size reciprocal to the rare-event probability to obtain valid statistical information about the rare event. To address this issue, we investigate the use of importance sampling (IS) to reduce the required sample size. IS is commonly used to handle rare events, and the idea is to sample from an alternative distribution that hits the rare event more frequently and to adjust the estimator with a likelihood ratio to retain unbiasedness. While IS has long been studied, most of its literature focuses on estimation problems and methodologies to obtain good IS in these contexts. In contrast to these studies, the first half of this thesis provides a systematic study of the efficient use of IS in stochastic optimization. In Chapter 2, we propose an adaptive procedure that converts an efficient IS for gradient estimation into an efficient IS procedure for stochastic optimization. Then, in Chapter 3, we provide an efficient IS for gradient estimation, which serves as the input for the procedure in Chapter 2. In the second half of this thesis, we study uncertainty quantification in the sense of constructing a confidence interval (CI) for target model quantities or predictions. We are interested in the setting of expensive black-box models, which means that we are confined to a small number of model runs and lack the ability to obtain auxiliary model information such as gradients. In this case, a classical method is batching, which divides the data into a few batches and then constructs a CI based on the batched estimates. Another method is the recently proposed cheap bootstrap, which is constructed from a few resamples in a manner similar to batching. These methods can save computation since they do not require an accurate variability estimator, which would take many model evaluations to obtain. Instead, they cancel out the variability when constructing pivotal statistics, and thus obtain asymptotically valid t-distribution-based CIs with only a few batches or resamples. The second half of this thesis studies several theoretical aspects of these computation-aware CI construction methods. In Chapter 4, we study statistical optimality in CI tightness among various computation-aware CIs. Then, in Chapter 5, we study the higher-order coverage errors of batching methods. Finally, Chapter 6 is a related investigation of the higher-order coverage and correction of distributionally robust optimization (DRO) as another CI construction tool, which assumes an amount of analytical information on the model but bears similarity to Chapter 5 in terms of analysis techniques.
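Both few-resample constructions mentioned above admit short sketches. Below, the cheap-bootstrap interval pairs the resample spread around the point estimate with a t critical value on B degrees of freedom, and the batching interval uses the spread of batch estimates with m - 1 degrees of freedom; the centering and data are illustrative choices, one common variant among several.

```python
import numpy as np
from scipy import stats

def cheap_bootstrap_ci(data, estimator, B=5, alpha=0.05, seed=None):
    """CI from only B resamples: pair the resample spread around the point
    estimate with a t critical value on B degrees of freedom, so no separate
    variance estimation is needed."""
    rng = np.random.default_rng(seed)
    n = len(data)
    psi = estimator(data)
    reps = np.array([estimator(data[rng.integers(0, n, size=n)]) for _ in range(B)])
    s = np.sqrt(np.mean((reps - psi) ** 2))
    t = stats.t.ppf(1 - alpha / 2, df=B)
    return psi - t * s, psi + t * s

def batching_ci(data, estimator, m=5, alpha=0.05):
    """Batching CI: estimate on each of m batches and use the batch spread
    with a t critical value on m - 1 degrees of freedom (this variant
    centers on the mean of the batch estimates)."""
    est = np.array([estimator(b) for b in np.array_split(data, m)])
    t = stats.t.ppf(1 - alpha / 2, df=m - 1)
    half = t * est.std(ddof=1) / np.sqrt(m)
    return est.mean() - half, est.mean() + half

# Invented data; the target quantity is the mean of an exponential population.
rng = np.random.default_rng(5)
data = rng.exponential(size=2000)
print(cheap_bootstrap_ci(data, np.mean, seed=6))
print(batching_ci(data, np.mean))
```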
