231

Multiple imputation in the presence of a detection limit, with applications : an empirical approach / Shawn Carl Liebenberg

Liebenberg, Shawn Carl January 2014 (has links)
Scientists often encounter unobserved or missing measurements that are typically reported as less than a fixed detection limit. This is especially common in the environmental sciences, where detection of low exposures is not possible due to limitations of the measuring instrument, and the resulting data are often referred to as type I and type II left-censored data. Observations lying below the detection limit are therefore often ignored or `guessed', because they cannot be measured accurately. Reliable estimates of the population parameters are nevertheless required to perform statistical analysis. The problem of dealing with values below a detection limit becomes increasingly complex when a large number of observations lie below this limit. Researchers are therefore interested in developing statistically robust estimation procedures for dealing with left- or right-censored data sets (Singh and Nocerino, 2002). This study focuses on several components of the problems mentioned above. The imputation of censored data below a fixed detection limit is studied, particularly using the maximum likelihood procedure of Cohen (1959) and several variants thereof, in combination with four new variations of the multiple imputation concept found in the literature. The focus also falls strongly on estimating the density of the resulting imputed, `complete' data set by applying various kernel density estimators. Bandwidth selection issues are not of importance in this study and are left for further research. The maximum likelihood estimation method of Cohen (1959) is compared with several variant methods to establish which of these maximum likelihood estimation procedures for censored data estimates the population parameters of three chosen lognormal distributions most reliably, in terms of well-known discrepancy measures. These methods are implemented in combination with four new multiple imputation procedures, respectively, to assess which of these nonparametric methods is most effective at imputing the 12 censored values below the detection limit, with regard to the global discrepancy measures mentioned above. Several variations of the Parzen-Rosenblatt kernel density estimate are fitted to the complete, filled-in data sets obtained from the previous methods, to establish which is the preferred data-driven method for estimating these densities. The primary focus of the study is therefore the performance of the four chosen multiple imputation methods, together with recommendations of methods and procedural combinations for dealing with data in the presence of a detection limit. An extensive Monte Carlo simulation study was performed to compare the various methods and procedural combinations, and conclusions and recommendations regarding the best of these are made based on the study's results. / MSc (Statistics), North-West University, Potchefstroom Campus, 2014
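A minimal sketch of the general pattern described in this abstract — censored maximum likelihood on the log scale, followed by imputation draws from the fitted distribution truncated below the detection limit — is given below. It illustrates the approach only and is not the thesis' implementation; the single fixed detection limit, the function names, and the use of SciPy are assumptions.

```python
import numpy as np
from scipy import stats, optimize

def censored_lognormal_mle(observed, n_censored, detection_limit):
    """MLE for lognormal data when n_censored observations are reported
    only as 'below the detection limit' (type I left censoring).
    Works on the log scale, where the data are normal."""
    logs = np.log(observed)
    log_dl = np.log(detection_limit)

    def neg_loglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)                      # keep sigma > 0
        ll_obs = stats.norm.logpdf(logs, mu, sigma).sum()
        ll_cens = n_censored * stats.norm.logcdf(log_dl, mu, sigma)
        return -(ll_obs + ll_cens)

    start = np.array([logs.mean(), np.log(logs.std(ddof=1))])
    fit = optimize.minimize(neg_loglik, start, method="Nelder-Mead")
    return fit.x[0], np.exp(fit.x[1])                  # mu_hat, sigma_hat

def impute_below_limit(mu, sigma, n_censored, detection_limit, rng):
    """One imputation: draw values from the fitted lognormal, truncated
    to lie below the detection limit (inverse-CDF sampling). Repeating
    this m times gives m 'complete' data sets."""
    p_dl = stats.norm.cdf(np.log(detection_limit), mu, sigma)
    u = rng.uniform(0.0, p_dl, size=n_censored)
    return np.exp(stats.norm.ppf(u, mu, sigma))
```

A kernel density estimator (for example scipy.stats.gaussian_kde) could then be fitted to each filled-in data set, corresponding to the density-estimation step described in the abstract.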
233

The role of individual differences and involvement on attitudes toward animal welfare

Powell, Gwendolen Mair January 1900 (has links)
Master of Science / Department of Psychology / Richard J. Harris / Previous research has indicated that many factors influence the likelihood of using the central or peripheral route of processing during exposure to a persuasive message, including involvement in the message. Previous research has generally focused on response involvement, which is based on outcome, while the focus of the present study is involvement based on personal investment. In the present study, 229 undergraduates were assessed on their trait empathy toward animals and their attitudes toward animals. They read a strong or weak persuasive message presented by either an attractive or less attractive writer. This design replicated previous findings by Bae (2008) on empathy and attitude change, and extended them by examining them experimentally, with a focus on issue-based involvement, which relies on moral or ego involvement. Participants were tested on several distinct dependent variables (DVs) designed to indicate their change in attitude and behavior. Results varied for each DV, with source attractiveness predicting willingness to wear a button and display a bumper sticker, but trait empathy predicting willingness to adopt a pet and vote to support a petition. The results imply that participants relied on different routes of processing depending on the DV, and that the role of emotion in issue involvement may inform advertisers of ways to increase the likelihood that an audience pays attention to a message.
234

A phylogeny and revised classification of Squamata, including 4161 species of lizards and snakes

Pyron, R., Burbrink, Frank, Wiens, John January 2013 (has links)
BACKGROUND: The extant squamates (>9400 known species of lizards and snakes) are one of the most diverse and conspicuous radiations of terrestrial vertebrates, but no studies have attempted to reconstruct a phylogeny for the group with large-scale taxon sampling. Such an estimate is invaluable for comparative evolutionary studies, and to address their classification. Here, we present the first large-scale phylogenetic estimate for Squamata. RESULTS: The estimated phylogeny contains 4161 species, representing all currently recognized families and subfamilies. The analysis is based on up to 12896 base pairs of sequence data per species (average = 2497 bp) from 12 genes, including seven nuclear loci (BDNF, c-mos, NT3, PDC, R35, RAG-1, and RAG-2) and five mitochondrial genes (12S, 16S, cytochrome b, ND2, and ND4). The tree provides important confirmation for recent estimates of higher-level squamate phylogeny based on molecular data (but with more limited taxon sampling), estimates that are very different from previous morphology-based hypotheses. The tree also includes many relationships that differ from previous molecular estimates and many that differ from traditional taxonomy. CONCLUSIONS: We present a new large-scale phylogeny of squamate reptiles that should be a valuable resource for future comparative studies. We also present a revised classification of squamates at the family and subfamily level to bring the taxonomy more in line with the new phylogenetic hypothesis. This classification includes new, resurrected, and modified subfamilies within gymnophthalmid and scincid lizards, and boid, colubrid, and lamprophiid snakes.
235

Quantifying the strength of evidence in forensic fingerprints

Forbes, Peter G. M. January 2014 (has links)
Part I presents a model for fingerprint matching using Bayesian alignment on unlabelled point sets. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio between the hypothesis that an observed fingerprint and fingermark pair originate from the same finger and the hypothesis that they originate from different fingers. The model achieves good performance on the NIST-FBI fingerprint database of 258 matched fingerprint pairs, though the computed likelihood ratios are implausibly extreme due to oversimplification in our model. Part II moves to a more theoretical study of proper scoring rules. The chapters in this section are designed to be independent of each other. Chapter 9 uses proper scoring rules to calibrate the implausible likelihood ratios computed in Part I. Chapter 10 defines the class of compatible weighted proper scoring rules. Chapter 11 derives new results for the score matching estimator, which can quickly generate point estimates for a parametric model even when the normalization constant of the distribution is intractable. It is used to find an initial value for the iterative maximization procedure in §3.3. Appendix A describes a novel algorithm to efficiently sample from the posterior of a von Mises distribution. It is used within the fingerprint model sampling procedure described in §5.6. Appendix B includes various technical results which would otherwise disrupt the flow of the main dissertation.
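The central quantity in Part I — the marginal likelihood ratio between the same-finger and different-finger hypotheses, with nuisance parameters such as the unknown alignment integrated out — can be caricatured with a generic Monte Carlo estimator. The sketch below is not the efficient algorithm developed in the thesis; the interfaces (loglik_same, loglik_diff, prior_sampler) are placeholders for illustration.

```python
import numpy as np

def log_mean_exp(log_values):
    """Numerically stable log of the average of exp(log_values)."""
    m = np.max(log_values)
    return m + np.log(np.mean(np.exp(log_values - m)))

def marginal_likelihood_ratio(loglik_same, loglik_diff, prior_sampler,
                              n_draws=10_000, rng=None):
    """Crude Monte Carlo estimate of
        LR = p(mark, print | same finger) / p(mark, print | different fingers),
    where each marginal likelihood averages the conditional likelihood over
    draws of the nuisance parameters (alignment, etc.) from their prior."""
    rng = np.random.default_rng() if rng is None else rng
    draws = [prior_sampler(rng) for _ in range(n_draws)]
    log_p_same = log_mean_exp(np.array([loglik_same(t) for t in draws]))
    log_p_diff = log_mean_exp(np.array([loglik_diff(t) for t in draws]))
    return np.exp(log_p_same - log_p_diff)
```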
236

Metrics and Test Procedures for Data Quality Estimation in the Aeronautical Telemetry Channel

Hill, Terry 10 1900 (has links)
ITC/USA 2015 Conference Proceedings / The Fifty-First Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2015 / Bally's Hotel & Convention Center, Las Vegas, NV / There is great potential in using Best Source Selectors (BSS) to improve link availability in aeronautical telemetry applications. While the general notion that diverse data sources can be used to construct a consolidated stream of "better" data is well founded, there is no standardized means of determining the quality of the data streams being merged together. Absent this uniform quality data, the BSS has no analytically sound way of knowing which streams are better, or best. This problem is further exacerbated when one imagines that multiple vendors are developing data quality estimation schemes, with no standard definition of how to measure data quality. In this paper, we present measured performance for a specific Data Quality Metric (DQM) implementation, demonstrating that the signals present in the demodulator can be used to quickly and accurately measure the data quality, and we propose test methods for calibrating DQM over a wide variety of channel impairments. We also propose an efficient means of encapsulating this DQM information with the data, to simplify processing by the BSS. This work leads toward a potential standardization that would allow data quality estimators and best source selectors from multiple vendors to interoperate.
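The idea of a best source selector driven by a per-stream data quality estimate can be illustrated with a toy selection rule. The frame fields and the use of an estimated bit error probability as the metric are assumptions made for illustration; they are not the DQM encapsulation proposed in the paper.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Frame:
    """One time-aligned telemetry frame from one receiving site, with the
    data quality metric carried alongside the payload (fields assumed)."""
    payload: bytes
    est_bit_error_prob: float   # quality estimate produced by the demodulator
    lock: bool                  # demodulator lock indicator

def select_best_source(candidates: List[Optional[Frame]]) -> Optional[Frame]:
    """Toy best-source selection: among the available, locked frames for a
    given time slot, keep the one with the lowest estimated bit error
    probability. Real BSS hardware also handles time alignment and
    switching hysteresis, which this sketch ignores."""
    live = [f for f in candidates if f is not None and f.lock]
    if not live:
        return None
    return min(live, key=lambda f: f.est_bit_error_prob)
```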
237

NON-COHERENTLY DETECTED FQPSK: RAPID SYNCHRONIZATION AND COMPATIBILITY WITH PCM/FM RECEIVERS

Park, Hyung Chul, Lee, Kwyro, Feher, Kamilo 10 1900 (has links)
International Telemetering Conference Proceedings / October 22-25, 2001 / Riviera Hotel and Convention Center, Las Vegas, Nevada / A new class of non-coherent detection techniques for recently standardized Feher-patented quadrature phase-shift keying (FQPSK) systems is proposed and studied by computer-aided design/simulations and also verified by experimental hardware measurements. The theoretical concepts of the described non-coherent techniques are based on an interpretation of the instantaneous frequency deviation or phase transition characteristics of the FQPSK-B modulated signal at the front end of the receiver. These are accomplished either by Limiter-Discriminator (LD) or by Limiter-Discriminator followed by Integrate-and-Dump (LD I&D) methods. It is shown that significant BER performance improvements can be obtained by increasing the received signal's observation time over multiple symbols as well as by adopting trellis demodulation. For example, our simulation results show that a BER of 10^-4 can be obtained at Eb/N0 = 12.7 dB.
238

Some Novel Statistical Inferences

Li, Chenxue 12 August 2016 (has links)
In medical diagnostic studies, the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Youden index are two summary measures widely used in the evaluation of the diagnostic accuracy of a medical test with continuous test results. The first half of this dissertation highlights ROC analysis, including an extension of the Youden index to the partial Youden index as well as novel confidence interval estimation for the AUC and the Youden index in the presence of covariates in induced linear regression models. Extensive simulation results show that the proposed methods perform well with small to moderately sized samples. In addition, some real examples are presented to illustrate the methods. The latter half focuses on the application of the empirical likelihood method in economics and finance. Two models draw our attention. The first is the predictive regression model with independent and identically distributed errors. Some uniform tests have been proposed in the literature without distinguishing whether the predicting variable is stationary or nearly integrated. Here, we extend the empirical likelihood methods in Zhu, Cai and Peng (2014) with independent errors to the case of an AR error process. The proposed new tests do not need to know whether the predicting variable is stationary or nearly integrated, or whether it has a finite or an infinite variance. The second model is a GARCH(1,1) sequence or an AR(1) model with ARCH(1) errors. It is known that the observations have a heavy tail and that the tail index is determined by an estimating equation. Therefore, one can estimate the tail index by solving the estimating equation with unknown parameters replaced by Quasi Maximum Likelihood Estimation (QMLE), and the profile empirical likelihood method can be employed to effectively construct a confidence interval for the tail index. However, this requires that the errors of such a model have at least a finite fourth moment to ensure asymptotic normality with an n^(1/2) rate of convergence and Wilks' theorem. We show that the finite fourth moment condition can be relaxed by employing a Least Absolute Deviations Estimate (LADE) instead of QMLE for the unknown parameters, noting that the estimating equation for determining the tail index is invariant to a scale transformation of the underlying model. Furthermore, the proposed tail index estimators have a normal limit with an n^(1/2) rate of convergence under a minimal moment condition, which may allow an infinite fourth moment, and Wilks' theorem holds for the proposed profile empirical likelihood methods. Hence a confidence interval for the tail index can be obtained without estimating any additional quantities such as the asymptotic variance.
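For orientation, the empirical versions of the two summary measures named above are straightforward to compute from continuous test results. The sketch below shows only these basic definitions; it does not implement the covariate-adjusted induced-regression intervals or the partial Youden index developed in the dissertation.

```python
import numpy as np

def empirical_auc(diseased, healthy):
    """Empirical AUC: the probability that a randomly chosen diseased subject
    scores higher than a randomly chosen healthy subject, with ties counted
    as one half (the Mann-Whitney form of the AUC)."""
    d = np.asarray(diseased, dtype=float)[:, None]
    h = np.asarray(healthy, dtype=float)[None, :]
    return np.mean((d > h) + 0.5 * (d == h))

def youden_index(diseased, healthy):
    """Empirical Youden index J = max_c {sensitivity(c) + specificity(c) - 1},
    maximized over cut-offs taken at the observed test values."""
    d = np.asarray(diseased, dtype=float)
    h = np.asarray(healthy, dtype=float)
    best_j, best_cut = -1.0, None
    for c in np.unique(np.concatenate([d, h])):
        sens = np.mean(d >= c)          # true positive rate at cut-off c
        spec = np.mean(h < c)           # true negative rate at cut-off c
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, c
    return best_j, best_cut
```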
239

EMPIRICAL LIKELIHOOD AND DIFFERENTIABLE FUNCTIONALS

Shen, Zhiyuan 01 January 2016 (has links)
Empirical likelihood (EL) is a recently developed nonparametric method of statistical inference. It has been shown by Owen (1988, 1990) and many others that the empirical likelihood ratio (ELR) method can be used to produce nice confidence intervals or regions. Owen (1988) shows that -2 log ELR converges to a chi-square distribution with one degree of freedom subject to a linear statistical functional in terms of distribution functions. However, a generalization of Owen's result to the right-censored data setting is difficult, since no explicit maximization can be obtained under a constraint in terms of distribution functions. Pan and Zhou (2002) instead study the EL with right-censored data using a linear statistical functional constraint in terms of cumulative hazard functions. In this dissertation, we extend Owen's (1988) and Pan and Zhou's (2002) results to non-linear but Hadamard differentiable statistical functional constraints. For this purpose, a study of differentiable functionals with respect to hazard functions is carried out. We also generalize our results to two-sample problems. Stochastic process and martingale theories are applied to prove the theorems. The confidence intervals based on the EL method are compared with other available methods. Real data analysis and simulations are used to illustrate our proposed theorem, with an application to Gini's absolute mean difference.
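Owen's (1988) baseline result — that -2 log ELR for a mean is asymptotically chi-square with one degree of freedom — is the starting point this dissertation generalizes. A minimal sketch of that classical one-sample, linear-functional case is given below for orientation; it does not cover the censored-data or Hadamard-differentiable extensions developed here.

```python
import numpy as np
from scipy.optimize import brentq

def neg2_log_elr_mean(x, mu0):
    """-2 log empirical likelihood ratio for H0: E[X] = mu0 (Owen 1988).
    Solves for the Lagrange multiplier lam in
        sum_i (x_i - mu0) / (1 + lam * (x_i - mu0)) = 0,
    then returns 2 * sum_i log(1 + lam * (x_i - mu0)), which is
    asymptotically chi-square with 1 degree of freedom under H0."""
    z = np.asarray(x, dtype=float) - mu0
    if z.min() >= 0 or z.max() <= 0:
        return np.inf          # mu0 outside the convex hull: ELR = 0

    def score(lam):
        return np.sum(z / (1.0 + lam * z))

    # lam must keep all implied weights positive: 1 + lam * z_i > 0
    eps = 1e-10
    lo = -1.0 / z.max() + eps
    hi = -1.0 / z.min() - eps
    lam = brentq(score, lo, hi)    # score is monotone, so the root is unique
    return 2.0 * np.sum(np.log1p(lam * z))
```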
240

Study of WW decay of a Higgs boson with the ALEPH and CMS detectors

Delaere, Christophe 06 July 2005 (has links)
The Standard Model is a mathematical description of the very nature of elementary particles and their interactions, now seen as relativistic quantum fields. A key feature of the theory is the Brout-Englert-Higgs mechanism, responsible for the spontaneous breaking of the underlying gauge symmetry, which implies the existence of a neutral Higgs particle. Searches for the Higgs boson were conducted at the Large Electron Positron collider until 2000 and are still ongoing at the Tevatron collider, but the particle has not been observed. In order to better constrain models with an exotic electroweak symmetry breaking sector, a search for a Higgs boson decaying into a W pair is carried out with the ALEPH detector on 453 pb-1 of data collected at center-of-mass energies up to 209 GeV. The analysis is optimized for the many topologies resulting from the six-fermion final state. A lower limit of 105.8 GeV/c² on the Higgs boson mass in a fermiophobic Higgs boson scenario is obtained. The ultimate machine for the Higgs boson discovery is the Large Hadron Collider, which is being built at CERN. In order to evaluate the physics potential of the CMS detector, the WH associated production of a Higgs boson decaying into a W pair is studied. The performance of data acquisition and its sophisticated trigger system, of particle identification, and of event reconstruction is investigated through a detailed analysis of simulated data. Three-lepton final states are shown to provide interesting possibilities. For an integrated luminosity of 100 fb-1, a potential signal significance of more than 5σ is obtained in the mass interval between 155 and 178 GeV/c². The corresponding precision on the Higgs boson mass and on the partial decay width into W pairs is evaluated. This channel also provides one of the very few possible avenues towards the discovery of a fermiophobic Higgs boson below 180 GeV/c². These studies required many original technical developments, which are also presented.
