311

Estimation and Inference for Quantile Regression of Longitudinal Data : With Applications in Biostatistics

Karlsson, Andreas January 2006 (has links)
This thesis consists of four papers dealing with estimation and inference for quantile regression of longitudinal data, with an emphasis on nonlinear models. The first paper extends the idea of quantile regression estimation from the case of cross-sectional data with independent errors to the case of linear or nonlinear longitudinal data with dependent errors, using a weighted estimator. The performance of different weights is evaluated, and a comparison is also made with the corresponding mean regression estimator using the same weights. The second paper examines the use of bootstrapping for bias correction and for calculating confidence intervals for parameters of the quantile regression estimator when longitudinal data are used. Different weights, bootstrap methods, and confidence interval methods are used. The third paper is devoted to evaluating bootstrap methods for constructing hypothesis tests for parameters of the quantile regression estimator using longitudinal data. The focus is on testing the equality between two groups of one or all of the parameters in a regression model for some quantile, using single or joint restrictions. The tests are evaluated with regard to both their significance level and their power. The fourth paper analyzes seven longitudinal data sets from different areas of biostatistics by quantile regression methods in order to demonstrate how quantile regression can yield new insights into the properties of longitudinal data. The quantile regression estimates are also compared and contrasted with the least squares mean regression estimates for the same data sets. In addition to the estimates, confidence intervals and hypothesis testing procedures are examined.
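The combination described above, a weighted quantile regression estimator with a subject-level (cluster) bootstrap for confidence intervals, can be sketched in a few lines. This is an illustrative reconstruction, not the thesis's exact estimator: the check-loss minimization via Nelder-Mead is a simplification of the standard linear-programming approach, and the weights and percentile interval are generic choices.

```python
import numpy as np
from scipy.optimize import minimize

def quantreg(X, y, tau, w=None):
    """Weighted linear quantile regression: minimize the (weighted)
    Koenker-Bassett check loss. A sketch; production code would use
    linear programming rather than Nelder-Mead."""
    w = np.ones(len(y)) if w is None else w
    def loss(b):
        u = y - X @ b
        return np.sum(w * u * (tau - (u < 0)))
    return minimize(loss, np.zeros(X.shape[1]), method="Nelder-Mead").x

def cluster_bootstrap_ci(X, y, groups, tau, B=200, alpha=0.05, seed=0):
    """Percentile interval from resampling whole subjects (clusters),
    which respects the within-subject dependence of longitudinal data."""
    rng = np.random.default_rng(seed)
    ids = np.unique(groups)
    boots = []
    for _ in range(B):
        take = rng.choice(ids, size=len(ids), replace=True)
        idx = np.concatenate([np.flatnonzero(groups == g) for g in take])
        boots.append(quantreg(X[idx], y[idx], tau))
    boots = np.asarray(boots)
    return np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
```

Resampling subjects rather than individual observations is the key point: it keeps each subject's dependent measurements together in every bootstrap sample.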
312

Data Driven Approaches to Testing Homogeneity of Intraclass Correlation Coefficients

Wu, Baohua 01 December 2010 (has links)
The test of homogeneity for intraclass correlation coefficients has been an active topic in statistical research. Several chi-square tests have been proposed for testing the homogeneity of intraclass correlations over the past few decades. A major concern is that these methods are seriously biased when sample sizes are not large. In this thesis, data-driven approaches are proposed for testing the homogeneity of the intraclass correlation coefficients of several populations. Through a simulation study, the data-driven methods are shown to be less biased and more accurate than some commonly used chi-square tests.
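For context, the chi-square homogeneity tests being critiqued share a common structure, shown here in its best-known form for ordinary Pearson correlations via Fisher's z-transform. The ICC versions replace the transform and weights, and the thesis's data-driven alternatives are not shown; the small-sample bias arises because the chi-square reference distribution is only a large-sample approximation.

```python
import numpy as np
from scipy.stats import chi2

def homogeneity_test(rs, ns):
    """Chi-square test of H0: all m correlations are equal.
    Fisher z-transform each r; weight by the approximate inverse
    variance n - 3; compare the weighted dispersion to chi2(m - 1)."""
    z = np.arctanh(np.asarray(rs, dtype=float))  # Fisher z
    w = np.asarray(ns, dtype=float) - 3.0        # approx. 1 / Var(z)
    zbar = np.sum(w * z) / np.sum(w)
    stat = np.sum(w * (z - zbar) ** 2)
    df = len(rs) - 1
    return stat, chi2.sf(stat, df)
```

With identical sample correlations the statistic is exactly zero; with clearly different correlations and moderate samples the p-value is small, but the chi-square approximation degrades as the n_i shrink, which is the motivation for the data-driven approaches.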
313

Bootstrap and Empirical Likelihood-based Semi-parametric Inference for the Difference between Two Partial AUCs

Huang, Xin 17 July 2008 (has links)
With new tests being developed and marketed, the comparison of the diagnostic accuracy of two continuous-scale diagnostic tests is of great importance. Comparing the partial areas under the receiver operating characteristic curves (pAUC) is an effective method for evaluating the accuracy of two diagnostic tests. In this thesis, we study semi-parametric inference for the difference between two pAUCs. A normal approximation for the distribution of the difference between two pAUCs is derived. The empirical likelihood ratio for the difference between two pAUCs is defined, and its asymptotic distribution is shown to be a scaled chi-square distribution. Bootstrap and empirical likelihood-based inferential methods for the difference are proposed. We construct five confidence intervals for the difference between two pAUCs. Simulation studies are conducted to compare the finite-sample performance of these intervals. We also use a real example as an application of our recommended intervals.
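A minimal nonparametric sketch of the quantity under study, with a paired percentile bootstrap for the difference, may make the setup concrete. The thesis's normal-approximation and empirical-likelihood intervals are not reproduced here, and the FPR cut-off of 0.3 is an arbitrary illustration.

```python
import numpy as np

def partial_auc(healthy, diseased, fpr_max=0.3):
    """Empirical pAUC: trapezoidal area under the empirical ROC curve
    for false-positive rates in [0, fpr_max]."""
    thr = np.sort(np.concatenate([healthy, diseased]))[::-1]
    fpr = np.array([np.mean(healthy >= t) for t in thr])   # increasing
    tpr = np.array([np.mean(diseased >= t) for t in thr])
    grid = np.linspace(0.0, fpr_max, 201)
    tpr_g = np.interp(grid, fpr, tpr)
    return np.sum(np.diff(grid) * (tpr_g[1:] + tpr_g[:-1]) / 2)

def boot_diff_ci(h1, d1, h2, d2, fpr_max=0.3, B=400, alpha=0.05, seed=0):
    """Paired percentile bootstrap for pAUC1 - pAUC2: both tests are
    assumed measured on the same subjects, so the same resampled
    indices are applied to each test."""
    rng = np.random.default_rng(seed)
    diffs = []
    for _ in range(B):
        i = rng.integers(0, len(h1), len(h1))
        j = rng.integers(0, len(d1), len(d1))
        diffs.append(partial_auc(h1[i], d1[j], fpr_max)
                     - partial_auc(h2[i], d2[j], fpr_max))
    return np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)])
```

Note that the pAUC is bounded above by fpr_max, so differences should be read on that scale rather than the usual [0, 1] AUC scale.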
314

Interval Estimation for the Correlation Coefficient

Jung, Aekyung 11 August 2011 (has links)
The correlation coefficient (CC) is a standard measure of the linear association between two random variables and plays a significant role in much quantitative research. Under a bivariate normal distribution, there are many types of interval estimation for the CC, such as methods based on the z-transformation and on maximum likelihood estimation. However, when the underlying bivariate distribution is unknown, the construction of confidence intervals for the CC is still not well developed. In this thesis, we discuss various interval estimation methods for the CC. We propose a generalized confidence interval and three empirical likelihood-based non-parametric intervals for the CC. We also conduct extensive simulation studies to compare the new intervals with existing intervals in terms of coverage probability and interval length. Finally, two real examples are used to demonstrate the application of the proposed methods.
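The classical z-transformation interval mentioned above, whose justification rests on bivariate normality, is easy to state; the generalized and empirical-likelihood intervals proposed in the thesis are not sketched here.

```python
import numpy as np
from scipy.stats import norm

def fisher_z_ci(x, y, alpha=0.05):
    """Fisher z-transformation interval for the correlation coefficient.
    Relies on atanh(r) being approximately N(atanh(rho), 1/(n-3))
    under bivariate normality."""
    n = len(x)
    r = np.corrcoef(x, y)[0, 1]
    z = np.arctanh(r)
    half = norm.ppf(1 - alpha / 2) / np.sqrt(n - 3)
    return r, (np.tanh(z - half), np.tanh(z + half))
```

The back-transform through tanh keeps the interval inside (-1, 1), one reason this interval remains the default under normality; the thesis's question is what to do when normality cannot be assumed.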
315

Statistical Evaluation of Continuous-Scale Diagnostic Tests with Missing Data

Wang, Binhuan 12 June 2012 (has links)
Receiver operating characteristic (ROC) curve methodology is the standard statistical methodology for assessing the accuracy of diagnostic tests or biomarkers. Currently, the most widely used statistical methods for inference on ROC curves are complete-data-based parametric, semi-parametric, or nonparametric methods. However, these methods cannot be used in diagnostic applications with missing data. In practice, missing diagnostic data commonly occur for various reasons, such as medical tests being too expensive, too time-consuming, or too invasive. This dissertation aims to develop new nonparametric statistical methods for evaluating the accuracy of diagnostic tests or biomarkers in the presence of missing data. Specifically, novel nonparametric statistical methods are developed, for different types of missing data, for (i) inference on the area under the ROC curve (AUC, a summary index of the diagnostic accuracy of a test) and (ii) joint inference on the sensitivity and the specificity of a continuous-scale diagnostic test. The dissertation provides a general framework that combines empirical likelihood and general estimating equations with nuisance parameters for the joint inference of sensitivity and specificity with missing diagnostic data. The proposed methods have sound theoretical properties. The theoretical development is challenging because the proposed profile log-empirical likelihood ratio statistics are not standard sums of independent random variables. The new methods combine the power of likelihood-based approaches with the jackknife method in ROC studies, and are therefore expected to be more robust, more accurate, and less computationally intensive than existing methods in the evaluation of competing diagnostic tests.
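To illustrate the missing-data problem in its simplest form: when disease status is only verified for a subset of subjects, the complete-case Mann-Whitney AUC is biased, and one standard correction reweights verified subjects by their inverse verification probability. This is an illustrative verification-bias sketch, not the dissertation's empirical-likelihood framework, and the verification-probability argument is assumed known rather than estimated.

```python
import numpy as np

def auc_ipw(scores, disease, verified, p_verify):
    """Mann-Whitney AUC with inverse-probability weights for disease
    status missing at random. Unverified subjects get weight 0, so
    their (unknown) disease value never enters the estimate."""
    w = verified / p_verify
    d = w * disease              # weighted diseased indicator
    h = w * (1 - disease)        # weighted healthy indicator
    num = 0.0
    for i in range(len(scores)):
        gt = scores[i] > scores
        eq = scores[i] == scores
        num += d[i] * np.sum(h * (gt + 0.5 * eq))
    return num / (d.sum() * h.sum())
```

With everyone verified (p_verify = 1) this reduces to the usual nonparametric AUC, i.e. the proportion of diseased-healthy pairs ranked correctly.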
316

Essays in Efficiency Analysis

Demchuk, Pavlo 16 September 2013 (has links)
Today the standard procedure for analyzing the impact of environmental factors on the productive efficiency of a decision making unit is a two-stage approach: first one estimates the efficiency, and then one uses regression techniques to explain the variation in efficiency between different units. It is argued that this method may produce doubtful results that distort what the data represent. In order to introduce economic intuition and to mitigate the problem of omitted variables, we introduce a matching procedure to be applied before the efficiency analysis. By comparing only comparable decision making units, we implicitly control for the environmental factors while at the same time cleaning the sample of outliers. The main goal of the first part of the thesis is to compare a procedure that includes matching prior to efficiency analysis with the straightforward two-stage procedure without matching, as well as with the alternative of a conditional efficiency frontier. We conduct a Monte Carlo study with different model specifications; despite the reduced sample, which may create some complications in the computational stage, we find the newly obtained results economically meaningful. We also compare the results obtained by the new method with those previously produced by Demchuk and Zelenyuk (2009), who compared the efficiencies of Ukrainian regions, and we find some differences between the two approaches. The second part is an empirical study of electricity generating power plants before and after the market reform in Texas. We compare private, public, and municipal power generators using the method introduced in part one. We find that municipal power plants operate mostly inefficiently, while private and public plants are very close in their production patterns. The new method allows us to compare decision making units from different groups, which may have different objectives and productive incentives.
Despite the fact that at a certain point after the reform private generators opted not to provide their data to the regulator, we were able to construct three different data samples comprising two and three groups of generators and analyze their production and efficiency patterns. In the third chapter we propose a semiparametric approach with shape constraints, consistent with monotonicity and concavity. Penalized splines are used to impose the shape constraints via nonlinear transformations of spline basis expansions. The large-sample properties, an effective algorithm, and a method of smoothing parameter selection are presented. Monte Carlo simulations and empirical examples demonstrate the finite-sample performance and the usefulness of the proposed method.
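The idea of matching before measuring efficiency can be caricatured in a few lines for a single-input, single-output technology: score each unit against a frontier built only from units with a similar environment. This is a toy illustration of the principle, not the DEA or conditional-frontier machinery of the thesis; the environment variable z and the nearest-neighbour matching rule are illustrative assumptions.

```python
import numpy as np

def matched_efficiency(x, y, z, k=10):
    """Farrell-style output efficiency: each unit's productivity y/x is
    compared to the best productivity among its k environment-nearest
    neighbours (matching on z), rather than the whole sample."""
    n = len(x)
    prod = y / x                                  # single-input, single-output
    eff = np.empty(n)
    for i in range(n):
        nbr = np.argsort(np.abs(z - z[i]))[:k]    # matched comparison set
        eff[i] = prod[i] / prod[nbr].max()        # relative to matched frontier
    return eff
```

Because each unit is in its own neighbourhood, efficiency scores lie in (0, 1], and a unit that leads its matched peer group scores exactly 1 even if it is dominated by units operating in a very different environment, which is precisely the intended effect of matching.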
317

Design and Implementation of a high-efficiency low-power analog-to-digital converter for high-speed transceivers

Younis, Choudhry Jabbar January 2012 (has links)
Modern communication systems require higher data rates, which has increased the demand for high-speed transceivers. For a system to work efficiently, all blocks of the system should be fast. Analog interfaces are the main bottleneck in the whole system in terms of speed and power, a fact that has led researchers to develop high-speed analog-to-digital converters (ADCs) with low power consumption. Among all ADCs, the flash ADC is the best choice for fast data conversion because of its parallel structure. This thesis work describes the design of such a high-speed, low-power flash ADC for the analog front end (AFE) of a transceiver. A high-speed, highly linear track-and-hold (TnH) circuit is needed in front of the ADC to provide a stable signal at the ADC input for accurate conversion. Two different track-and-hold architectures are implemented: a bootstrap TnH and a switched source follower TnH. Simulations show that high speed with high linearity can be achieved with the bootstrap TnH circuit, which is therefore selected for the ADC design.
An averaging technique is employed in the preamplifier array of the ADC to reduce the static offsets of the preamplifiers. Averaging can be made more efficient by using a smaller number of amplifiers, which is achieved with an interpolation technique that reduces the number of amplifiers at the input of the ADC. The reduced number of amplifiers is also advantageous for achieving higher bandwidth, since the input capacitance at the first stage of the preamplifier array is reduced.
The flash ADC is designed and implemented in 150 nm CMOS technology for a sampling rate of 1.6 GSamples/s. The bootstrap TnH consumes 27.95 mW from a 1.8 V supply and achieves a signal-to-noise-and-distortion ratio (SNDR) of 37.38 dB for an input signal frequency of 195.3 MHz. The ADC with an ideal TnH and comparator consumes 78.2 mW and achieves 4.8 effective number of bits (ENOB).
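The reported figures can be cross-checked against the standard SNDR-to-ENOB relation for an ideal quantizer, SNDR_dB = 6.02·N + 1.76: the 37.38 dB SNDR of the TnH alone corresponds to roughly 5.9 effective bits, consistent with the TnH not being the limiting block for the 4.8-ENOB ADC.

```python
def enob(sndr_db):
    """Effective number of bits from a measured SNDR, via the
    ideal-quantizer relation SNDR_dB = 6.02 * N + 1.76."""
    return (sndr_db - 1.76) / 6.02

# The bootstrap TnH's 37.38 dB SNDR on its own supports about 5.92 bits.
print(round(enob(37.38), 2))
```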
318

Design of Highly Linear Sampling Switches for CMOS Track-and-Hold Circuits

Kazim, Muhammad Irfan January 2006 (has links)
This thesis discusses the non-linearities associated with a sampling switch and compares transmission gate, bootstrapping, and bulk-effect compensation architectures at the circuit level from a linearity standpoint for a 0.35 um CMOS process. All switch architectures are discussed and designed with the additional constraint of switch reliability. Results indicate that for the specified 3.3 V supply, bulk-effect compensation does not significantly improve the third-order harmonic distortion, which sets the upper limit on linearity for a differential topology. However, for low-voltage operation, bulk-effect compensation improves the third-order harmonic noticeably.
319

Next Generation Ultrashort-Pulse Retrieval Algorithm for Frequency-Resolved Optical Gating: The Inclusion of Random (Noise) and Nonrandom (Spatio-Temporal Pulse Distortions) Error

Wang, Ziyang 14 April 2005 (has links)
New pulse-retrieval software for the Frequency-Resolved Optical Gating (FROG) technique has been developed. The new software extends the capability of the original FROG algorithm in two major ways. The first is a new method to determine the uncertainty of the retrieved pulse field in the FROG technique: I proposed a simple, robust, and general technique, the bootstrap method, which places error bars on the intensity and phase of the retrieved pulse field. The bootstrap method was also extended to automatically detect ambiguities in FROG pulse retrieval. The second improvement deals with the spatiotemporal effect of the input laser beam on the measured GRENOUILLE trace. I developed a new algorithm to retrieve the pulse information, including both the temporal pulse field and the spatiotemporal parameters, from the spatiotemporally distorted GRENOUILLE trace. It is now possible to have a more complete view of an ultrashort pulse. I also proposed a simple method to remove the influence of the input laser beam's spatial profile on the GRENOUILLE trace. The new method extends the capability of the GRENOUILLE technique to measure beams with irregular spatial profiles.
320

The correlation between Heart Rate Variability and Apnea-Hypopnea Index is BMI dependent

Wen, Hsiao-Ting 25 July 2012 (has links)
Great progress has been made in sleep medicine research in recent years, and sleep medicine has evolved into a specialized medical field. Sleep apnea syndrome is one of the most commonly seen sleep disorders. It is now clear that sleep apnea has adverse effects on the heart and is a risk factor for several cardiovascular diseases. Studies have found that decreased heart rate variability (HRV) is a prognostic factor for cardiovascular disease and is also associated with a higher mortality rate. Considering the confounding between BMI and sleep apnea severity, this work investigates the correlation between heart rate variability and the AHI (apnea-hypopnea index, which characterizes the severity of sleep apnea) by dividing patients into different BMI subgroups. The study includes 1068 male subjects with complete overnight ECG recordings. The low-frequency (LF) component, the high-frequency (HF) component, and the LF/HF ratio of HRV are computed for the 10 BMI subgroups. The bootstrap method with the BCa technique for confidence interval estimation is employed to verify the linear association between the HRV measures and the severity of sleep apnea. The experimental results show that a statistically significant correlation exists between the LF/HF ratio and AHI for patient groups with BMI ≥ 28. A statistically significant correlation between LF and AHI also exists for patient groups with BMI ≥ 27. These results demonstrate that the associations between some of the HRV measures and AHI are clearly BMI dependent.
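A BCa bootstrap interval for a correlation, as used above, can be obtained directly from SciPy. This is a minimal sketch with synthetic stand-ins for the LF/HF and AHI variables; the actual clinical data and the subgrouping by BMI are of course not reproduced.

```python
import numpy as np
from scipy.stats import bootstrap

def bca_corr_ci(x, y, level=0.95, B=999, seed=0):
    """BCa bootstrap confidence interval for the Pearson correlation
    between two paired variables (e.g. an HRV measure and AHI within
    one BMI subgroup). Paired resampling keeps (x_i, y_i) together."""
    res = bootstrap((x, y),
                    lambda a, b: np.corrcoef(a, b)[0, 1],
                    paired=True, vectorized=False, method="BCa",
                    confidence_level=level, n_resamples=B,
                    random_state=seed)
    return res.confidence_interval.low, res.confidence_interval.high
```

An association is declared statistically significant, as in the abstract, when the interval excludes zero; BCa corrects the percentile interval for bias and skewness, which matters for the bounded, skewed sampling distribution of a correlation.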
