71

LIKELIHOOD-BASED INFERENTIAL METHODS FOR SOME FLEXIBLE CURE RATE MODELS

Pal, Suvra 04 1900
Recently, the Conway-Maxwell Poisson (COM-Poisson) cure rate model has been proposed, which includes as special cases some of the well-known cure rate models discussed in the literature. Data obtained from cancer clinical trials are often right censored, and the expectation maximization (EM) algorithm can be used efficiently to determine the maximum likelihood estimates (MLEs) of the model parameters from right censored data.

By assuming the lifetime distribution to be exponential, lognormal, Weibull, or gamma, the necessary steps of the EM algorithm are developed for the COM-Poisson cure rate model and some of its special cases. The inferential method is examined by means of an extensive simulation study. Model discrimination within the COM-Poisson family is carried out by the likelihood ratio test as well as by information-based criteria. Finally, the proposed method is illustrated with cutaneous melanoma data on cancer recurrence. As the lifetime distributions considered are not nested, it is not possible to carry out a formal statistical test to determine which among them provides an adequate fit to the data. For this reason, the wider class of generalized gamma distributions is considered, which contains all of the above-mentioned lifetime distributions as special cases. The steps of the EM algorithm are then developed for this general class of distributions, and a simulation study is carried out to evaluate the performance of the proposed estimation method. Model discrimination within the generalized gamma family is carried out by the likelihood ratio test and information-based criteria. Finally, for the cutaneous melanoma data, the two-way flexibility of the COM-Poisson family and the generalized gamma family is used to carry out a two-way model discrimination, selecting a parsimonious competing cause distribution along with a lifetime distribution that provides the best fit to the data. / Doctor of Philosophy (PhD)
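As a concrete illustration of the kind of EM iteration involved, below is a minimal sketch for the classical mixture (Bernoulli) cure rate model with exponential lifetimes, one of the special cases contained in the COM-Poisson family. The data-generating values, starting values, and censoring scheme are assumptions for the demonstration, not taken from the thesis.

```python
import numpy as np

def em_mixture_cure(t, delta, tol=1e-8, max_iter=500):
    """EM for the Bernoulli (mixture) cure rate model with exponential
    lifetimes and right censoring. t: observed times; delta: 1 = event,
    0 = censored. Returns (cure fraction p0, rate lam)."""
    p0, lam = 0.5, 1.0 / np.mean(t)            # crude starting values
    for _ in range(max_iter):
        # E-step: posterior probability each subject is susceptible
        surv = np.exp(-lam * t)
        w = np.where(delta == 1, 1.0,
                     (1 - p0) * surv / (p0 + (1 - p0) * surv))
        # M-step: closed-form updates for the cure fraction and rate
        p0_new = 1.0 - w.mean()
        lam_new = delta.sum() / np.sum(w * t)
        if abs(p0_new - p0) + abs(lam_new - lam) < tol:
            return p0_new, lam_new
        p0, lam = p0_new, lam_new
    return p0, lam

# Simulated check: 30% cured, exponential lifetimes with mean 2
# (rate 0.5), uniform administrative censoring on [0, 10].
rng = np.random.default_rng(1)
n = 2000
cured = rng.random(n) < 0.3
life = np.where(cured, np.inf, rng.exponential(2.0, n))
c = rng.uniform(0, 10, n)
t, delta = np.minimum(life, c), (life <= c).astype(int)
print(em_mixture_cure(t, delta))               # roughly (0.3, 0.5)
```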
72

STATISTICAL AND METHODOLOGICAL ISSUES IN EVALUATION OF INTEGRATED CARE PROGRAMS

Ye, Chenglin January 2014
Background: Integrated care programs are collaborations to improve the delivery of health services for patients with multiple conditions.

Objectives: This thesis investigated three issues in the evaluation of integrated care programs: (1) quantifying integration for integrated care programs, (2) analyzing integrated care programs with substantial non-compliance, and (3) assessing bias when evaluating integrated care programs under different non-compliance scenarios.

Methods: Project 1: We developed a method to quantify integration through service providers' perceptions and expectations. For each provider, four integration scores were calculated, and the properties of the scores were assessed.

Project 2: A randomized controlled trial (RCT) compared the Children's Treatment Network (CTN) with usual care in managing children with complex conditions. To handle non-compliance, we employed intention-to-treat (ITT), as-treated (AT), per-protocol (PP), and instrumental variable (IV) analyses. We also investigated propensity score (PS) methods to control for potential confounding.

Project 3: Based on the CTN study, we simulated trials under different non-compliance scenarios and compared the ITT, AT, PP, IV, and complier average causal effect methods in analyzing the data. The results were compared by bias of the estimate, mean square error, and 95% coverage.

Results and conclusions: Project 1: We demonstrated the proposed method of measuring integration and some of its properties. Bootstrap analyses showed that the global integration score was robust. Our method extends existing measures of integration and possesses a good degree of validity.

Project 2: The CTN intervention was not significantly different from usual care in improving patients' outcomes. The study highlighted some methodological challenges in evaluating integrated care programs in an RCT setting.

Project 3: When an intervention had a moderate or large effect, the ITT analysis was considerably biased under non-compliance, and alternative analyses could provide unbiased results. To minimize the bias, we make recommendations on the choice of analysis under different scenarios. / Doctor of Philosophy (PhD)
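To make the non-compliance issue in Projects 2 and 3 concrete, here is a minimal simulation sketch contrasting the ITT, AT, and IV (Wald) estimators under one-sided non-compliance driven by an unmeasured confounder. The data-generating process, effect size, and compliance mechanism are illustrative assumptions, not the CTN study's.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
z = rng.integers(0, 2, n)                    # randomized assignment
u = rng.normal(size=n)                       # unmeasured confounder
# Sicker patients (high u) comply less when assigned to treatment
comply = (rng.normal(size=n) + 0.8 - 0.5 * u) > 0
d = z * comply                               # one-sided non-compliance
y = 1.0 * d + 1.0 * u + rng.normal(size=n)   # true effect = 1

itt = y[z == 1].mean() - y[z == 0].mean()
at = y[d == 1].mean() - y[d == 0].mean()
wald = itt / (d[z == 1].mean() - d[z == 0].mean())  # IV / CACE estimate
print(f"ITT={itt:.3f}  AT={at:.3f}  IV={wald:.3f}")
# ITT is diluted toward 0, AT is confounded by u, and the IV estimate
# recovers the effect among compliers.
```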
73

Statistical Methods for Handling Intentional Inaccurate Responders

McQuerry, Kristen J. 01 January 2016
In self-report data, participants who provide incorrect responses are known as intentional inaccurate responders. This dissertation provides statistical analyses for addressing intentional inaccurate responses in the data. Previous work with adolescent self-report data labeled survey participants who intentionally provide inaccurate answers as mischievous responders. This phenomenon also occurs in clinical research; for example, pregnant women who smoke may report that they are nonsmokers. Our advantage is that we do not rely solely on self-report answers and can verify responses with lab values. Currently, there is no clear method for handling these intentional inaccurate responders when making statistical inferences. We propose using an EM algorithm to account for the intentional behavior while retaining all responses in the data. The performance of this model is evaluated using simulated data and real data, and the strengths and weaknesses of the EM algorithm approach are demonstrated.
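The dissertation's exact model is not given in the abstract; the following is a hedged sketch of one natural EM formulation of the smoking example, in which smokers misreport as nonsmokers with probability gamma and a lab-validated subsample of the reported nonsmokers makes both the prevalence pi and the misreport rate gamma identifiable. All counts below are hypothetical.

```python
def em_misreport(n_report_smoker, n_val_smoker, n_val_nonsmoker,
                 n_unverified, tol=1e-10, max_iter=1000):
    """EM for self-report data with intentional inaccurate responders.
    Smokers report 'nonsmoker' with probability gamma; reports of
    'smoker' are taken at face value; a lab-validated subsample of the
    reported nonsmokers pins the parameters down."""
    pi, gamma = 0.3, 0.3                     # starting values
    n = n_report_smoker + n_val_smoker + n_val_nonsmoker + n_unverified
    for _ in range(max_iter):
        # E-step: P(true smoker | reported nonsmoker, unverified)
        w = pi * gamma / (pi * gamma + (1 - pi))
        # M-step: closed-form updates from expected complete-data counts
        smokers = n_report_smoker + n_val_smoker + w * n_unverified
        pi_new = smokers / n
        gamma_new = (n_val_smoker + w * n_unverified) / smokers
        if abs(pi_new - pi) + abs(gamma_new - gamma) < tol:
            return pi_new, gamma_new
        pi, gamma = pi_new, gamma_new
    return pi, gamma

# 1000 report smoking; of 500 lab-tested reported nonsmokers,
# 100 are actually smokers; 3500 reported nonsmokers are untested.
print(em_misreport(1000, 100, 400, 3500))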
74

EMPIRICAL LIKELIHOOD AND DIFFERENTIABLE FUNCTIONALS

Shen, Zhiyuan 01 January 2016
Empirical likelihood (EL) is a recently developed nonparametric method of statistical inference. It has been shown by Owen (1988, 1990) and many others that the empirical likelihood ratio (ELR) method can be used to produce well-behaved confidence intervals or regions. Owen (1988) shows that -2 log ELR converges to a chi-square distribution with one degree of freedom subject to a linear statistical functional in terms of distribution functions. However, a generalization of Owen's result to the right censored data setting is difficult, since no explicit maximization can be obtained under a constraint in terms of distribution functions. Pan and Zhou (2002) instead study EL with right censored data using a linear statistical functional constraint in terms of cumulative hazard functions. In this dissertation, we extend Owen's (1988) and Pan and Zhou's (2002) results to non-linear but Hadamard differentiable statistical functional constraints. For this purpose, a study of functionals differentiable with respect to hazard functions is carried out. We also generalize our results to two-sample problems. Stochastic process and martingale theories are applied to prove the theorems. The confidence intervals based on the EL method are compared with other available methods. Real data analysis and simulations are used to illustrate the proposed theorems, with an application to Gini's absolute mean difference.
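For orientation, here is a minimal sketch of Owen's (1988) uncensored, linear-functional case: the empirical likelihood ratio for the mean, inverted into a 95% confidence interval using the chi-square calibration of -2 log ELR. The exponential sample and the grid bounds are assumptions for the demonstration.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def neg2_log_elr_mean(x, mu):
    """-2 log empirical likelihood ratio for the mean (Owen 1988)."""
    d = x - mu
    if d.min() >= 0 or d.max() <= 0:
        return np.inf                        # mu outside the convex hull
    # lambda must keep every weight 1 + lambda*d_i positive
    lo = (-1 + 1e-10) / d.max()
    hi = (-1 + 1e-10) / d.min()
    lam = brentq(lambda L: np.sum(d / (1 + L * d)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * d))

rng = np.random.default_rng(0)
x = rng.exponential(1.0, 200)
# 95% EL confidence interval: { mu : -2 log ELR(mu) <= chi2_1(0.95) }
cut = chi2.ppf(0.95, df=1)
grid = np.linspace(x.mean() - 0.5, x.mean() + 0.5, 2001)
inside = [m for m in grid if neg2_log_elr_mean(x, m) <= cut]
print(f"95% EL CI for the mean: ({inside[0]:.3f}, {inside[-1]:.3f})")
```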
75

EMPIRICAL PROCESSES AND ROC CURVES WITH AN APPLICATION TO LINEAR COMBINATIONS OF DIAGNOSTIC TESTS

Chirila, Costel 01 January 2008
The Receiver Operating Characteristic (ROC) curve is the plot of sensitivity vs. 1 - specificity of a quantitative diagnostic test over a wide range of cut-off points c. The empirical ROC curve is probably the most used nonparametric estimator of the ROC curve. The asymptotic properties of this estimator were first developed by Hsieh and Turnbull (1996), based on strong approximations for quantile processes. Jensen et al. (2000) provided a general method to obtain regional confidence bands for the empirical ROC curve, based on its asymptotic distribution. Since most biomarkers do not have high enough sensitivity and specificity to qualify as a good diagnostic test, a combination of biomarkers may result in a better diagnostic test than any one taken alone. Su and Liu (1993) proved that, if the panel of biomarkers is multivariate normally distributed for both the diseased and non-diseased populations, then the linear combination using Fisher's linear discriminant coefficients maximizes the area under the ROC curve of the newly formed diagnostic test, called the generalized ROC curve. In this dissertation, we derive the asymptotic properties of the generalized empirical ROC curve, the nonparametric estimator of the generalized ROC curve, using empirical process theory as in van der Vaart (1998). The pivotal result used in finding the asymptotic behavior of the proposed nonparametric estimator is the result on random functions incorporating estimators, as developed by van der Vaart (1998). Using this powerful lemma, we decompose an equivalent process into a sum of two other processes, usually called the Brownian bridge and the drift term, via Donsker classes of functions. Using a uniform convergence rate result given by Pollard (1984), we derive the limiting process of the drift term. Owing to the independence of the random samples, the asymptotic distribution of the generalized empirical ROC process is the sum of the asymptotic distributions of the decomposed processes. For completeness, we first re-derive the asymptotic properties of the empirical ROC curve in the univariate case, using the same technique. The methodology is applied to combine biomarkers in order to discriminate lung cancer patients from normal subjects.
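The following sketch illustrates Su and Liu's (1993) result under assumed multivariate normal biomarker panels with a common covariance matrix: the Fisher discriminant direction a = S^{-1}(mu1 - mu0) is applied to simulated data, and the empirical AUC of the combined score (in its Mann-Whitney form) is compared with the theoretical maximum. All means, covariances, and sample sizes are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
p, n1, n0 = 3, 300, 300
mu1, mu0 = np.array([1.0, 0.5, 0.8]), np.zeros(3)
S = np.array([[1.0, 0.3, 0.2],
              [0.3, 1.0, 0.1],
              [0.2, 0.1, 1.0]])
L = np.linalg.cholesky(S)
diseased = mu1 + rng.normal(size=(n1, p)) @ L.T
healthy = mu0 + rng.normal(size=(n0, p)) @ L.T

# Su and Liu (1993): under common covariance S, the combination
# a = S^{-1}(mu1 - mu0) maximizes the AUC of the combined score.
a = np.linalg.solve(S, mu1 - mu0)            # Fisher discriminant direction
s1, s0 = diseased @ a, healthy @ a

def empirical_auc(s1, s0):
    # Mann-Whitney form of the empirical AUC
    return (s1[:, None] > s0[None, :]).mean()

print(f"empirical AUC of combined test: {empirical_auc(s1, s0):.3f}")
# Theoretical maximum: Phi(sqrt((mu1-mu0)' S^{-1} (mu1-mu0) / 2))
print(f"theoretical AUC: {norm.cdf(np.sqrt((mu1 - mu0) @ a / 2)):.3f}")
```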
76

STATISTICAL METHODS IN MICROARRAY DATA ANALYSIS

Huang, Liping 01 January 2009
This dissertation comprises three topics: (1) regularized estimation in the AFT model with high-dimensional covariates; (2) a novel application of quantile regression for the identification of biomarkers, exemplified by equine cartilage microarray data; and (3) normalization and analysis of cDNA microarrays using linear contrasts.
77

NFL Betting Market: Using Adjusted Statistics to Test Market Efficiency and Build a Betting Model

Donnelly, James P 01 January 2013
The use of statistical analysis has been prevalent in the sports gambling industry for years. More recently, we have seen the emergence of "adjusted statistics", a more sophisticated way to examine each play and each result (further explanation below). And while adjusted statistics have become commonplace for professional and recreational bettors alike, little research has been done to justify their use. In this paper the effectiveness of this data is tested on the most heavily wagered sport in the world – the National Football League (NFL). The results are studied with two central questions in mind: Does the market account for the information provided by adjusted statistics? And, can this data be interpreted to create a profitable betting strategy? First, the Efficient Market Hypothesis is introduced and tested using these new variables. Then, a betting model is built and tested.
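A standard form of the efficiency test alluded to here regresses the realized margin of victory on the closing spread and tests the joint hypothesis (intercept, slope) = (0, 1). The sketch below runs that Wald test on simulated games; the spread and margin distributions are assumptions for illustration, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 1500                                     # hypothetical sample of games
spread = rng.normal(0, 6, n)                 # closing line (home margin)
# Under efficiency the realized margin is the spread plus noise
margin = spread + rng.normal(0, 13, n)

X = np.column_stack([np.ones(n), spread])
beta, *_ = np.linalg.lstsq(X, margin, rcond=None)
resid = margin - X @ beta
s2 = resid @ resid / (n - 2)
cov = s2 * np.linalg.inv(X.T @ X)

# Joint Wald test of (intercept, slope) = (0, 1): market efficiency
r = beta - np.array([0.0, 1.0])
F = r @ np.linalg.solve(cov, r) / 2          # F statistic, 2 numerator df
pval = 1 - stats.f.cdf(F, 2, n - 2)
print(f"intercept={beta[0]:.3f} slope={beta[1]:.3f} F={F:.2f} p={pval:.3f}")
```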
78

Reliability applied to maintenance

Sherwin, David J. January 1979
The thesis covers studies conducted during 1976-79 under a Science Research Council contract to examine the uses of reliability information in maintenance decision-making in the process industries. After a discussion of the ideal data system, four practical studies of process plants are described, involving both Pareto and distribution analysis. In two of these studies the maintenance policy was changed and the effect on failure modes and frequency was observed. Hyper-exponentially distributed failure intervals were found to be common and, after observation of maintenance work practices and development of supporting theory, were explained as being due to poor workmanship and parts. The fallacy that constant failure rate necessarily implies the optimality of maintenance only at failure is discussed. Two models for the optimisation of inspection intervals are developed; both assume items give detectable warning of impending failure. The first is based upon a constant risk of failure between successive inspections and a Weibull base failure distribution. Results show that an inspection/on-condition maintenance regime can be cost-effective even when the failure rate is falling, and may be better than periodic renewals in an increasing failure rate situation. The second model is first-order Markov. Transition rate matrices are developed and solved to compare continuous monitoring with inspection/on-condition maintenance on a cost basis. The models incorporate planning delay in starting maintenance after impending failure is detected. The relationships between plant output and maintenance policy, as affected by the presence of redundancy and/or storage between stages, are examined, mainly through the literature but with some original theoretical proposals. It is concluded that reliability techniques have many applications in the improvement of plant maintenance policy. Techniques abound, but few firms are willing to take the step of faith to set up, even temporarily, the data-collection facilities required to apply them. There are over 350 references, many of which are reviewed in the text, divided into chapter-related sections. Appendices include a review of reliability engineering theory based on the author's draft for BS 5760(2), a discussion of the applicability of the 'bath-tub curve' to maintained systems, and the theory connecting hyper-exponentially distributed failures with poor maintenance practices.
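As a sketch of the second, Markov-model approach, the following sets up transition rate matrices for a three-state item (good, impending failure, failed) and compares steady-state availability with and without continuous monitoring, treating only the failed state as downtime. The rates and the structure of the chain are illustrative assumptions, not the thesis's fitted models.

```python
import numpy as np

# Three states: 0 = good, 1 = impending failure (warning), 2 = failed.
a, b = 0.10, 0.25        # degradation rates per day (assumed)
mu_pm, mu_cm = 1.0, 0.20 # repair rates: on-condition PM vs post-failure CM

def steady_state(Q):
    """Stationary distribution of a CTMC: solve pi Q = 0, sum(pi) = 1."""
    A = np.vstack([Q.T, np.ones(Q.shape[1])])
    rhs = np.append(np.zeros(Q.shape[1]), 1.0)
    pi, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return pi

# Continuous monitoring: the warning is detected at once, so
# on-condition maintenance (rate mu_pm) races the failure rate b.
Q_mon = np.array([[-a,           a,           0.0],
                  [mu_pm, -(mu_pm + b),       b  ],
                  [mu_cm,        0.0,      -mu_cm]])
print("availability, monitored:  ", 1 - steady_state(Q_mon)[2])

# No monitoring: the warning goes undetected and always ends in failure.
Q_no = np.array([[-a,    a,    0.0],
                 [0.0,  -b,    b  ],
                 [mu_cm, 0.0, -mu_cm]])
print("availability, unmonitored:", 1 - steady_state(Q_no)[2])
```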
79

Application of the Fisher Dimer Model to DNA Condensation

Baker, John C, III 01 January 2017
This paper considers, from a statistical mechanics standpoint, the occupation of the edge of a single DNA helix by simple polymers. Using Fisher's exact closed-form solution for dimers on a two-dimensional lattice, a one-dimensional lattice is constructed mathematically that is occupied by dimers, monomers, and holes. The free energy, entropy, average occupation, and total charge on the lattice are found through the usual statistical methods. The results demonstrate the charge inversion required for a DNA helix to undergo condensation.
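A simplified piece of such a calculation is the one-dimensional monomer-dimer-hole lattice, whose partition function obeys the recursion Z_n = (z_hole + z_mono) Z_{n-1} + z_dimer Z_{n-2}. The sketch below evaluates it stably in log space and extracts the free energy per site and the mean dimer occupation; the activities are assumed values, not those of the DNA application.

```python
import numpy as np

def log_Z(n, z_hole, z_mono, z_dimer):
    """Log partition function of a 1-D lattice whose sites hold holes,
    monomers, or dimers (two adjacent sites), via the recursion
    Z_n = (z_hole + z_mono) Z_{n-1} + z_dimer Z_{n-2}."""
    c = np.log(z_hole + z_mono)
    a, b = 0.0, c                            # log Z_0, log Z_1
    for _ in range(n - 1):
        # log-sum-exp keeps the recursion stable for large n
        m = max(b + c, a + np.log(z_dimer))
        a, b = b, m + np.log(np.exp(b + c - m)
                             + np.exp(a + np.log(z_dimer) - m))
    return b

n, zh, zm, zd = 1000, 1.0, 0.5, 2.0          # assumed activities
lz = log_Z(n, zh, zm, zd)
print("free energy per site (units of kT):", -lz / n)

# Mean dimer count from <N_d> = z_d d(log Z)/d z_d (numerical derivative)
eps = 1e-6
nd = zd * (log_Z(n, zh, zm, zd * (1 + eps)) - lz) / (zd * eps)
print("average dimers per site:", nd / n)
```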
80

Dresdner Beiträge zu Quantitativen Verfahren (Dresden Contributions to Quantitative Methods)

30 March 2017
No description available.
