1 
Nonparametric predictive inference with right-censored data. Yan, Ke-Jian, January 2002 (has links)
This thesis considers nonparametric predictive inference for lifetime data that include right-censored observations. The assumption A(n), proposed by Hill in 1968, provides a partially specified predictive distribution for a future observation given past observations, but it does not allow right-censored data among the observations. Berliner and Hill in 1988 presented a related nonparametric method for dealing with right-censored data based on A(n), but they replaced 'exact censoring information' (ECI) by 'partial censoring information' (PCI), enabling inference on the basis of A(n). We address whether ECI can be used via a generalization of A(n). We solve this problem by presenting a new assumption, 'right-censoring A(n)' (rc-A(n)), which generalizes A(n). The assumption rc-A(n) presents a partially specified predictive distribution for a future observation, given past observations including right-censored data, and allows the use of ECI. Based on rc-A(n), we derive nonparametric predictive inferences (NPI) for a future observation, which can also be applied to a variety of predictive problems formulated in terms of the future observation. As applications of NPI, we discuss grouped data and the comparison of two groups of lifetime data, problems occurring frequently in reliability and survival analysis.
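Hill's A(n) assumption underlying this work has a simple operational form: given n exchangeable observed lifetimes, the next observation falls into each of the n + 1 intervals formed by the ordered data with probability 1/(n + 1). A minimal sketch of that idea (function names are ours, and this does not include the thesis's rc-A(n) extension for censored data):

```python
# Sketch of Hill's A(n) assumption (1968): given n observed lifetimes, the
# next observation falls in each of the n+1 intervals formed by the ordered
# data with probability 1/(n+1). Names here are illustrative only.

def a_n_intervals(observations):
    """Return the n+1 intervals and the A(n) probability mass for each."""
    xs = sorted(observations)
    endpoints = [0.0] + xs + [float("inf")]  # lifetimes are non-negative
    p = 1.0 / (len(xs) + 1)
    return [((endpoints[i], endpoints[i + 1]), p) for i in range(len(xs) + 1)]

def predictive_survival(observations, t):
    """Lower bound on P(next observation > t): mass of intervals above t."""
    xs = sorted(observations)
    return sum(1.0 for x in xs if x > t) / (len(xs) + 1)
```

With three observations, for example, the next lifetime is assigned probability 1/4 to each of the four intervals between the ordered data points.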

2 
Tests for Weibull-based proportional hazards frailty models. Sarker, Md Shah Jalal, January 2002 (has links)
No description available.

3 
Copula Models for Multi-type Life History Processes. Diao, Liqun, January 2013 (has links)
This thesis considers statistical issues in the analysis of chronic disease data that involve modeling dependencies between life history processes using copula functions.
Many disease processes feature recurrent events arising from an underlying chronic condition; these are often modeled as point processes. Often, however, a random variable is realized upon the occurrence of each event; this variable is called a mark of the point process, and together such processes are called marked point processes. A novel copula model for marked point processes is described here, in which copula functions govern the association between marks and event times. Specifically, a copula function links each mark with the next event time following its realization, reflecting the pattern in the data wherein larger marks are often followed by longer times to the next event.
The extent of organ damage in an individual can often be characterized by ordered states, and interest frequently lies in modeling the rates at which individuals progress through these states. Risk factors can be studied and the effect of therapeutic interventions can be assessed based on relevant multistate models. When chronic diseases affect multiple organ systems, joint modeling of progression in several organ systems is also important.
In contrast to common intensity-based or frailty-based approaches to modeling, this thesis considers a copula-based framework for modeling and analysis. Through decomposition of the density and the use of conditional independence assumptions, an appealing joint model is obtained by assuming that the joint survival function of absorption transition times is governed by a multivariate copula function. Different approaches to estimation and inference, including composite likelihood and two-stage estimation methods, are discussed and compared. Special attention is paid to the case of interval-censored data arising from intermittent assessment.
Attention is also directed to the use of copula models in more general scenarios, with a focus on semiparametric two-stage estimation procedures. In this approach, nonparametric or semiparametric estimates of the marginal survivor functions are obtained in the first stage, and estimates of the association parameters are obtained in the second stage. Bivariate failure time models are considered for data under right-censoring and current status observation schemes, as well as right-censored multistate models. A new expression for the asymptotic variance of the second-stage estimator of the association parameter, along with a way of estimating it for finite samples, is presented under these models and observation schemes.
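The two-stage idea described above can be sketched in a toy form: estimate each marginal survivor function nonparametrically in stage one, then plug the resulting pseudo-observations into a copula likelihood and maximize over the association parameter in stage two. The sketch below uses a Clayton copula and grid search under fully observed (uncensored) data; it is our own illustration, not the thesis's actual procedure, and the function names are hypothetical:

```python
import numpy as np

# Toy two-stage copula estimation sketch (not the thesis's exact procedure):
# stage 1 estimates marginal survivor functions from ranks; stage 2 maximizes
# a Clayton pseudo-log-likelihood in the association parameter theta > 0.

def ecdf_survivor(x):
    """Rank-based survivor probabilities, rescaled away from 0 and 1."""
    x = np.asarray(x)
    ranks = np.argsort(np.argsort(x)) + 1
    return 1.0 - ranks / (len(x) + 1.0)

def clayton_log_density(u, v, theta):
    """Log density of the Clayton copula, theta > 0."""
    return (np.log(1.0 + theta)
            - (theta + 1.0) * (np.log(u) + np.log(v))
            - (2.0 + 1.0 / theta) * np.log(u ** -theta + v ** -theta - 1.0))

def two_stage_theta(t1, t2, grid=np.linspace(0.05, 8.0, 400)):
    """Stage 2: grid-profile the pseudo-log-likelihood over theta."""
    u, v = ecdf_survivor(t1), ecdf_survivor(t2)
    ll = [clayton_log_density(u, v, th).sum() for th in grid]
    return grid[int(np.argmax(ll))]
```

Strongly dependent pairs of lifetimes should yield a larger estimated theta than independent pairs; handling right-censored or current status margins, and the asymptotic variance of the stage-two estimator, are exactly the harder problems the thesis addresses.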

5 
Statistical analysis of lifetime data using new modified Weibull distributions. Al-Malki, Saad Jamaan, January 2014 (has links)
The Weibull distribution is popular and widely used in reliability and lifetime data analysis. Since 1958 the Weibull distribution has been modified by many researchers to allow for non-monotonic hazard functions. Many modifications of the Weibull distribution have achieved this purpose, but at the cost of more parameters, more complicated survival and hazard functions, and harder estimation problems. This thesis provides an extensive review of some discrete and continuous versions of the modifications of the Weibull distribution, which could serve as an important reference and encourage further modifications. Four different modifications of the Weibull distribution are proposed to address some of the above problems using different techniques. The first model, with five parameters, is constructed by considering a two-component serial system, with one component following a Weibull distribution and the other a modified Weibull distribution. A new method is proposed to reduce the number of parameters of the new modified Weibull distribution from five to three, simplifying the distribution and easing the estimation problems. The reduced version retains the desirable properties of the original distribution despite having two fewer parameters, and can serve as an alternative to all modifications of the Weibull distribution with bathtub-shaped hazard rate functions. To deal with unimodal hazard rates, a third model with four parameters, named the exponentiated reduced modified Weibull distribution, is introduced. This model is flexible, has a nice physical interpretation and can capture monotonically increasing, unimodal and bathtub-shaped hazard rates. It is a generalization of the reduced modified Weibull distribution.
The proposed distribution gives the best fit compared to other modifications of the Weibull distribution, including those with similar properties. A three-parameter discrete distribution is introduced based on the reduced distribution; it is one of only three discrete distributions allowing for bathtub-shaped hazard rate functions. Four real data sets are applied to this distribution, and the new distribution is shown to outperform at least three other models, including ones allowing for bathtub-shaped hazard rate functions. The new models are flexible and can model different kinds of real data sets better than other modified versions of the Weibull distribution, including those with the same number of parameters. The mathematical properties and statistical inferences of the new models are studied. Based on a simulation study, the performance of the MLEs of each model is assessed with respect to sample size n. We find no evidence that the generalized modified Weibull distribution provides a better fit than the exponentiated Weibull distribution for data sets exhibiting the modified unimodal hazard function.
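The two-component serial-system construction mentioned above can be illustrated with the standard additive Weibull idea: a series system of two Weibull components has a hazard equal to the sum of the component hazards, and choosing one shape parameter below 1 and the other above 1 yields a bathtub shape. A minimal sketch in our own notation (not the thesis's five-parameter model):

```python
# Series-system (additive) Weibull hazard sketch. For a two-component series
# system the system hazard is the sum of the component hazards:
#   h(t) = (b1/s1)*(t/s1)**(b1-1) + (b2/s2)*(t/s2)**(b2-1)
# With b1 < 1 < b2 this decreases early and increases late: a bathtub shape.
# Parameter names are illustrative, not the thesis's notation.

def weibull_hazard(t, shape, scale):
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def series_hazard(t, b1=0.5, s1=1.0, b2=3.0, s2=10.0):
    return weibull_hazard(t, b1, s1) + weibull_hazard(t, b2, s2)
```

With the default parameters the hazard falls over small t (infant mortality, dominated by the shape-0.5 component) and rises for large t (wear-out, dominated by the shape-3 component).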

6 
Analysis of Reliability Experiments with Random Blocks and Subsampling. Kensler, Jennifer Lin Karam, 09 August 2012
Reliability experiments provide important information regarding the life of a product, including how various factors may affect product life. Current analyses of reliability data usually assume a completely randomized design. However, reliability experiments frequently contain subsampling, which is a restriction on randomization: a typical experiment involves applying treatments to test stands, with several items placed on each stand. In addition, raw materials used in experiments are often produced in batches; when one batch is not large enough to supply the entire experiment, more than one batch must be used, and these batches lead to a design involving blocks. This dissertation proposes two methods for analyzing reliability experiments with random blocks and subsampling. The first is a two-stage method that can be implemented in software used by most practitioners but has some limitations; a more rigorous nonlinear mixed model method is therefore also proposed. / Ph. D.

7 
Cumulative Sum Control Charts for Censored Reliability Data. Olteanu, Denisa Anca, 28 April 2010
Companies routinely perform life tests for their products. Typically, these tests involve running a set of products until the units fail. Most often, the data are censored according to different censoring schemes, depending on the particulars of the test. On occasion, tests are stopped at a predetermined time and the units yet to fail are suspended. In other instances, the data are collected through periodic inspection and only upper and lower bounds on the lifetimes are recorded. Reliability professionals use a number of non-normal distributions to model the resulting lifetime data, the Weibull distribution being the most frequently used. Monitoring the quality and reliability characteristics of such processes requires accounting for the challenges imposed by the nature of the data. We propose likelihood-ratio-based cumulative sum (CUSUM) control charts for censored lifetime data with non-normal distributions. We illustrate the development and implementation of the charts, and we evaluate their properties through simulation studies. We address the problem of interval censoring, and we construct a CUSUM chart for censored ordered categorical data, illustrated by a case study at Becton Dickinson (BD). We also address the problem of monitoring both parameters of the Weibull distribution for processes with right-censored data. / Ph. D.
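A likelihood-ratio CUSUM for censored Weibull data can be sketched in a generic textbook form (not the dissertation's exact charts): failures contribute the log ratio of out-of-control to in-control densities, right-censored times contribute the log ratio of survivor functions, and the statistic accumulates with a reset at zero. All names and parameter choices below are our own illustration:

```python
import math

# Generic likelihood-ratio CUSUM sketch for right-censored Weibull lifetimes
# (a textbook form, not the dissertation's exact chart). In-control scale
# eta0 vs out-of-control eta1; shape beta assumed known. Failures contribute
# log f1/f0, censored times log S1/S0; the chart is C_i = max(0, C_{i-1}+W_i).

def weibull_logpdf(t, beta, eta):
    return (math.log(beta / eta) + (beta - 1.0) * math.log(t / eta)
            - (t / eta) ** beta)

def weibull_logsf(t, beta, eta):
    return -(t / eta) ** beta  # log survivor function

def cusum_path(times, censored, beta, eta0, eta1):
    """Return the CUSUM path; censored[i] is True for a censored time."""
    c, path = 0.0, []
    for t, cens in zip(times, censored):
        if cens:
            w = weibull_logsf(t, beta, eta1) - weibull_logsf(t, beta, eta0)
        else:
            w = weibull_logpdf(t, beta, eta1) - weibull_logpdf(t, beta, eta0)
        c = max(0.0, c + w)
        path.append(c)
    return path
```

A run of unusually short failure times pushes the path upward toward a signal, while long censored times pull it back toward zero, which is how the chart accounts for the censoring.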

8 
Statistical Methods for Reliability Data from Designed Experiments. Freeman, Laura J., 07 May 2010
Product reliability is an important characteristic for manufacturers, engineers and consumers. Industrial statisticians have been planning experiments for years to improve product quality and reliability. However, experts in the field of reliability rarely have expertise in design of experiments (DOE) and the implications that experimental protocol has for data analysis; likewise, statisticians who focus on DOE rarely work with reliability data. As a result, analysis methods for lifetime data from experimental designs more complex than a completely randomized design are extremely limited. This dissertation provides two new analysis methods for reliability data from life tests, focusing on data from a subsampling experimental design. The new analysis methods are illustrated on a popular reliability data set which contains subsampling. Monte Carlo simulation studies evaluate the capabilities of the new modeling methods and highlight the principles of experimental design in a reliability context. The dissertation provides multiple methods of statistical inference for the new analysis methods. Finally, implications for the reliability field are discussed, especially future applications of the new analysis methods. / Ph. D.

9 
Bridging the Gap: Selected Problems in Model Specification, Estimation, and Optimal Design from Reliability and Lifetime Data Analysis. King, Caleb B., 13 April 2015
Understanding the lifetime behavior of their products is crucial to the success of any company in the manufacturing and engineering industries. Statistical methods for lifetime data are a key component to achieving this level of understanding. Sometimes a statistical procedure must be updated to be adequate for modeling specific data as is discussed in Chapter 2. However, there are cases in which the methods used in industrial standards are themselves inadequate. This is distressing as more appropriate statistical methods are available but remain unused. The research in Chapter 4 deals with such a situation. The research in Chapter 3 serves as a combination of both scenarios and represents how both statisticians and engineers from the industry can join together to yield beautiful results.
After introducing basic concepts and notation in Chapter 1, Chapter 2 focuses on lifetime prediction for a product consisting of multiple components. During the production period, some components may be upgraded or replaced, resulting in a new "generation" of component. Incorporating this information into a competing risks model can greatly improve the accuracy of lifetime prediction. A generalized competing risks model is proposed and simulation is used to assess its performance.
In Chapter 3, optimal and compromise test plans are proposed for constant amplitude fatigue testing. These test plans are based on a nonlinear physical model from the fatigue literature that is able to better capture the nonlinear behavior of fatigue life and account for effects from the testing environment. Sensitivity to the design parameters and modeling assumptions are investigated and suggestions for planning strategies are proposed.
Chapter 4 considers the analysis of ADDT data for the purpose of estimating a thermal index. The current industry standards use a two-step procedure involving least squares regression in each step, whereas the methodology preferred in the statistical literature is the maximum likelihood procedure. A comparison of the procedures is performed, with two published datasets used as motivating examples. The maximum likelihood procedure is presented as a more viable alternative to the two-step procedure because of its ability to quantify uncertainty in inference and its modeling flexibility. / Ph. D.

10 
Likelihood Inference for Left Truncated and Right Censored Lifetime Data. Mitra, Debanjan, 04 1900 (has links)
Left truncation arises because, in many situations, failure of a unit is observed only if it fails after a certain period. Often the units under study cannot be followed until all of them fail, and the experimenter may have to stop at a certain time when some units are still working; this introduces right censoring into the data. Some commonly used lifetime distributions are the lognormal, Weibull and gamma, all of which are special cases of the flexible generalized gamma family. Likelihood inference via the Expectation-Maximization (EM) algorithm is used to estimate the model parameters of the lognormal, Weibull, gamma and generalized gamma distributions based on left truncated and right censored data. The asymptotic variance-covariance matrices of the maximum likelihood estimates (MLEs) are derived using the missing information principle. Using the asymptotic variances and the asymptotic normality of the MLEs, asymptotic confidence intervals for the parameters are constructed. For comparison purposes, the Newton-Raphson (NR) method is also used for parameter estimation, and asymptotic confidence intervals corresponding to the NR method and the parametric bootstrap are also obtained. Through Monte Carlo simulations, the performance of all these methods of inference is studied. With regard to prediction analysis, the probability that a right censored unit will still be working at a future year is estimated, and an asymptotic confidence interval for this probability is derived by the delta method. All the methods of inference developed here are illustrated with numerical examples. / Doctor of Philosophy (PhD)
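The likelihood underlying this kind of analysis has a standard form: a unit left-truncated at tau contributes f(t)/S(tau) if it fails at t, and S(c)/S(tau) if it is right-censored at c. A minimal sketch for the lognormal case (a generic construction in our own notation; the thesis works with the EM algorithm and also covers the Weibull, gamma and generalized gamma cases):

```python
import math

# Sketch of the left-truncated, right-censored (LTRC) log-likelihood for a
# lognormal lifetime model. A unit left-truncated at tau contributes
# f(t)/S(tau) if it fails at t, and S(c)/S(tau) if right-censored at c.
# Function names are illustrative, not the thesis's notation.

def _phi(z):  # standard normal pdf
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def _Phi(z):  # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def lognormal_logpdf(t, mu, sigma):
    z = (math.log(t) - mu) / sigma
    return math.log(_phi(z) / (sigma * t))

def lognormal_logsf(t, mu, sigma):
    z = (math.log(t) - mu) / sigma
    return math.log(1.0 - _Phi(z))

def ltrc_loglik(data, mu, sigma):
    """data: iterable of (time, truncation_time, is_censored) triples."""
    ll = 0.0
    for t, tau, censored in data:
        contrib = (lognormal_logsf(t, mu, sigma) if censored
                   else lognormal_logpdf(t, mu, sigma))
        ll += contrib - lognormal_logsf(tau, mu, sigma)  # truncation adjustment
    return ll
```

Maximizing this function (by EM in the thesis, or by Newton-Raphson as the comparison method) over mu and sigma gives the MLEs on which the confidence intervals and prediction analysis are built.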
