11

A Study of the Calibration Regression Model with Censored Lifetime Medical Cost

Lu, Min 03 August 2006 (has links)
Medical cost has received increasing interest recently in biostatistics and public health. Statistical analysis and inference of lifetime medical cost are challenging because survival times are censored for some study subjects, whose subsequent costs are therefore unknown. Huang (2002) proposed the calibration regression model, a semiparametric regression tool for studying medical cost in relation to covariates. In this thesis, an inference procedure based on the empirical likelihood ratio method is investigated. Unadjusted and adjusted empirical likelihood confidence regions are constructed for the regression parameters. We compare the proposed empirical likelihood methods with the normal approximation based method. Simulation results show that the proposed empirical likelihood ratio method outperforms the normal approximation based method in terms of coverage probability. In particular, the adjusted empirical likelihood performs best and overcomes the under-coverage problem.
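The empirical likelihood ratio statistic underlying the confidence regions above can be illustrated in its simplest setting, the mean of uncensored data; extending it to the regression parameters of censored-cost models is the subject of the thesis itself. This is a sketch only, not the thesis's procedure: the profile log-likelihood ratio is obtained by solving the Lagrange-multiplier equation numerically.

```python
import numpy as np
from scipy.optimize import brentq

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the mean at mu
    (asymptotically chi-square with 1 degree of freedom)."""
    d = np.asarray(x, dtype=float) - mu
    if d.min() >= 0 or d.max() <= 0:
        return np.inf  # mu lies outside the convex hull of the data
    eps = 1e-10
    # lambda must keep 1 + lambda * d_i > 0 for all i
    lo = (-1.0 + eps) / d.max()
    hi = (-1.0 + eps) / d.min()
    lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo, hi)
    # optimal weights are w_i = 1 / (n * (1 + lam * d_i))
    return 2.0 * np.sum(np.log1p(lam * d))
```

A confidence region is then the set of mu with `el_log_ratio(x, mu)` below the chi-square quantile; the "adjusted" versions in the thesis rescale this statistic to correct under-coverage.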
12

Jackknife Empirical Likelihood for the Accelerated Failure Time Model with Censored Data

Bouadoumou, Maxime K 15 July 2011 (has links)
Kendall and Gehan estimating functions are used to estimate the regression parameter in the accelerated failure time (AFT) model with censored observations. The AFT model is a preferred survival analysis method because it models the association between the covariates and the survival time directly. The jackknife empirical likelihood method is used because it overcomes the computational difficulty of ordinary empirical likelihood by circumventing the construction of nonlinear constraints: it turns the statistic of interest into a sample mean based on jackknife pseudo-values. A U-statistic approach is used to construct confidence intervals for the regression parameter. We conduct a simulation study comparing the Wald-type procedure, empirical likelihood, and jackknife empirical likelihood in terms of coverage probability and average length of the confidence intervals. The jackknife empirical likelihood method performs best and overcomes the under-coverage problem of the Wald-type method. A real dataset is also used to illustrate the proposed methods.
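The jackknife pseudo-value construction at the heart of the method is simple to state: the statistic is replaced by the sample mean of its pseudo-values, to which standard empirical likelihood then applies. A minimal sketch for a generic scalar statistic (not the Gehan or Kendall estimating functions of the thesis, which require the U-statistic machinery):

```python
import numpy as np

def jackknife_pseudovalues(x, stat):
    # V_i = n * T(x) - (n - 1) * T(x with observation i deleted).
    # Jackknife empirical likelihood treats the V_i as approximately
    # i.i.d. and applies empirical likelihood to their mean.
    n = len(x)
    full = stat(x)
    loo = np.array([stat(np.delete(x, i)) for i in range(n)])
    return n * full - (n - 1) * loo
```

A reassuring sanity check: for the sample mean, the pseudo-values reproduce the observations themselves, so the pseudo-value mean equals the original statistic.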
13

Mean preservation in censored regression using preliminary nonparametric smoothing

Heuchenne, Cédric 18 August 2005 (has links)
In this thesis, we consider the problem of estimating the regression function in location-scale regression models. This model assumes that the random vector (X,Y) satisfies Y = m(X) + s(X)e, where m(.) is an unknown location function (e.g. conditional mean, median, truncated mean, ...), s(.) is an unknown scale function, and e is independent of X. The response Y is subject to random right censoring, and the covariate X is completely observed. In the first part of the thesis, we assume that m(x) = E(Y|X=x) follows a polynomial model. A new estimation procedure for the unknown regression parameters is proposed, which extends the classical least squares procedure to censored data. The proposed method is inspired by that of Buckley and James (1979) but, unlike the latter, is non-iterative thanks to a preliminary nonparametric estimation step. The asymptotic normality of the estimators is established. Simulations carried out for both methods show that the proposed estimators usually have smaller variance and smaller mean squared error than the Buckley-James estimators. In the second part, we suppose that m(.) = E(Y|.) belongs to some parametric class of regression functions. A new estimation procedure for the true, unknown vector of parameters is proposed, which extends the classical least squares procedure for nonlinear regression to the case where the response is subject to censoring. The proposed technique uses new 'synthetic' data points constructed from a nonparametric relation between Y and X. The consistency and asymptotic normality of the proposed estimator are established, and the estimator is compared via simulations with the estimator proposed by Stute (1999). In the third part, we study the nonparametric estimation of the regression function m(.) itself.
It is well known that the completely nonparametric estimator of the conditional distribution F(.|x) of Y given X=x suffers from inconsistency problems in the right tail (Beran, 1981); hence, the location function m(x) cannot be estimated consistently in a completely nonparametric way whenever m(x) involves the right tail of F(.|x) (as, e.g., for the conditional mean). We propose two alternative estimators of m(x) that do not share these inconsistency problems. The idea is to make use of the assumed location-scale model in order to improve the estimation of F(.|x), especially in the right tail. We obtain the asymptotic properties of the two proposed estimators of m(x). Simulations show that the proposed estimators outperform the completely nonparametric estimator in many cases.
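The nonparametric ingredient common to Buckley-James-type and synthetic-data approaches is an estimate of the distribution of the censored response. A hedged sketch of the Kaplan-Meier jump masses that underlie, for instance, Stute-type weighted least squares (assuming distinct observation times and censoring independent of the covariate; tied times are broken by the stable sort):

```python
import numpy as np

def km_weights(y, delta):
    # Kaplan-Meier jump mass attached to each ordered observation.
    # Censored points (delta = 0) receive zero mass; their mass is
    # redistributed to the uncensored points to their right.
    order = np.argsort(y, kind="stable")
    y = np.asarray(y, dtype=float)[order]
    delta = np.asarray(delta)[order]
    n = len(y)
    w = np.zeros(n)
    surv = 1.0  # current Kaplan-Meier survival value S(t-)
    for i in range(n):
        jump = surv * delta[i] / (n - i)  # S(t-) * d_i / (number at risk)
        w[i] = jump
        surv -= jump
    return y, w
```

With no censoring every weight is 1/n and the weighted sum of responses reduces to the ordinary sample mean, which is the sense in which such weights "extend" least squares to censored data.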
14

LIKELIHOOD INFERENCE FOR LEFT TRUNCATED AND RIGHT CENSORED LIFETIME DATA

Mitra, Debanjan 04 1900 (has links)
Left truncation arises because, in many situations, the failure of a unit is observed only if it fails after a certain period. Moreover, the units under study may not be followed until all of them fail: the experimenter may have to stop at a certain time when some of the units are still working, which introduces right censoring into the data. Some commonly used lifetime distributions are the lognormal, Weibull and gamma, all of which are special cases of the flexible generalized gamma family. Likelihood inference via the expectation-maximization (EM) algorithm is used to estimate the model parameters of the lognormal, Weibull, gamma and generalized gamma distributions based on left-truncated and right-censored data. The asymptotic variance-covariance matrices of the maximum likelihood estimates (MLEs) are derived using the missing information principle. Using the asymptotic variances and the asymptotic normality of the MLEs, asymptotic confidence intervals for the parameters are constructed. For comparison purposes, the Newton-Raphson (NR) method is also used for parameter estimation, and asymptotic confidence intervals corresponding to the NR method and to a parametric bootstrap are also obtained. Through Monte Carlo simulations, the performance of all these methods of inference is studied. With regard to prediction analysis, the probability that a right-censored unit will still be working at a future year is estimated, and an asymptotic confidence interval for this probability is then derived by the delta method. All the methods of inference developed here are illustrated with numerical examples. / Doctor of Philosophy (PhD)
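The thesis develops the EM algorithm for the lognormal, Weibull, gamma and generalized gamma fits; the mechanics can be illustrated with a deliberately simpler case, assumed here purely for exposition: an exponential lifetime with right censoring, where both EM steps are closed-form (the E-step uses memorylessness, E[T | T > c] = c + 1/lambda).

```python
import numpy as np

def em_exponential(y, delta, tol=1e-12, max_iter=10000):
    # y: observed time (event time or censoring time);
    # delta: 1 = event observed, 0 = right-censored.
    y = np.asarray(y, dtype=float)
    lam = 1.0 / y.mean()  # crude starting value
    for _ in range(max_iter):
        # E-step: by memorylessness, E[T | T > c] = c + 1/lam,
        # so censored lifetimes are imputed as y_i + 1/lam
        filled = np.where(delta == 1, y, y + 1.0 / lam)
        new_lam = len(y) / filled.sum()  # M-step: complete-data MLE
        if abs(new_lam - lam) < tol:
            return new_lam
        lam = new_lam
    return lam
```

For this toy model the EM fixed point coincides with the direct censored-data MLE, number of events divided by total time on test, which gives a convenient check of the iteration.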
15

CURE RATE AND DESTRUCTIVE CURE RATE MODELS UNDER PROPORTIONAL ODDS LIFETIME DISTRIBUTIONS

FENG, TIAN January 2019 (has links)
Cure rate models, introduced by Boag (1949), are commonly used for modelling lifetime data involving long-term survivors. Applications of cure rate models can be seen in biomedical science, industrial reliability, finance, manufacturing, demography and criminology. In this thesis, cure rate models are discussed under a competing-cause scenario, with the assumption of proportional odds (PO) lifetime distributions for the susceptibles, and statistical inferential methods are then developed based on right-censored data. In Chapter 2, a flexible cure rate model is discussed by assuming that the number of competing causes for the event of interest follows the Conway-Maxwell (COM) Poisson distribution, with the lifetimes of the non-cured or susceptible individuals described by a PO model. This provides a natural extension of the work of Gu et al. (2011), who considered a geometric number of competing causes. Under right censoring, maximum likelihood estimates (MLEs) are obtained using the expectation-maximization (EM) algorithm. An extensive Monte Carlo simulation study is carried out for various scenarios, and model discrimination among some well-known cure models, such as the geometric, Poisson and Bernoulli, is also examined. The goodness-of-fit and diagnostics of the model are also discussed. A cutaneous melanoma dataset is used to illustrate the models as well as the inferential methods.
Next, in Chapter 3, the destructive cure rate models, introduced by Rodrigues et al. (2011), are discussed under the PO assumption. Here, the initial number of competing causes is modelled by a weighted Poisson distribution, with special focus on the exponentially weighted Poisson, length-biased Poisson and negative binomial distributions. Then, a damage distribution is introduced for the number of initial causes which do not get destroyed. An EM-type algorithm for computing the MLEs is developed. An extensive simulation study is carried out for various scenarios, and model discrimination among the three weighted Poisson distributions is also examined. A cutaneous melanoma dataset is again used to illustrate the models as well as the inferential methods.
In Chapter 4, frailty cure rate models are discussed under a gamma frailty, wherein the initial number of competing causes is described by a Conway-Maxwell (COM) Poisson distribution and the lifetimes of non-cured individuals are described by a PO model. The detailed steps of the EM algorithm are then developed for this model, and an extensive simulation study is carried out to evaluate the performance of the proposed model and the estimation method. A cutaneous melanoma dataset as well as simulated data are used for illustrative purposes.
Finally, Chapter 5 summarizes the work carried out in the thesis and suggests some problems of further research interest. / Thesis / Doctor of Philosophy (PhD)
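The competing-cause construction implies a simple form for the population survival function. When the number of causes is Poisson with mean theta (the promotion-time special case of the COM-Poisson family used in the thesis) and the cause lifetimes have CDF F, then S_pop(t) = exp(-theta * F(t)) and the cure fraction is exp(-theta). A small sketch with an exponential F assumed for illustration:

```python
import math

def pop_survival(t, theta, rate=1.0):
    # Promotion-time (Poisson competing-causes) cure model:
    #   S_pop(t) = exp(-theta * F(t)),  cure fraction = exp(-theta),
    # where F is taken here as an exponential(rate) CDF for illustration.
    F = 1.0 - math.exp(-rate * t)
    return math.exp(-theta * F)
```

The survival curve starts at 1, decreases, and plateaus at exp(-theta) rather than at 0, which is exactly the long-term-survivor behaviour cure rate models are built to capture.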
16

Confiabilidade em sistemas coerentes: um modelo bayesiano Weibull / Reliability in coherent systems: a Bayesian Weibull model

Bhering, Felipe Lunardi 28 June 2013 (has links)
The main purpose of this work is to introduce a general hierarchical Bayesian Weibull model for censored data that estimates the reliability function of each component in coherent systems. More robust estimation procedures are introduced that avoid plugging mean estimates into the reliability functions (the plug-in estimator). Using this model, examples from the reliability area are presented and solved, such as series systems, parallel systems, k-out-of-n systems, bridge systems, and a clinical study with interval-censored data. The solutions allow the components to have different distributions, a case for which the bridge system previously had no solution in the literature. The model is general and can be used for any coherent system, and not only for reliability data but also for survival data, among others. Several simulations, with components having different censoring proportions, distinct means, three types of distributions, and various sample sizes, were carried out on all the systems to evaluate the effectiveness of the model.
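For the coherent systems listed (series, parallel, k-out-of-n, bridge), system reliability with independent components follows from the structure function. A brute-force sketch, illustrative only, since the thesis propagates full Bayesian posteriors rather than fixed component reliabilities:

```python
from itertools import product

def system_reliability(p, phi):
    # Exact reliability of a coherent system with independent components:
    # enumerate all 2^n component states, sum P(state) over working states.
    n = len(p)
    total = 0.0
    for state in product((0, 1), repeat=n):
        if phi(state):
            pr = 1.0
            for pi, up in zip(p, state):
                pr *= pi if up else (1.0 - pi)
            total += pr
    return total

def bridge(s):
    # Five-component bridge: minimal path sets
    # {a,b}, {c,d}, {a,e,d}, {c,e,b}, with e the bridging component.
    a, b, c, d, e = s
    return (a and b) or (c and d) or (a and e and d) or (c and e and b)
```

Series and parallel systems are the built-ins `all` and `any`; the bridge system is self-dual, so with all component reliabilities equal to 0.5 its reliability is exactly 0.5, a handy check.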
