1 |
Bayesian Cox Models for Interval-Censored Survival Data. Zhang, Yue. January 2016.
No description available.
2 |
Comparison of proportional hazards and accelerated failure time models. Qi, Jiezhi. 30 March 2009.
The field of survival analysis has experienced tremendous growth during the latter half of the 20th century. The methodological developments that have had the most profound impact are the Kaplan-Meier method for estimating the survival function, the log-rank test for comparing two or more survival distributions, and the Cox proportional hazards (PH) model for examining covariate effects on the hazard function. The accelerated failure time (AFT) model has also been proposed but is seldom used. In this thesis, we present the basic concepts, nonparametric methods (the Kaplan-Meier method and the log-rank test), semiparametric methods (the Cox PH model and the Cox model with time-dependent covariates) and parametric methods (the parametric PH model and the AFT model) for analyzing survival data.
We apply these methods to a randomized placebo-controlled trial of tuberculosis (TB) prevention in Ugandan adults infected with Human Immunodeficiency Virus (HIV). The objective of the analysis is to determine whether TB preventive therapies affect the rate of AIDS progression and survival in HIV-infected adults. We conclude that TB preventive therapies appear to have no effect on AIDS progression, death, or the combined endpoint of AIDS progression and death. The major goal of this thesis is to use this real dataset to argue for the AFT model as an alternative to the PH model in the analysis of some survival data. We critique the PH model and assess its lack of fit. To address the violation of proportional hazards, we use the Cox model with time-dependent covariates, the piecewise exponential model and the accelerated failure time model. After comparing all the models and assessing goodness of fit, we find that the log-logistic AFT model fits this dataset best. The AFT model is a valuable and realistic alternative to the PH model in some situations: it provides predicted hazard functions, predicted survival functions, median survival times and time ratios, and its results are readily interpreted as effects on the expected median duration of illness for a patient in a clinical setting. We suggest that the PH model may not be appropriate in some situations and that the AFT model can then provide a more appropriate description of the data.
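As a rough illustration of the PH-versus-AFT comparison described above (not the thesis's own code; the Python lifelines package, the file name, and the column names time, event, therapy and age are assumptions), a minimal sketch might look like this:

```python
# Illustrative sketch only: compare a Cox PH fit with a log-logistic AFT fit
# using the Python "lifelines" package. Column names ("time", "event",
# "therapy", "age") are hypothetical placeholders, not the trial's variables.
import pandas as pd
from lifelines import CoxPHFitter, LogLogisticAFTFitter

df = pd.read_csv("survival_data.csv")   # assumed columns: time, event, therapy, age

# Semiparametric Cox proportional hazards model
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                      # hazard ratios
cph.check_assumptions(df)                # flags covariates violating proportional hazards

# Parametric log-logistic AFT model; coefficients act multiplicatively on time.
aft = LogLogisticAFTFitter()
aft.fit(df, duration_col="time", event_col="event")
aft.print_summary()

print("AFT AIC:", aft.AIC_)              # crude comparison among parametric candidates
```

With this parameterisation, exp(coef) from the AFT fit can be read as a time ratio, which matches the interpretation of treatment effects on median survival emphasised above.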
3 |
Dynamiques épidémiques, risques et copules / Epidemic dynamics, risk and copulas. Ghassani, Mohamad. 30 November 2012.
Classical stochastic models involve linear interaction copulas, generally expressing pairwise interactions. We consider extending these models to nonlinear interactions of saturation or triplet type, with a view to realistic applications such as epidemic spread. The aim of this thesis is to introduce copula functions into epidemiology and, in particular, to apply them to the malaria transmission system in order to detect dependence between the compartments of the epidemic system. We study several compartmental models that generalize the Ross-Macdonald model, allowing a non-constant population and accounting for transmission parameters such as fertility and mortality. We also introduce age classes into some of these compartmental models and study the relationships between individuals across age classes using the Cox model and copula functions. Two examples of these models are then given: malaria in Mali and the plague in Europe during the Middle Ages. Finally, we introduce conditional quantiles and Archimedean copula functions, which lead us to dependencies between the different host and vector compartments.
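As a small, self-contained illustration of the Archimedean copulas mentioned above (not taken from the thesis; the parameter value and sample size are arbitrary), the following Python sketch samples a bivariate Clayton copula via the gamma-frailty (Marshall-Olkin) construction and checks the implied Kendall's tau, which equals theta / (theta + 2) for this family:

```python
# Minimal sketch: sampling a bivariate Clayton (Archimedean) copula via the
# Marshall-Olkin frailty construction, then checking the implied Kendall's tau.
# Purely illustrative; not code from the thesis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
theta = 2.0                      # Clayton dependence parameter (> 0)
n = 5000

# Frailty V ~ Gamma(1/theta, 1); generator inverse psi(t) = (1 + t)^(-1/theta)
v = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)
e = rng.exponential(scale=1.0, size=(n, 2))
u = (1.0 + e / v[:, None]) ** (-1.0 / theta)   # uniform margins, Clayton dependence

tau_emp, _ = stats.kendalltau(u[:, 0], u[:, 1])
print("empirical tau:", round(tau_emp, 3),
      "theoretical tau:", theta / (theta + 2))  # 0.5 for theta = 2
```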
4 |
An Evaluation of Methods for Assessing the Functional Form of Covariates in the Cox Model. Karlsson, Linnea. January 2016.
In this thesis, two methods for assessing the functional form of covariates in the Cox proportional hazards model are evaluated. The methods include one graphical check based on martingale residuals and one graphical check, together with a Kolmogorov-type supremum test, based on cumulative sums of martingale residuals. The methods are evaluated in a simulation study under five different covariate misspecifications with varying sample sizes and censoring degrees. The results from both methods indicate that the type of covariate misspecification, sample size and censoring degree affect the ability to detect and identify the misspecification. The procedure based on smoothed scatterplots of martingale residuals reveals difficulties with assessing whether the behaviour of the smoothed curve in the plot is an indication of a misspecification or a phenomenon that can occur in a correctly specified model. The graphical check together with the test procedure based on cumulative sums of martingale residuals is shown to successfully detect and identify three out of five covariate misspecifications for large sample sizes. For small sample sizes, especially combined with a high censoring degree, the power of the supremum test is low for all covariate misspecifications.
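A minimal sketch of the first graphical check, assuming the Python lifelines and statsmodels packages and made-up column names (time, event, x1, x2), could look like the following; a pronounced non-linear trend in the smoothed curve would suggest a misspecified functional form for x1:

```python
# Sketch of the martingale-residual check for functional form (illustrative only).
# Fit a Cox model, extract martingale residuals and smooth them against a covariate.
import matplotlib.pyplot as plt
import pandas as pd
from lifelines import CoxPHFitter
from statsmodels.nonparametric.smoothers_lowess import lowess

df = pd.read_csv("study_data.csv")          # assumed columns: time, event, x1, x2

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event", formula="x1 + x2")

mart = cph.compute_residuals(df, kind="martingale")["martingale"]
smooth = lowess(mart, df["x1"], frac=0.6)   # returns (x, smoothed y) pairs sorted by x

plt.scatter(df["x1"], mart, s=8, alpha=0.4)
plt.plot(smooth[:, 0], smooth[:, 1], color="red")
plt.xlabel("x1")
plt.ylabel("Martingale residual")
plt.show()
```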
5 |
Variable Selection in Competing Risks Using the L1-Penalized Cox Model. Kong, XiangRong. 22 September 2008.
In survival analysis, the failure of an individual can sometimes be attributed to one of several distinct causes; survival data generated in this setting are commonly referred to as competing risks data. A major task when examining survival data is to assess the dependence of survival time on explanatory variables. In competing risks, as with ordinary univariate survival data, explanatory variables may be associated with the risks arising from the different causes under study, and the same variable may influence the risks due to different causes to different degrees. Given a set of explanatory variables, it is of interest to identify the subset that is significantly associated with the risk corresponding to each failure cause. In this project, we develop a statistical methodology for this purpose, that is, for variable selection in the presence of competing risks survival data. Asymptotic properties of the model and empirical simulation results evaluating its performance are provided. One important feature of our method, which is based on the L1-penalized Cox model, is its ability to perform variable selection with high-dimensional explanatory variables, i.e. when the number of explanatory variables is larger than the number of observations. The method was applied to a real dataset originating from the National Institutes of Health-funded project "Genes related to hepatocellular carcinoma progression in living donor and deceased donor liver transplant" to identify genes that might be relevant to tumor progression in hepatitis C virus (HCV) infected patients diagnosed with hepatocellular carcinoma (HCC). Gene expression was measured on Affymetrix GeneChip microarrays. Based on the 46 currently available samples, 42 genes show very strong association with tumor progression and deserve further investigation for their clinical implications in the prognosis of progression in patients diagnosed with HCV and HCC.
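The following is a hedged sketch of a cause-specific lasso-Cox screening loop in Python, not the code or data used in the project: the lifelines CoxPHFitter with l1_ratio=1.0 stands in for the L1-penalized Cox model, the file and column names (time, cause, gene_*) are hypothetical, and in practice the penalty strength would be tuned by cross-validation.

```python
# Illustrative cause-specific lasso-Cox sketch (not the project's code or data).
# For each failure cause, events from the other cause are treated as censored,
# and an L1-penalized Cox model screens the high-dimensional covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("expression_data.csv")   # assumed: time, cause (0=censored, 1, 2), gene_1..gene_p
genes = [c for c in df.columns if c.startswith("gene_")]

selected = {}
for cause in (1, 2):
    d = df[["time"] + genes].copy()
    d["event"] = (df["cause"] == cause).astype(int)   # other causes treated as censored

    # l1_ratio=1.0 gives a pure lasso penalty; the penalizer value is arbitrary here
    # and would normally be chosen by cross-validation.
    cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
    cph.fit(d, duration_col="time", event_col="event")

    coefs = cph.params_
    selected[cause] = coefs[coefs.abs() > 1e-8].index.tolist()
    print(f"cause {cause}: {len(selected[cause])} covariates retained")
```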
6 |
Grouped variable selection in high dimensional partially linear additive Cox model. Liu, Li. 01 December 2010.
In the analysis of survival outcomes supplemented with both clinical information and high-dimensional gene expression data, the traditional Cox proportional hazards model fails to meet some emerging needs of biological research. First, the number of covariates is generally much larger than the sample size. Second, predicting an outcome from individual gene expressions is inadequate, because a gene's expression is regulated by multiple biological processes and functional units; there is a need to understand the impact of changes at a higher level, such as a molecular function, cellular component, biological process, or pathway. A change at a higher level is usually measured with a set of gene expressions related to the biological process. That is, we need to model the outcome with gene sets as variable groups, and the gene sets may also partially overlap.
In this thesis work, we investigate the impact of a penalized Cox regression procedure on regularization, parameter estimation, variable group selection, and nonparametric modeling of nonlinear effects with a time-to-event outcome.
We formulate the problem as a partially linear additive Cox model with high-dimensional data. We group genes into gene sets and approximate the nonparametric components by truncated series expansions with B-spline bases. After grouping and approximation, the problem of variable selection becomes that of selecting groups of coefficients in a gene set or in an approximation. We apply the group lasso to obtain an initial solution path and reduce the dimension of the problem, and then update the whole solution path with the adaptive group lasso. We also propose a generalized group lasso method that provides more freedom in specifying the penalty and allows covariates to be excluded from penalization.
A modified Newton-Raphson method is designed for stable and rapid computation. The core programs are written in the C language, and a user-friendly R interface is implemented to perform all the calculations by calling the core programs.
We demonstrate the asymptotic properties of the proposed methods. Simulation studies are carried out to evaluate the finite-sample performance of the proposed procedure using several tuning-parameter selection methods for choosing the point on the solution path as the final estimator. We also apply the proposed approach to two real data examples.
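As an illustration of the grouping-and-approximation step only (the adaptive group-lasso Cox fit itself requires a specialised solver such as the C/R implementation described above and is not sketched here), the following Python fragment expands a continuous covariate in a B-spline basis and collects covariates into possibly overlapping groups; all file, column, and gene-set names are made up:

```python
# Sketch of the "grouping and approximation" step (illustrative only; the
# adaptive group-lasso Cox fit itself is omitted).
import pandas as pd
from patsy import dmatrix

df = pd.read_csv("genomic_data.csv")       # assumed: time, event, age, gene_1..gene_p
gene_sets = {                              # hypothetical, possibly overlapping gene sets
    "pathway_A": ["gene_1", "gene_2", "gene_3"],
    "pathway_B": ["gene_3", "gene_4"],
}

design_blocks, groups = [], []

# Nonparametric component: expand a continuous covariate in a B-spline basis;
# its basis columns form one coefficient group.
spline = dmatrix("bs(age, df=5, degree=3) - 1", df, return_type="dataframe")
spline.columns = [f"age_bs{i}" for i in range(spline.shape[1])]
design_blocks.append(spline)
groups.append(("f(age)", list(spline.columns)))

# Each gene set contributes one group of columns (groups may overlap).
for name, members in gene_sets.items():
    block = df[members].add_prefix(f"{name}:")
    design_blocks.append(block)
    groups.append((name, list(block.columns)))

X = pd.concat(design_blocks, axis=1)
for name, cols in groups:
    print(f"group {name}: {len(cols)} columns")
# X, together with (time, event), would then be passed to a group-lasso Cox solver.
```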
7 |
A New Screening Methodology for Mixture Experiments. Weese, Maria. 01 May 2010.
Many materials we use in daily life, such as plastics, gasoline, food, and medicine, are mixtures. Mixture experiments, in which the factors are proportions of components and the response depends only on the relative proportions of the components, are an integral part of product development and improvement. However, when the number of components is large and there are complex constraints, experimentation can be a daunting task. We study screening methods in a mixture setting using the framework of the Cox mixture model [1]. We exploit the easy interpretation of the parameters in the Cox mixture model and develop methods for screening in a mixture setting. We present specific methods for adding a component, removing a component, and a general method for screening a subset of components in mixtures with complex constraints. The variances of our parameter estimates are comparable with those of the typically used Scheffé model, and our methods reduce the run size of screening experiments for mixtures containing a large number of components. We then further extend the new screening methods using Evolutionary Operation (EVOP), developed by Box and Draper [2]. EVOP methods use small movements in a subset of process parameters, together with replication, to reveal effects out of the process noise. Mixture experiments inherently involve small movements (since the proportions can only range from zero to one) and the effects have large variances. We update the EVOP methods by using sequential testing of effects as opposed to the confidence-interval method originally proposed by Box and Draper. We show that, compared with a fixed sample size, the sequential testing approach reduces the required sample size by as much as 50 percent with all other testing parameters held constant. We present two methods for adding a component and a general screening method using a graphical sequential t-test, and we provide R code to reproduce the limits for the test.
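For reference, and only as an illustration of the Scheffé model mentioned above rather than the Cox mixture model or EVOP procedure developed in the thesis, the sketch below fits a first-order Scheffé mixture model by least squares on made-up data; because the proportions sum to one, the model has no intercept and each coefficient is the predicted response of the corresponding pure blend.

```python
# Minimal sketch: fitting a first-order Scheffe mixture model y = sum_i b_i * x_i
# by least squares, with no intercept because the proportions sum to one.
# Data below are made up for illustration.
import numpy as np

# Each row is a blend of three components (proportions sum to 1) and its response.
X = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.0],
    [0.5, 0.0, 0.5],
    [0.0, 0.5, 0.5],
    [1/3, 1/3, 1/3],
])
y = np.array([4.1, 6.3, 3.2, 5.8, 3.9, 5.1, 5.0])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("Scheffe coefficients:", beta.round(2))   # predicted response of each pure blend

new_blend = np.array([0.2, 0.5, 0.3])
print("prediction for new blend:", float(new_blend @ beta))
```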
8 |
Semiparametric Methods for the Analysis of Progression-Related Endpoints. Boruvka, Audrey. January 2013.
Use of progression-free survival in the evaluation of clinical interventions is hampered by a variety of issues, including censoring patterns not addressed by the usual methods for survival analysis. Progression can be right-censored before survival or interval-censored between inspection times. Current practice calls for imputing events to their time of detection. Such an approach is prone to bias, underestimates standard errors and makes inefficient use of the data at hand. Moreover, a composite outcome prevents inference about the actual treatment effect on the risk of progression. This thesis develops semiparametric and sieve maximum likelihood estimators to more formally analyze progression-related endpoints. For the special case where death rarely precedes progression, a Cox-Aalen model is proposed for regression analysis of time to progression under intermittent inspection. The general setting considering both progression and survival is examined with a Markov Cox-type illness-death model under various censoring schemes. All of the resulting estimators converge globally to the truth more slowly than the parametric rate, but their finite-dimensional components are asymptotically efficient. Numerical studies suggest that the new methods perform better than their imputation-based alternatives in moderate to large samples with higher rates of censoring.
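To make the censoring structure concrete, the following Python sketch (not the thesis's code; hazard rates and the visit schedule are arbitrary) simulates a Markov illness-death process with constant transition intensities and intermittent inspection, so progression is only known to lie between two visits, or is never observed when death intervenes first:

```python
# Sketch: simulate a Markov illness-death process with constant transition hazards
# and intermittent clinic visits. Progression times become interval-censored,
# while death times are observed exactly. End-of-follow-up censoring is ignored
# for simplicity. Not the thesis's code.
import numpy as np

rng = np.random.default_rng(1)
h01, h02, h12 = 0.10, 0.02, 0.08      # healthy->progressed, healthy->dead, progressed->dead
visits = np.arange(0, 60, 6.0)        # assumed inspection schedule (months)

def simulate_subject():
    t_prog = rng.exponential(1 / h01)
    t_death_healthy = rng.exponential(1 / h02)
    if t_prog < t_death_healthy:                      # progression occurs first
        t_death = t_prog + rng.exponential(1 / h12)
        # progression is only detected at the first visit after it happened,
        # and only if that visit occurs before death (else right-censored)
        later = visits[visits >= t_prog]
        left = visits[visits < t_prog].max() if (visits < t_prog).any() else 0.0
        right = later.min() if later.size and later.min() < t_death else np.inf
        return {"prog_interval": (left, right), "death_time": t_death}
    return {"prog_interval": None, "death_time": t_death_healthy}

for subject in (simulate_subject() for _ in range(5)):
    print(subject)
```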
9 |
Test des effets centre en épidémiologie clinique / Testing for centre effects in clinical epidemiology. Biard, Lucie. 25 November 2016.
Centre effects in survival data are often modelled with mixed-effects Cox models. Testing for a centre effect then amounts to testing whether the variance of the corresponding random effect is zero, and the null distribution of the usual parametric test statistics is not always straightforward to identify. Permutation procedures have been proposed as an alternative for generalised linear mixed models. The objective of this work was to develop a permutation test procedure for random effects in a mixed-effects Cox model, for the purpose of testing centre effects. We first developed and evaluated permutation procedures for testing a single centre effect on the baseline hazard, and applied the test to a clinical trial of induction chemotherapy in patients with acute myeloid leukaemia. We then extended the procedure to tests of multiple random effects in survival models, so that centre effects on both the baseline hazard and on covariate effects can be examined, with illustrations on two cohorts of patients with acute leukaemia. Finally, the permutation approach was applied to a multicentre cohort of critically ill patients with haematological malignancies to investigate the determinants of centre effects on hospital mortality. The proposed permutation procedures are robust and relatively easy to implement for routine testing of random effects, making them a suitable tool for the analysis of centre effects in clinical epidemiology and for understanding their sources.
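A sketch of the permutation idea, written in Python purely for illustration: the k-sample log-rank statistic across centres is used here as a simple stand-in for the mixed-effects Cox statistics studied in the thesis, the file and column names are hypothetical, and the machinery (permute the centre labels, recompute the statistic, compare with the observed value) is the part that carries over.

```python
# Permutation-test skeleton for a centre effect (illustrative only, not the
# thesis's procedure). A log-rank statistic across centres stands in for the
# mixed-model statistics; the permutation machinery is the point of the sketch.
import numpy as np
import pandas as pd
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("multicentre_data.csv")   # assumed columns: time, event, centre

def centre_stat(centres):
    return multivariate_logrank_test(df["time"], centres, df["event"]).test_statistic

rng = np.random.default_rng(0)
obs = centre_stat(df["centre"])

n_perm = 2000
perm = np.array([centre_stat(rng.permutation(df["centre"].to_numpy()))
                 for _ in range(n_perm)])

# proportion of permuted statistics at least as extreme as the observed one
p_value = (1 + np.sum(perm >= obs)) / (1 + n_perm)
print("observed statistic:", round(obs, 2), " permutation p-value:", round(p_value, 4))
```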