101 |
Risk and Predictive Factors for Liver Cancer: Analysis of Data from a Cohort Study. Sookthai, Disorn. January 2011 (has links)
The association between the risk of liver cancer and blood chemistry was investigated in a cohort study of 95,150 men and women from two counties in Sweden. In 1963-65, blood tests and physical measurements were undertaken. All individuals were then followed up until 2007, and a total of 312 were diagnosed with liver cancer. Using survival analysis and logistic regression, significant risk factors were identified. Stepwise Cox proportional hazards regression applied to a main-effects model revealed that Glutamic-Pyruvic Transaminase (GPT) and Thymol Turbidity (TYM) were the most significant risk factors (p<0.0001), followed by Protein-Bound Hexoses (HEX) (p=0.002), sex (p=0.02), and Serum Iron (p=0.03). Increasing the level of GPT, expressed in U/L, from normal (<21) to slightly elevated (21-31) or substantially elevated (>31) raised the hazard of liver cancer by a factor of 1.45 and 4.09, respectively. In addition, GPT was the most significant risk factor in almost all age groups among both men and women. However, there was no evidence that GPT levels within the normal range (<21) influenced the risk of liver cancer. Additional subgroup analyses revealed that TYM was highly significant within the group with normal GPT, and that a high level of HEX (≥134 mg/dl) increased the hazard 1.55 times in comparison with the lowest HEX group (<115 mg/dl). BMI was significant only in the male subgroup (p<0.01), and in the obese group the hazard of liver cancer was 1.99 times higher than in the normal-BMI group. A significant three-way interaction between GPT, BMI, and sex was present (p=0.05), together with a robustly significant two-way interaction between GPT and BMI (p<0.01) in the male subgroup.
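In a Cox model, a categorized covariate such as GPT enters through indicator (dummy) variables, and each reported "factor" is the exponentiated coefficient, HR = exp(β). A minimal sketch of this interpretation follows; the coefficient values are back-calculated from the reported hazard ratios of 1.45 and 4.09 purely for illustration, not taken from the study's fitted model:

```python
import math

def hazard_ratio(beta):
    """Hazard ratio implied by a Cox regression coefficient."""
    return math.exp(beta)

# Illustrative coefficients recovered from the reported hazard ratios
# (hypothetical values, not the study's actual estimates).
beta_slight = math.log(1.45)       # GPT 21-31 vs. normal (<21)
beta_substantial = math.log(4.09)  # GPT >31 vs. normal (<21)

# Relative hazard of substantially vs. slightly elevated GPT:
relative = hazard_ratio(beta_substantial) / hazard_ratio(beta_slight)
print(round(relative, 2))  # 4.09 / 1.45 ≈ 2.82
```

Because the reference category (normal GPT) has coefficient zero, its hazard ratio is exp(0) = 1, which is why the abstract reports factors only for the two elevated levels.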
|
102 |
Statistical analysis of interval-censored and truncated survival data / Lim, Hee-Jeong, January 2001 (has links)
Thesis (Ph. D.)--University of Missouri-Columbia, 2001. / Typescript. Vita. Includes bibliographical references (leaves 112-115). Also available on the Internet.
|
104 |
Bayesian analysis and applications of a model for survival data with a surviving fraction / Tong, Qian, January 2002 (has links)
Thesis (M.A.S.)--Memorial University of Newfoundland, 2002. / Bibliography: leaves 79-80.
|
105 |
A collection of Bayesian models of stochastic failure processes. Kirschenmann, Thomas Harold. 06 November 2013 (has links)
Risk managers currently seek new advances in statistical methodology to better forecast and quantify uncertainty. This thesis comprises a collection of new Bayesian models and computational methods which collectively aim to better estimate parameters and predict observables when data arise from stochastic failure processes. Such data commonly arise in reliability theory and survival analysis when predicting failure times of mechanical devices, comparing medical treatments, and ultimately making well-informed risk management decisions. The collection of models proposed in this thesis advances the quality of those forecasts by providing computational modeling methodology to aid quantitatively oriented decision makers. Through these models, a reliability expert will have the ability: to model how future decisions affect the process; to impose his prior beliefs on hazard rate shapes; to efficiently estimate parameters with MCMC methods; to incorporate exogenous information in the form of covariate data using Cox proportional hazards models; and to utilize nonparametric priors for enhanced model flexibility. Managers are often forced to make decisions that affect the underlying distribution of a stochastic process, and they regularly make these choices while lacking a mathematical model for how the process may itself depend significantly on their decisions. The first model proposed in this thesis provides a method to capture this decision dependency, which is then used to construct an optimal future decision policy that exploits the interactions among sequences of decisions. The model and method in this thesis are the first to directly estimate decision dependency in a stochastic process with the flexibility and power of the Bayesian formulation. The model parameters are estimated using an efficient Markov chain Monte Carlo technique, leading to predictive probability densities for the stochastic process.
Using the posterior distributions of the random parameters in the model, a stochastic optimization program is solved to determine the sequence of decisions that minimise a cost-based objective function over a finite time horizon. The method is tested with artificial data and then used to model maintenance and failure time data from a condenser system at the South Texas Project Nuclear Operating Company (STPNOC). The second and third models proposed in this thesis offer a new way for survival analysts and reliability engineers to utilize their prior beliefs regarding the shape of hazard rate functions. Two generalizations of Weibull models have become popular recently, the exponentiated Weibull and the modified Weibull densities. The popularity of these models is largely due to the flexible hazard rate functions they can induce, such as bathtub, increasing, decreasing, and unimodal shaped hazard rates. These models are more complex than the standard Weibull, and without a Bayesian approach, one faces difficulties using traditional frequentist techniques to estimate the parameters. This thesis develops stylized families of prior distributions that should allow engineers to model their beliefs based on the context. Both models are first tested on artificial data and then compared when modeling a low pressure switch for a containment door at the STPNOC in Bay City, TX. Additionally, survival analysis is performed with these models using a famous collection of censored data about leukemia treatments. Two additional models are developed using the exponentiated and modified Weibull hazard functions as a baseline distribution to implement Cox proportional hazards models, allowing survival analysts to incorporate additional covariate information. Two nonparametric methods for estimating survival functions are compared using both simulated and real data from cancer treatment research. 
The quantile pyramid process is compared to Polya tree priors and is shown to have a distinct advantage due to the need for choosing a distribution upon which to center a Polya tree. The Polya tree and the quantile pyramid appear to have effectively the same accuracy when the Polya tree has a very well-informed choice of centering distribution. That is rarely the case, however, and one must conclude that the quantile pyramid process is at least as effective as Polya tree priors for modeling unknown situations.
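The flexibility that makes the exponentiated Weibull attractive here can be seen by evaluating its hazard rate h(t) = f(t)/S(t), where the CDF is F(t) = [1 − exp(−(t/σ)^k)]^α. A minimal sketch (parameter names and values are illustrative, not taken from the thesis):

```python
import math

def exp_weibull_hazard(t, shape, scale, power):
    """Hazard rate h(t) = f(t) / S(t) of the exponentiated Weibull
    distribution with CDF F(t) = [1 - exp(-(t/scale)**shape)]**power."""
    z = (t / scale) ** shape
    base = 1.0 - math.exp(-z)            # standard Weibull CDF
    F = base ** power                    # exponentiated Weibull CDF
    f = (power * shape / scale) * (t / scale) ** (shape - 1) \
        * math.exp(-z) * base ** (power - 1)
    return f / (1.0 - F)

# Different (shape, power) combinations induce monotone, unimodal,
# or bathtub-shaped hazards; with shape < 1 and shape*power > 1 the
# hazard is unimodal, which a plain Weibull cannot produce.
for t in (0.5, 1.0, 2.0, 4.0):
    print(round(exp_weibull_hazard(t, shape=0.6, scale=1.0, power=4.0), 3))
```

With power = 1 the distribution reduces to the standard Weibull, so the extra parameter is exactly what buys the richer hazard shapes discussed in the abstract.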
|
106 |
Semiparametric analysis of interval censored survival data. Long, Yongxian (龙泳先). January 2010 (has links)
published_or_final_version / Statistics and Actuarial Science / Master / Master of Philosophy
|
107 |
Landmark Prediction of Survival. Parast, Layla. January 2012 (has links)
The importance of developing personalized risk prediction estimates has become increasingly evident in recent years. In general, patient populations may be heterogeneous, representing a mixture of different unknown subtypes of disease. When the source of this heterogeneity and the resulting subtypes of disease are unknown, accurate prediction of survival may be difficult. However, in certain disease settings the onset time of an observable intermediate event may be highly associated with these unknown subtypes of disease and thus may be useful in predicting long term survival. Throughout this dissertation, we examine an approach to incorporate intermediate event information for the prediction of long term survival: the landmark model. In Chapter 1, we use the landmark modeling framework to develop procedures to assess how a patient’s long term survival trajectory may change over time given good intermediate outcome indications along with prognosis based on baseline markers. We propose time-varying accuracy measures to quantify the predictive performance of landmark prediction rules for residual life and provide resampling-based procedures to make inference about such accuracy measures. We illustrate our proposed procedures using a breast cancer dataset. In Chapter 2, we aim to incorporate intermediate event time information for the prediction of survival. We propose a fully non-parametric procedure to incorporate intermediate event information when only a single baseline discrete covariate is available for prediction. When a continuous covariate or multiple covariates are available, we propose to incorporate intermediate event time information using a flexible varying coefficient model. To evaluate the performance of the resulting landmark prediction rule and quantify the information gained by using the intermediate event, we use robust non-parametric procedures. We illustrate these procedures using a dataset of post-dialysis patients with end-stage renal disease.
In Chapter 3, we consider improving efficiency by incorporating intermediate event information in a randomized clinical trial setting. We propose a semi-nonparametric two-stage procedure to estimate survival by incorporating intermediate event information observed before the landmark time. In addition, we present a testing procedure using these resulting estimates to test for a difference in survival between two treatment groups. We illustrate these proposed procedures using an AIDS dataset.
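The core landmark quantity is a conditional survival probability: among subjects still event-free at the landmark time t0, S(t | T > t0) = S(t) / S(t0). A minimal sketch using a hand-rolled Kaplan-Meier estimator (the toy data and function names are illustrative, not from the dissertation):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimator; returns the step function S(t).
    times: observed follow-up times; events: 1 = event, 0 = censored."""
    data = sorted(zip(times, events))
    steps = []                      # (event time, survival just after it)
    s, at_risk = 1.0, len(data)
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, ee in data if tt == t and ee == 1)  # events at t
        c = sum(1 for tt, ee in data if tt == t)              # leaving risk set
        if d > 0:
            s *= 1.0 - d / at_risk
            steps.append((t, s))
        at_risk -= c
        i += c
    def S(t):
        out = 1.0
        for et, sv in steps:
            if et <= t:
                out = sv
        return out
    return S

def landmark_survival(S, t, t0):
    """Conditional survival S(t | T > t0) = S(t) / S(t0)."""
    return S(t) / S(t0)

# Toy data: four subjects, the second censored at time 2.
S = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1])
print(landmark_survival(S, t=3, t0=1))  # (3/8) / (3/4) = 0.5
```

In practice the landmark approach refits the estimator on the subsample still at risk at t0, possibly stratified by intermediate event status; the ratio form above is the simplest version of the same idea.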
|
108 |
Modelling multivariate interval-censored and left-truncated survival data using proportional hazards model. Cheung, Tak-lun, Alan (張德麟). January 2003 (has links)
published_or_final_version / abstract / toc / Statistics and Actuarial Science / Master / Master of Philosophy
|
109 |
Adaptive Integration into the Canadian Labour Market: The Case of Entrepreneur and Skilled Worker Immigrants. November 2013 (has links)
The literature on immigrants’ self-employment activities has limited the debate to the factors leading to this type of activity. Much research on the subject has tried to answer the question ‘What characteristics determine who becomes self-employed?’ In addressing that question, researchers have focused on the relative merits of the blocked mobility thesis and the ethnic enclave theory. This focus has created a research gap: researchers have ignored how self-employment may be used by immigrants as an alternative or complementary strategy for accessing a new labour market. Using the Longitudinal Immigration Database, this research explores, through survival regression analysis, the extent to which immigrants adopt different labour market strategies following their admission to Canada. More specifically, it examines their rate of access to labour market activities, the length of time they stay in specific types of labour market activities, and the determinant factors for such events.
The findings of this research demonstrate that 27 per cent of the economic immigrants admitted to Canada between 1990 and 2008 are likely to rely on paid employment and self-employment simultaneously over time. This finding reinforces the need to analyse self-employment as an activity concurrent with paid employment. The regression results on concurrent activities imply that immigrants admitted under the self-employed category are more inclined than other economic immigrants to rely on the two types of activity when integrating into the Canadian labour market. The findings of this thesis indicate that the traditional theories of self-employment are inadequate to explain concurrent self-employment and paid employment. There is a need to develop contemporary theories around this new concept of concurrent labour market activities that take into consideration self-employment and employment theories as well as immigrants’ adaptive integration capacity.
|
110 |
Lietuvos įmonių gyvavimo trukmės ir bankroto analizė / An Analysis of the Lifetime and Bankruptcy of Lithuanian Companies. Baronas, Vaidotas. 01 July 2014 (has links)
This Master’s thesis develops a Cox proportional hazards model for predicting company survival, using the statistical methodology of survival analysis. The first part gives a short review of the literature on models of company bankruptcy, failure, and financial distress. It also presents the theoretical background of financial statement analysis and corporate finance, various financial variables (profit/loss, liabilities, assets, and others) and financial ratios (debt ratio, total asset turnover, and others), as well as the foundations of survival analysis and the Cox proportional hazards model. The second part presents the empirical computations: statistically significant financial variables and ratios influencing a company’s lifetime are selected, and a Cox regression model estimating the probability that a company survives beyond a given time is constructed and examined. The econometric analysis is performed using SPSS, MS Excel, and SAS software.
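The Cox model underlying this thesis is fitted by maximizing the log partial likelihood, l(β) = Σ over events i of [β·x_i − log Σ over the risk set of exp(β·x_j)]. A minimal one-covariate sketch with a crude grid search (toy data and values are illustrative; real software such as SPSS or SAS uses Newton-Raphson and handles ties):

```python
import math

def cox_log_partial_likelihood(beta, times, events, x):
    """Log partial likelihood of a one-covariate Cox model (no tied times)."""
    ll = 0.0
    for i, (t, e) in enumerate(zip(times, events)):
        if not e:
            continue  # censored observations contribute only via risk sets
        # Risk set: everyone still under observation at time t.
        risk = sum(math.exp(beta * x[j])
                   for j, tj in enumerate(times) if tj >= t)
        ll += beta * x[i] - math.log(risk)
    return ll

# Toy data: higher x is loosely associated with earlier failure.
times  = [1, 2, 3, 4]
events = [1, 1, 1, 1]
x      = [1, 0, 1, 0]

# Crude grid search for the maximum partial likelihood estimate of beta.
grid = [b / 1000.0 for b in range(-5000, 5001)]
beta_hat = max(grid, key=lambda b: cox_log_partial_likelihood(b, times, events, x))
print(round(beta_hat, 2))  # ≈ 0.94; hazard ratio exp(beta_hat) ≈ 2.6
```

At β = 0 every subject is exchangeable, so l(0) is simply minus the log of the product of risk-set sizes (here −log 24), a handy sanity check on any implementation.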
|