1 |
Överlevnadsanalys i tjänsteverksamhet : Tidspåverkan i överklagandeprocessen på Migrationsverket / Survival analysis in service : Time-effect in the process of appeal at the Swedish Migration Board
Minya, Kristoffer January 2014
The Swedish Migration Board (Migrationsverket) is an agency that reviews applications from individuals who wish to seek protection in, obtain citizenship of, study in or work in Sweden. Recently there has been a large increase in these applications, and the time it takes to reach a decision has grown. Each type of application (such as citizenship) is a process consisting of several steps; how a case moves through these steps is called flow, and the Board would therefore like to increase its flow efficiency. When the decision is made and the person has received it but is not satisfied, he or she can appeal. This is one of the most complex processes at the Board. The aim is to analyze how long this process takes and which steps in the process affect the time. One step (which later turns out to have a large effect on the time) is opinions (yttranden): the court requests a statement from the appellant about why he or she is appealing. To analyze this, two methods were relevant: accelerated failure time (AFT) models and multi-state models (MSM). The former can predict time to event, while the latter can analyze how covariates affect the time spent in the steps of the flow. An opinion early in the process is crucial to how quickly an appeal receives a judgment, while the number of opinions increases the time enormously. Other factors affect the time, but not as much as opinions. Since both an early opinion and the number of opinions matter, flow efficiency can be increased by taking the time to write an informative opinion so that the court does not need to request additional opinions.
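As a rough illustration of the AFT component of such an analysis, the sketch below fits a Weibull AFT model with the Python lifelines library on simulated stand-in data; the variable names (days, closed, early_opinion, n_opinions) and all parameter values are hypothetical, not the thesis's.

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(0)
n = 200
early = rng.integers(0, 2, n)                   # 1 = opinion filed early in the process
n_op = rng.integers(1, 6, n)                    # number of opinions requested
# log-time shrinks with an early opinion and grows with each extra opinion
days = np.exp(4.5 - 0.4 * early + 0.25 * n_op + 0.3 * rng.gumbel(size=n))
censor = rng.uniform(50, 800, n)
appeals = pd.DataFrame({
    "days": np.minimum(days, censor),
    "closed": (days <= censor).astype(int),     # 1 = judgment observed, 0 = censored
    "early_opinion": early,
    "n_opinions": n_op,
})

aft = WeibullAFTFitter()
aft.fit(appeals, duration_col="days", event_col="closed")
aft.print_summary()  # exp(coef) < 1 shortens time to judgment, > 1 stretches it
```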
|
2 |
A study of the robustness of Cox's proportional hazards model used in testing for covariate effects
Fei, Mingwei January 1900
Master of Arts / Department of Statistics / Paul Nelson / There are two important classes of statistical models for multivariate survival analysis: proportional hazards (PH) models and accelerated failure time (AFT) models. PH analysis is the most commonly used multivariate approach for analyzing survival time data; for example, in clinical investigations where several (known) quantities or covariates potentially affect patient prognosis, it is often desirable to investigate the effect of one factor while adjusting for the impact of the others. This report offers guidance on choosing the appropriate model for testing covariate effects under different situations. In practice, we often have only a limited sample size and nontrivial censoring rates (people dropping out), which cause difficulty in statistical analysis. In this report, datasets are simulated 1000 times from each of three different distributions (Weibull, Lognormal and Loglogistic) across combinations of sample sizes and censoring rates. Both models are then evaluated by hypothesis tests of the covariate effect on the simulated data, using power, type I error rate and convergence rate for each situation. We recommend the PH method when the sample size is small (n<20) and the censoring rate is high (p>0.8); in this case both PH and AFT analyses may be unsuitable for hypothesis testing, but PH analysis is more robust and consistent than AFT analysis. When the sample size is 20 or above and the censoring rate is 0.8 or below, AFT analysis has a slightly higher convergence rate and power than PH, but offers little improvement in type I error rate when the sample size is large (n>50) and the censoring rate is low (p<0.3). Considering the advantage that PH analysis requires no knowledge of the underlying distribution, we conclude that PH analysis is robust in hypothesis testing for covariate effects using data generated from an AFT model.
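A minimal sketch of one cell of this simulation grid, assuming Weibull event times, a single binary covariate and uniform censoring (the parameter values are illustrative, not the report's settings):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n, n_sim, beta = 50, 1000, 0.5   # sample size, replications, true log-hazard ratio

rejections = 0
for _ in range(n_sim):
    x = rng.integers(0, 2, n)
    # Weibull PH with shape 1.5: scaling the hazard by exp(beta*x)
    # scales the time by exp(-beta*x/1.5)
    t = rng.weibull(1.5, n) / np.exp(beta * x / 1.5)
    c = rng.uniform(0, 2.0, n)   # tune the upper bound to hit a target censoring rate
    df = pd.DataFrame({"time": np.minimum(t, c),
                       "event": (t <= c).astype(int), "x": x})
    fit = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    if fit.summary.loc["x", "p"] < 0.05:
        rejections += 1

print("empirical power:", rejections / n_sim)  # with beta = 0 this estimates type I error
```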
|
3 |
Statistical inference in high dimensional linear and AFT models
Chai, Hao 01 July 2014
Variable selection procedures for high dimensional data have been proposed and studied in a large body of literature in the last few years. Most of the previous research focuses on selection properties as well as point estimation properties. In this thesis, our goal is to construct confidence intervals for some low-dimensional parameters in a high-dimensional setting. The models we study are partially penalized linear and accelerated failure time models in the high-dimensional setting. In our model setup, all variables are split into two groups. The first group consists of a relatively small number of variables that are of primary interest. The second group consists of a large number of variables that may be correlated with the response variable. We propose an approach that selects the variables from the second group and produces confidence intervals for the parameters in the first group. We show the sign consistency of the selection procedure and give a bound on the estimation error. Based on this result, we provide sufficient conditions for the asymptotic normality of the low-dimensional parameters. The high-dimensional selection consistency and the low-dimensional asymptotic normality are developed for both linear and AFT models with high-dimensional data.
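As a toy illustration of the partial-penalization idea (not the thesis's estimator, and without its inference step), the sketch below solves least squares with an l1 penalty on the second group of coefficients only, via proximal gradient descent:

```python
import numpy as np

def partial_lasso(X1, X2, y, lam, n_iter=2000):
    """Minimize ||y - X1 b1 - X2 b2||^2 / (2n) + lam * ||b2||_1; b1 unpenalized."""
    X = np.hstack([X1, X2])
    n, p1 = X1.shape
    beta = np.zeros(X.shape[1])
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = beta + (X.T @ (y - X @ beta) / n) / L   # gradient step
        # soft-threshold only the penalized (second-group) coordinates
        z[p1:] = np.sign(z[p1:]) * np.maximum(np.abs(z[p1:]) - lam / L, 0.0)
        beta = z
    return beta[:p1], beta[p1:]

rng = np.random.default_rng(0)
n, p1, p2 = 100, 2, 200
X1, X2 = rng.normal(size=(n, p1)), rng.normal(size=(n, p2))
y = X1 @ np.array([1.0, -0.5]) + 0.8 * X2[:, 0] + rng.normal(size=n)
b1, b2 = partial_lasso(X1, X2, y, lam=0.1)
print("group-1 estimates:", b1.round(2), "| selected in group 2:", np.flatnonzero(b2))
```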
|
4 |
A Monte Carlo Approach to Change Point Detection in a Liver Transplant
Makris, Alexia Melissa 01 January 2013
Patient survival post liver transplant (LT) is important to both the patient and the center's accreditation, but over the years physicians have noticed that distant patients struggle with post-LT care. I hypothesized that a patient's distance from the transplant center has a detrimental effect on post-LT survival. I suspected that Hepatitis C (HCV) and Hepatocellular Carcinoma (HCC) patients would deteriorate due to their recurrent disease and would need close monitoring post LT. From the current literature it is not clear whether patients' distance from a transplant center affects outcomes post LT. Firozvi et al. (Firozvi AA, 2008) reported no difference in outcomes of LT recipients living 3 hours away or less. This study aimed to examine outcomes of LT recipients based on distance from a transplant center. I hypothesized that the effect of distance from an LT center is detrimental after adjusting for HCV and HCC status.
Methods:
This was a retrospective single-center study of LT recipients transplanted between 1996 and 2012; 821 LT recipients were identified who qualified for inclusion in the study. Survival analysis was performed using standard methods as well as a newly developed Monte Carlo (MC) approach for change point detection. My new methodology allows for detection of both a change point in distance and one in time by maximizing the two-parameter score function (M2p) over a two-dimensional grid of distance and time values. Extensive simulations using both standard distributions and data resembling the LT data structure were used to demonstrate the functionality of the model.
Results:
Five-year survival was 0.736 with a standard error of 0.018. Using Cox PH it was demonstrated that patients living beyond 180 miles had a hazard ratio (HR) of 2.68 (p-value<0.004) compared to those within 180 miles of the transplant center. I was able to confirm these results using KM and HCV/HCC-adjusted AFT, while HCV- and HCC-adjusted LR confirmed the distance effect at 180 miles (p=0.0246), one year post LT. The new statistic, labeled M2p, allows for simultaneous dichotomization of distance in conjunction with the identification of a change point in the hazard function. It performed much better than previously available statistics in the standard simulations. The best model for the data was found to be extension 3, which dichotomizes the distance Z, replacing it by I(Z>c), and then estimates the change point c and the time change point tau.
Conclusions:
Distance had a detrimental effect, and this effect was observed at 180 miles from the transplant center. Patients living beyond 180 miles from the transplant center had 2.68 times the death rate of those living within the 180-mile radius. Recipients with HCV fared the worst, with the distance effect being more pronounced (HR of 3.72 vs. 2.68). Extensive simulations using different parameter values, in both standard simulations and simulations resembling LT data, showed that these new approaches work for dichotomizing a continuous variable and finding a point beyond which there is an incremental effect of this variable. The recovered values were very close to the true values and the p-values were small.
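A simplified sketch of the dichotomization step on fabricated stand-in data: scan a grid of cutoffs c and keep the one maximizing the log-rank statistic for I(Z>c). The thesis statistic M2p additionally searches a time change point tau; that dimension is omitted here.

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(7)
n = 400
dist = rng.uniform(0, 400, n)                   # miles from the transplant center
hr = np.where(dist > 180, 2.7, 1.0)             # true effect switches on at 180 miles
t = rng.exponential(5.0 / hr)                   # exponential survival times
c_time = rng.uniform(0, 12, n)                  # administrative censoring
time, event = np.minimum(t, c_time), (t <= c_time).astype(int)

def stat(c):                                    # log-rank statistic for I(dist > c)
    far = dist > c
    return logrank_test(time[far], time[~far],
                        event_observed_A=event[far],
                        event_observed_B=event[~far]).test_statistic

grid = np.arange(60, 340, 20)
best = max(grid, key=stat)
print("estimated cutoff (miles):", best, "log-rank statistic:", round(stat(best), 2))
```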
|
5 |
A Methodology for Estimating Business Interruption Losses to Industrial Sectors due to Flood Disasters / 洪水災害による産業部門の操業停止損失計量化に関する方法論的研究
Lijiao, Yang 24 September 2015
Kyoto University / Doctoral program dissertation, No. 19340 (Joho-haku No. 592) / Graduate School of Informatics, Department of Social Informatics, Kyoto University / Examiners: Prof. Hirokazu Tatano, Prof. Katsuya Yamori, Prof. Kazuyuki Moriya / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Informatics / Kyoto University / DGAM
|
6 |
Comparing methods for modeling longitudinal and survival data, with consideration of mediation analysis
Ngwa, Julius S. 14 March 2016
Joint modeling of longitudinal and survival data has received much attention and is becoming increasingly useful. In clinical studies, longitudinal biomarkers are used to monitor disease progression and to predict survival. These longitudinal measures are often missing at failure times and may be prone to measurement error. In previous studies the two types of data were frequently analyzed separately, with a mixed effects model used for the longitudinal data and a survival model applied to the event outcomes. The argument in favor of a joint model is the efficient use of the data, as the survival information goes into modeling the longitudinal process and vice versa.
In this thesis, we present joint maximum likelihood methods, a two-stage approach and time-dependent covariate methods that link longitudinal data to survival data. First, we use simulation studies to explore and assess the performance of these methods in terms of bias, accuracy and coverage probability. Then, we focus on four time-dependent methods, considering models that are unadjusted and adjusted for time. Finally, we consider mediation analysis for longitudinal and survival data. Mediation analysis is introduced and applied in a research framework based on genetic variants, longitudinal measures and disease risk. We implement accelerated failure time regression using the joint maximum likelihood approach (AFT-joint) and an accelerated failure time regression model using the observed longitudinal measures as time-dependent covariates (AFT-observed) to assess the mediated effect.
We found that the two-stage approach (TSA) performed best at estimating the link parameter. The joint maximum likelihood methods that used the predicted values of the longitudinal measures, like the TSA, provided larger estimates. The time-dependent covariate methods that used the observed longitudinal measures in the survival analysis underestimated the true estimates. The mediation results showed that both the AFT-joint and the AFT-observed underestimated the mediated effect. Comparison of the methods in Framingham Heart Study data revealed similar patterns. We recommend adjusting for time when estimating the association parameter in time-dependent Cox and logistic models. Additional work is needed for estimating the mediated effect with longitudinal and survival data.
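As a hedged sketch of the two-stage approach on simulated data, stage 1 below fits a linear mixed model to a longitudinal biomarker and stage 2 carries the predicted subject-specific slopes into a Cox model; the choice of the slope as the link covariate and all names are illustrative assumptions, not the thesis specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n_sub, n_obs = 80, 5
long_df = pd.DataFrame({
    "id":   np.repeat(np.arange(n_sub), n_obs),
    "time": np.tile(np.arange(n_obs), n_sub),
})
slope = rng.normal(0.5, 0.3, n_sub)              # true subject-specific slopes
long_df["y"] = slope[long_df["id"]] * long_df["time"] + rng.normal(0, 0.5, len(long_df))

# Stage 1: random-intercept, random-slope mixed model for the biomarker
mdf = smf.mixedlm("y ~ time", long_df, groups=long_df["id"], re_formula="~time").fit()
re = pd.DataFrame(mdf.random_effects).T          # per-subject random effects
pred_slope = mdf.params["time"] + re["time"]     # fixed slope + random deviation

# Stage 2: survival model using the stage-1 predictions as a covariate
surv_df = pd.DataFrame({
    "T": rng.exponential(np.exp(-slope)),        # event times driven by the true slope
    "E": 1,                                      # all events observed in this toy
    "pred_slope": pred_slope.to_numpy(),
})
print(CoxPHFitter().fit(surv_df, duration_col="T", event_col="E").summary[["coef", "p"]])
```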
|
7 |
A Study of the Calibration Regression Model with Censored Lifetime Medical Cost
Lu, Min 03 August 2006
Medical costs have received increasing interest in biostatistics and public health. Statistical analysis and inference for lifetime medical cost are challenging because survival times are censored for some study subjects, whose subsequent costs are unknown. Huang (2002) proposed the calibration regression model, a semiparametric regression tool for studying the association between medical cost and covariates. In this thesis, an inference procedure is investigated using the empirical likelihood ratio method. Unadjusted and adjusted empirical likelihood confidence regions are constructed for the regression parameters. We compare the proposed empirical likelihood methods with the normal approximation based method. Simulation results show that the proposed empirical likelihood ratio method outperforms the normal approximation based method in terms of coverage probability. In particular, the adjusted empirical likelihood performs best, overcoming the undercoverage problem.
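As a toy illustration of the empirical likelihood ratio machinery, reduced to the simplest case of Owen's EL confidence interval for a univariate mean (the calibration-regression version replaces the mean constraint with estimating equations for the regression parameters):

```python
import numpy as np
from scipy.optimize import brentq

def neg2_log_elr(x, mu):
    """-2 log empirical likelihood ratio for the mean mu (Owen's EL)."""
    z = x - mu
    if z.min() >= 0 or z.max() <= 0:
        return np.inf                           # mu outside the convex hull of the data
    # profile the Lagrange multiplier: sum z_i / (1 + lam * z_i) = 0,
    # with 1 + lam * z_i > 0 so all implied weights stay positive
    lo = -1.0 / z.max() + 1e-8
    hi = -1.0 / z.min() - 1e-8
    lam = brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
    return 2.0 * np.sum(np.log(1.0 + lam * z))

rng = np.random.default_rng(11)
costs = rng.lognormal(3.0, 0.8, 60)             # fabricated lifetime medical costs
# 95% interval: mu values with -2 log ELR under the chi-square(1) cutoff 3.84
grid = np.linspace(costs.min() + 0.01, costs.max() - 0.01, 2000)
inside = [m for m in grid if neg2_log_elr(costs, m) < 3.84]
print("95% EL interval for mean cost:", round(min(inside), 1), round(max(inside), 1))
```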
|
8 |
Differences in age at breeding between two genetically different populations of brown trout (Salmo trutta).
Sjöström, Lars January 2019
Survival analysis is an effective tool for conservation studies, since it measures the risk of an event that is important for the survival of populations and the preservation of biodiversity. In this thesis, three different models for survival analysis are used to estimate the age at breeding in two genetically different populations of brown trout. These populations are an evolutionary enigma, since they apparently coexist in direct competition with each other, which according to ecological theory should not happen; it is therefore of interest whether differences between them can be identified. The data consist of brown trout observations collected over 20 years. The models are the Cox proportional hazards model, the complementary log-log link model and the log-logistic accelerated failure time model. The Cox model was estimated in three different ways because of nonproportional hazards in the estimates of time to breeding, which gave different interpretations of the same model. All of the models agree that population B breeds at younger ages than population A, which suggests that the two populations have different reproductive strategies.
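A minimal sketch of this kind of model comparison with the Python lifelines library, on fabricated stand-in data: fit a Cox model, test the proportional hazards assumption, then fit a log-logistic AFT model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, LogLogisticAFTFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(4)
n = 150
pop_b = rng.integers(0, 2, n)                       # 1 = population B
# log-logistic ages at breeding; population B breeds earlier on average
age = np.exp(1.6 - 0.3 * pop_b + 0.25 * rng.logistic(size=n))
last_seen = rng.uniform(2, 12, n)                   # right censoring at last capture
df = pd.DataFrame({
    "age": np.minimum(age, last_seen),
    "bred": (age <= last_seen).astype(int),
    "pop_b": pop_b,
})

cph = CoxPHFitter().fit(df, duration_col="age", event_col="bred")
print(proportional_hazard_test(cph, df, time_transform="rank").summary)
# small p-values above flag nonproportional hazards

llf = LogLogisticAFTFitter().fit(df, duration_col="age", event_col="bred")
llf.print_summary()  # a negative pop_b coefficient: B breeds at younger ages
```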
|
9 |
三要素混合模型於設限資料之願付價格分析 / A three-component mixture model in willingness-to-pay analysis for general interval censored data
蔡依倫 (Tsai, I-lun) Unknown Date
One commonly used method in contingent valuation (CV) surveys for WTP (willingness to pay) is the double-bounded dichotomous choice approach, and an implicit assumption is that all study subjects are willing to pay a reasonable price. However, for certain goods, some subjects may be willing to pay any price, while others may be unwilling to pay any price at all. Without accounting for these two types of extreme respondents, a biased estimate of WTP is obtained. We propose a mixture model to handle this issue, in which a multinomial logistic model specifies the proportions of the different respondent types and an accelerated failure time model describes the distribution of WTP for subjects who are willing to pay a reasonable price. In addition, an empirical example on WTP for a new hypertension treatment illustrates the proposed methods.
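A sketch of the interval-censored AFT likelihood at the core of this model, fitted on fabricated data: a lognormal AFT in which each double-bounded answer pins WTP into an interval (L, U]. The always-pay and never-pay mixture components are omitted here, so only the reasonable-price component is fitted.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(5)
n = 300
x = rng.integers(0, 2, n)                       # a single binary covariate
wtp = np.exp(3.0 + 0.5 * x + 0.6 * rng.normal(size=n))   # latent true WTP
bids = np.array([10.0, 20.0, 40.0, 80.0, 160.0, np.inf])
idx = np.searchsorted(bids, wtp)                # bid interval containing the true WTP
L = np.where(idx == 0, 1e-8, bids[np.maximum(idx - 1, 0)])
U = bids[np.minimum(idx, len(bids) - 1)]

def negloglik(theta):
    b0, b1, log_sigma = theta
    s = np.exp(log_sigma)
    mu = b0 + b1 * x
    # P(L < WTP <= U) under log(WTP) ~ Normal(mu, s^2)
    p = norm.cdf((np.log(U) - mu) / s) - norm.cdf((np.log(L) - mu) / s)
    return -np.sum(np.log(np.clip(p, 1e-300, None)))

fit = minimize(negloglik, x0=[1.0, 0.0, 0.0], method="Nelder-Mead")
b0, b1, log_s = fit.x
print("intercept, effect, sigma:", round(b0, 2), round(b1, 2), round(np.exp(log_s), 2))
```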
|
10 |
Estimação em modelos de tempo de falha acelerado para dados de sobrevivência correlacionados / Estimation in accelerated failure time models for correlated survival data
Santos, Patrícia Borchardt 01 December 2009
We present two methods of estimation for accelerated failure time models with random effects for handling correlated (grouped) survival data. The first method, implemented in the SAS software via the NLMIXED procedure, uses adaptive Gauss-Hermite quadrature to obtain the marginalized likelihood. The second method, implemented in the free software R, is based on the penalized likelihood method for estimating the model parameters. For the first method we describe the main theoretical aspects and, for the second, we briefly present the adopted approach together with a simulation study investigating the performance of the method. We apply the models to real data on the operating time of oil wells from the Potiguar Basin (RN/CE).
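A compact sketch of the first estimation route on simulated data: a lognormal AFT model with a shared normal random intercept per group, with the marginal likelihood approximated by Gauss-Hermite quadrature (the idea behind NLMIXED); uncensored times and a single covariate are simplifying assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(9)
n_grp, per = 40, 6
grp = np.repeat(np.arange(n_grp), per)
x = rng.integers(0, 2, n_grp * per)
b_true = rng.normal(0.0, 0.5, n_grp)            # random intercepts on the log-time scale
logt = 1.0 + 0.7 * x + b_true[grp] + 0.4 * rng.normal(size=n_grp * per)

nodes, weights = np.polynomial.hermite.hermgauss(15)   # physicists' Gauss-Hermite rule

def neg_marginal_loglik(theta):
    b0, b1, log_sig, log_tau = theta
    sig, tau = np.exp(log_sig), np.exp(log_tau)
    resid = logt - b0 - b1 * x
    total = 0.0
    for g in range(n_grp):
        r = resid[grp == g]
        # integral over b ~ N(0, tau^2) via the substitution b = sqrt(2) * tau * node
        dens = np.array([np.prod(norm.pdf(r - np.sqrt(2.0) * tau * nd, scale=sig))
                         for nd in nodes])
        total += np.log(weights @ dens / np.sqrt(np.pi) + 1e-300)
    return -total

fit = minimize(neg_marginal_loglik, x0=[0.0, 0.0, 0.0, 0.0], method="Nelder-Mead",
               options={"maxiter": 2000})
b0, b1, log_sig, log_tau = fit.x
print("b0, b1, sigma, tau:", round(b0, 2), round(b1, 2),
      round(np.exp(log_sig), 2), round(np.exp(log_tau), 2))
```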
|