221

Cure Rate and Destructive Cure Rate Models under Proportional Odds Lifetime Distributions

Feng, Tian January 2019 (has links)
Cure rate models, introduced by Boag (1949), are commonly used for modelling lifetime data involving long-term survivors. Applications of cure rate models can be found in biomedical science, industrial reliability, finance, manufacturing, demography and criminology. In this thesis, cure rate models are discussed under a competing-cause scenario, with the assumption of proportional odds (PO) lifetime distributions for the susceptibles, and statistical inferential methods are then developed based on right-censored data. In Chapter 2, a flexible cure rate model is discussed by assuming that the number of competing causes for the event of interest follows the Conway-Maxwell (COM) Poisson distribution and that the lifetimes of the non-cured or susceptible individuals are described by a PO model. This provides a natural extension of the work of Gu et al. (2011), who had considered a geometric number of competing causes. Under right censoring, maximum likelihood estimates (MLEs) are obtained by the use of the expectation-maximization (EM) algorithm. An extensive Monte Carlo simulation study is carried out for various scenarios, and model discrimination between some well-known cure models, such as the geometric, Poisson and Bernoulli, is also examined. Goodness-of-fit and model diagnostics are also discussed. A cutaneous melanoma dataset is used to illustrate the models as well as the inferential methods. Next, in Chapter 3, the destructive cure rate models, introduced by Rodrigues et al. (2011), are discussed under the PO assumption. Here, the initial number of competing causes is modelled by a weighted Poisson distribution, with special focus on the exponentially weighted Poisson, length-biased Poisson and negative binomial distributions. Then, a damage distribution is introduced for the number of initial causes that do not get destroyed. An EM-type algorithm for computing the MLEs is developed.
An extensive simulation study is carried out for various scenarios, and model discrimination between the three weighted Poisson distributions is also examined. A cutaneous melanoma dataset is again used to illustrate the models as well as the inferential methods. In Chapter 4, frailty cure rate models are discussed under a gamma frailty, wherein the initial number of competing causes is described by a COM-Poisson distribution and the lifetimes of the non-cured individuals are described by a PO model. The detailed steps of the EM algorithm are then developed for this model, and an extensive simulation study is carried out to evaluate the performance of the proposed model and the estimation method. A cutaneous melanoma dataset as well as a simulated dataset are used for illustrative purposes. Finally, Chapter 5 summarizes the work carried out in the thesis and suggests some problems of further research interest. / Thesis / Doctor of Philosophy (PhD)
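The EM scheme used throughout this record can be illustrated with a much simpler relative of these models: a Bernoulli (mixture) cure rate model with exponential lifetimes for the susceptibles, fitted to right-censored data. This is a toy sketch only, not the thesis's COM-Poisson/PO model; all names and the simulated data are illustrative.

```python
import math, random

def em_cure_exponential(times, events, n_iter=200):
    """EM for a Bernoulli (mixture) cure rate model: each subject is
    susceptible with probability p, and susceptibles have Exp(lam)
    lifetimes. times[i] is the observed time; events[i] is 1 for an
    observed event and 0 for right censoring."""
    p = 0.5
    lam = 1.0 / (sum(times) / len(times))  # crude starting values
    for _ in range(n_iter):
        # E-step: posterior probability that each subject is susceptible
        w = []
        for t, d in zip(times, events):
            if d == 1:
                w.append(1.0)            # an observed event implies susceptibility
            else:
                s = math.exp(-lam * t)   # susceptible survives past t
                w.append(p * s / ((1.0 - p) + p * s))
        # M-step: closed-form updates for p and lam
        p = sum(w) / len(w)
        lam = sum(events) / sum(wi * t for wi, t in zip(w, times))
    return p, lam

# simulate: 40% cured, susceptible lifetimes ~ Exp(2), censoring at time 3
random.seed(1)
times, events = [], []
for _ in range(4000):
    if random.random() < 0.6:            # susceptible
        t = random.expovariate(2.0)
        times.append(min(t, 3.0))
        events.append(1 if t < 3.0 else 0)
    else:                                # cured: always censored
        times.append(3.0)
        events.append(0)

p_hat, lam_hat = em_cure_exponential(times, events)
```

With this sample size the estimates land close to the true values (p near 0.6, lam near 2.0); the same E-step/M-step alternation underlies the COM-Poisson and destructive variants, with more elaborate conditional expectations.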
222

Inference for Gamma Frailty Models based on One-shot Device Data

Yu, Chenxi January 2024 (has links)
A one-shot device is a device that, owing to an irreversible chemical reaction or physical destruction, can no longer function properly after performing its intended function. One-shot device test data differ from typical data obtained by measuring lifetimes in standard life-tests. Due to the very nature of one-shot devices, the actual lifetimes of the devices under test cannot be observed; they are either left- or right-censored. In addition, a one-shot device often has multiple components that could cause the failure of the device. The components are coupled together in the manufacturing process or assembly, so the failure modes possess latent heterogeneity and dependence. Frailty models enable us to describe the influence of common but unobservable covariates on the hazard function as a random effect in a model, and also provide an easily understandable interpretation. In this thesis, we develop some inferential results for one-shot device testing data under a gamma frailty model. We first develop an efficient expectation-maximization (EM) algorithm for determining the maximum likelihood estimates of the parameters of a gamma frailty model with exponential lifetime distributions for the components, based on one-shot device test data with multiple failure modes, wherein the data are obtained from a constant-stress accelerated life-test. The maximum likelihood estimate of the mean lifetime of $k$-out-of-$M$ structured one-shot devices under normal operating conditions is also presented. In addition, the asymptotic variance–covariance matrix of the maximum likelihood estimates is derived, which is then used to construct asymptotic confidence intervals for the model parameters. The performance of the proposed inferential methods is evaluated through Monte Carlo simulations and then illustrated with a numerical example.
A gamma frailty model with Weibull baseline hazards is considered next for fitting one-shot device testing data. The Weibull baseline hazards enable us to analyze time-varying failure rates more accurately, allowing for a deeper understanding of the dynamic nature of a system's reliability. We develop an EM algorithm for estimating the model parameters utilizing the complete likelihood function. A detailed simulation study compares the performance of the Weibull baseline hazard model with that of the exponential baseline hazard model. The introduction of shape parameters in the components' lifetime distribution within the Weibull baseline hazard model offers enhanced flexibility in model fitting. Finally, Bayesian inference is developed for the gamma frailty model with exponential baseline hazard for one-shot device testing data. We introduce a Bayesian estimation procedure using Markov chain Monte Carlo (MCMC) techniques for estimating the model parameters as well as for developing credible intervals for those parameters. The performance of the proposed method is evaluated in a simulation study. Model comparison between the independence model and the frailty model is made using a Bayesian model selection criterion. / Thesis / Candidate in Philosophy
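The defining feature of one-shot data is that only the pass/fail status at the inspection time is observed, never the lifetime itself. The simplest version of the resulting likelihood, a single exponential failure mode with no frailty, has a closed-form MLE, sketched below. This is a hedged illustration of the data structure only, not the thesis's multi-mode gamma frailty model; the function name is made up.

```python
import math

def oneshot_mle_exponential(n_tested, n_failed, tau):
    """MLE of an exponential rate from one-shot device data: each of
    n_tested devices is inspected once at time tau, and we observe only
    whether it had failed by then (so every lifetime is left- or
    right-censored). Since P(fail by tau) = 1 - exp(-lam * tau), the
    binomial MLE of that probability inverts to a closed-form rate."""
    phat = n_failed / n_tested            # MLE of the failure probability
    return -math.log(1.0 - phat) / tau    # invert 1 - exp(-lam * tau) = phat

# 39 of 100 devices had failed by the inspection time tau = 2.0
lam = oneshot_mle_exponential(100, 39, 2.0)
```

The frailty extension replaces the exponential survival term with its gamma mixture, which is why the closed form disappears and an EM algorithm is needed.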
223

Confiabilidade em sistemas coerentes: um modelo bayesiano Weibull. / Reliability in coherent systems: a Bayesian Weibull model

Bhering, Felipe Lunardi 28 June 2013 (has links)
The main purpose of this work is to introduce a general hierarchical Bayesian Weibull model for censored data that estimates the reliability function of each component in coherent reliability systems. More robust estimation procedures are introduced, which avoid plugging mean estimates into the reliability functions (the plug-in estimator). Using this model, examples from the reliability area are presented and solved, such as series systems, parallel systems, k-out-of-n systems, bridge systems, and a clinical study with interval-censored data. The solutions allow the components to have different distributions, a case for which the bridge system previously had no solution in the literature. The model is general and can be used for any coherent system, and not only for reliability data but also for survival data, among others.
Several simulations, with components having different censoring proportions, distinct means, three types of distributions, and various sample sizes, were carried out on all the systems to evaluate the effectiveness of the model.
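The coherent systems treated here (series, parallel, k-out-of-n, bridge) are all defined by a structure function of the component states, and for independent components the system reliability can be computed exactly by enumerating those states. A minimal sketch under that independence assumption, with nothing of the thesis's Bayesian Weibull machinery; `bridge_works` encodes the classic five-component bridge via its minimal path sets:

```python
from itertools import product

def bridge_works(x):
    # classic 5-component bridge: minimal path sets
    # {1,4}, {2,5}, {1,3,5}, {2,3,4}
    x1, x2, x3, x4, x5 = x
    return (x1 and x4) or (x2 and x5) or (x1 and x3 and x5) or (x2 and x3 and x4)

def system_reliability(p, works):
    """Exact reliability of a coherent system with independent
    components, by enumerating all 2^n component state vectors.
    p[i] is the reliability of component i; works(x) is the
    structure function (truthy iff the system functions)."""
    r = 0.0
    for x in product([0, 1], repeat=len(p)):
        if works(x):
            prob = 1.0
            for pi, xi in zip(p, x):
                prob *= pi if xi else (1.0 - pi)
            r += prob
    return r

R = system_reliability([0.9] * 5, bridge_works)
```

For identical components the enumeration reproduces the textbook bridge formula 2p^2 + 2p^3 - 5p^4 + 2p^5; the Bayesian model in this record instead propagates posterior draws of each component's Weibull reliability through such a structure function, rather than plugging in point estimates.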
224

狀態轉換下利率與跳躍風險股票報酬之歐式選擇權評價與實證分析 / Option Pricing and Empirical Analysis for Interest Rate and Stock Index Return with Regime-Switching Model and Dependent Jump Risks

巫柏成, Wu, Po Cheng Unknown Date (has links)
To model asset returns, Chen, Chang, Wen and Lin (2013) proposed the Markov-modulated jump diffusion model (MMJDM), in which the Brownian motion term and the jump frequency both depend on the market state. Since the interest rate is not constant, this thesis fits the dynamics of zero-coupon bond prices with a regime-switching model and proposes a bivariate model for the interest rate and stock index returns with regime switching and dependent jump risks (MMJDMSI). The empirical data are the Dow Jones Industrial Average and the S&P 500 Index from 1999 to 2013, together with one-year US Treasury bond prices over the same period. Model parameters are estimated by the expectation-maximization (EM) algorithm. Likelihood ratio tests comparing the nested models show that, for both the Dow Jones Industrial Average and the S&P 500 Index, the MMJDMSI describes the returns better than the alternatives. European call option pricing formulas for the stock index are then derived under each model via the Esscher transform, and a sensitivity analysis is conducted to evaluate how changes in the parameter values affect the MMJDMSI pricing formula. Finally, each model is calibrated to the empirical data and implied volatilities are computed; in the in-the-money and out-of-the-money cases, the MMJDMSI has either the smallest or the second-smallest pricing error, and the model also reproduces the volatility smile.
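Regime-switching pricing formulas of this kind are typically built from conditional prices of the Black-Scholes type, averaged over regime paths, and implied volatilities are extracted by inverting such a formula. As a hedged building block only (this is plain Black-Scholes, not the MMJDMSI formula), a minimal European call price:

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes European call price (no dividends).
    S: spot, K: strike, r: risk-free rate, sigma: volatility,
    T: time to maturity in years."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma * sigma) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# at-the-money call: S = K = 100, r = 5%, sigma = 20%, one year
price = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
```

An implied volatility is the sigma that makes a formula like this match an observed market price; the smile appears when that inversion yields different sigmas across strikes, which is the pattern the MMJDMSI is reported to capture.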
226

Joint models for longitudinal and survival data

Yang, Lili 11 July 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Epidemiologic and clinical studies routinely collect longitudinal measures of multiple outcomes. These longitudinal outcomes can be used to establish the temporal order of relevant biological processes and their association with the onset of clinical symptoms. In the first part of this thesis, we proposed using bivariate change point models for two longitudinal outcomes, with a focus on estimating the correlation between the two change points. We adopted a Bayesian approach for parameter estimation and inference. In the second part, we considered the situation in which a time-to-event outcome is also collected along with multiple longitudinal biomarkers measured until the occurrence of the event or censoring. Joint models for longitudinal and time-to-event data can be used to estimate the association between the characteristics of the longitudinal measures over time and survival time. We developed a maximum-likelihood method to jointly model multiple longitudinal biomarkers and a time-to-event outcome. In addition, we focused on predicting conditional survival probabilities and evaluating the predictive accuracy of multiple longitudinal biomarkers in the joint modeling framework. We assessed the performance of the proposed methods in simulation studies and applied the new methods to data sets from two cohort studies. / National Institutes of Health (NIH) Grants R01 AG019181, R24 MH080827, P30 AG10133, R01 AG09956.
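The conditional survival probabilities emphasized in this record have a simple general form: given survival to time t, the probability of surviving s more units is S(t + s) / S(t). A minimal sketch with a Weibull survival function standing in for the fitted joint model (the parameter values are illustrative, not from the thesis):

```python
import math

def weibull_surv(t, shape, scale):
    """Weibull survival function S(t) = exp(-(t / scale) ** shape)."""
    return math.exp(-((t / scale) ** shape))

def cond_surv(t, s, shape, scale):
    """P(T > t + s | T > t) = S(t + s) / S(t): the probability of
    surviving s additional units given survival to time t."""
    return weibull_surv(t + s, shape, scale) / weibull_surv(t, shape, scale)

# probability of surviving one more unit given survival to t = 2
prob = cond_surv(2.0, 1.0, 1.5, 3.0)
```

In the joint modeling framework, S is replaced by the subject-specific survival function implied by the fitted longitudinal trajectories, and these conditional probabilities are updated dynamically as new biomarker measurements arrive.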
