81

Some Inferential Results for One-Shot Device Testing Data Analysis

So, Hon Yiu January 2016 (has links)
In this thesis, we develop some inferential results for one-shot device testing data analysis, extending and generalizing existing methods in the literature. First, a competing-risks model is introduced for one-shot device testing data under accelerated life-tests. One-shot devices are products that are destroyed immediately after use, so we observe only a binary status, success or failure, rather than an actual lifetime. Many one-shot devices contain multiple components, and the failure of any one of them leads to the failure of the device; failed devices are inspected to identify the specific cause of failure. Since exact lifetimes are not observed, the EM algorithm becomes a natural tool for obtaining the maximum likelihood estimates of the model parameters. Here, we develop the EM algorithm for competing exponential and Weibull risks. Second, a semi-parametric approach is developed for simple one-shot device testing data. A semi-parametric model consists of parametric and non-parametric components; here, we only assume that the hazards at different stress levels are proportional to each other, with no distributional assumption on the lifetimes. This provides greater flexibility in model fitting and enables us to examine the relationship between the reliability of the devices and the stress factors. Third, Bayesian inference is developed for one-shot device testing data under exponential distributions, and under Weibull distributions with non-constant shape parameters, for competing risks. The Bayesian framework provides statistical inference from another perspective: it treats the model parameters as random and improves the inference by incorporating expert experience as prior information. This approach proves very useful when failure observations are limited and the maximum likelihood estimator may not exist. The thesis proceeds as follows. In Chapter 2, we assume that the one-shot devices have two components whose lifetimes follow exponential distributions with multiple stress factors, and we develop an EM algorithm for likelihood inference on the model parameters as well as some useful reliability characteristics. In Chapter 3, we generalize to the situation in which the lifetimes follow Weibull distributions with non-constant shape parameters. In Chapter 4, we propose a semi-parametric model for simple one-shot device test data based on the proportional hazards model and develop the associated inferential results. In Chapter 5, we consider the competing-risks model with exponential lifetimes and develop inference by adopting the Bayesian approach. In Chapter 6, we generalize these results on Bayesian inference to the situation in which the lifetimes have Weibull distributions. Finally, we provide some concluding remarks and indicate some future research directions in Chapter 7. / Thesis / Doctor of Philosophy (PhD)
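To make the role of the EM algorithm concrete, here is a minimal sketch for a deliberately simplified case: a single exponential component with one-shot (binary) observations, rather than the competing-risks, multi-stress model of the thesis. The E-step imputes each unobserved lifetime by its conditional expectation given the pass/fail status, and the M-step is the complete-data exponential MLE; the function name and all numerical values below are illustrative assumptions.

```python
import numpy as np

def em_one_shot_exponential(tau, failed, lam0=1.0, n_iter=200):
    """EM estimate of an exponential rate from one-shot device data.

    tau    : inspection times of the devices
    failed : 1 if the device had failed by tau, 0 if it was still working
    Only the binary status is observed; the E-step imputes the expected
    lifetime given that status, the M-step is the complete-data MLE.
    """
    tau = np.asarray(tau, float)
    failed = np.asarray(failed, int)
    lam = lam0
    for _ in range(n_iter):
        # E-step: conditional expected lifetimes under the current rate
        surv = np.exp(-lam * tau)
        e_failed = 1.0 / lam - tau * surv / (1.0 - surv)   # E[T | T <= tau]
        e_alive = tau + 1.0 / lam                          # E[T | T >  tau]
        expected_T = np.where(failed == 1, e_failed, e_alive)
        # M-step: complete-data MLE of the exponential rate
        lam = len(tau) / expected_T.sum()
    return lam

# Illustrative data: true rate 0.5, inspection times between 0.5 and 3
rng = np.random.default_rng(0)
tau = rng.uniform(0.5, 3.0, size=500)
lifetimes = rng.exponential(scale=2.0, size=500)   # mean 2 => rate 0.5
failed = (lifetimes <= tau).astype(int)
print(em_one_shot_exponential(tau, failed))        # should be near 0.5
```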
82

Méthodes accélérées de Monte-Carlo pour la simulation d'événements rares. Applications aux Réseaux de Petri / Fast Monte Carlo methods for rare event simulation. Applications to Petri nets

Estecahandy, Maïder 18 April 2016 (has links)
The dependability analysis of safety instrumented systems is an important industrial concern. To carry out such safety studies, TOTAL has been developing the dependability software GRIF since the 1980s. To take into account the increasing complexity of the operating context of its safety equipment, TOTAL is increasingly led to use the engine MOCA-RP of the GRIF Simulation package. MOCA-RP estimates quantities associated with complex aging systems modeled as Petri nets by means of standard Monte Carlo (MC) simulation. Nevertheless, deriving accurate estimators, such as the system unavailability, for very reliable systems involves rare-event simulation, which requires very long computing times with MC. The common fast Monte Carlo methods do not seem appropriate for addressing this issue: most of them were originally designed to improve only the estimation of the unreliability and/or are well suited only to Markovian processes. Therefore, the work accomplished in this thesis pertains to the development of acceleration methods adapted to safety studies that are modeled with Petri nets and that estimate, in particular, the unavailability.
More specifically, we propose the Extension of the "Méthode de Conditionnement Temporel" (temporal conditioning method) to accelerate the individual failures of the components, and we introduce the Dissociation Method as well as the Truncated Fixed Effort Method to increase the occurrence of their simultaneous failures. We then combine the first technique with the two others, and we also associate them with the randomized Quasi-Monte Carlo method. Through various sensitivity studies and benchmark experiments, we assess the performance of the acceleration methods and observe a significant improvement of the results compared with MC. Furthermore, we discuss the choice of the confidence-interval method to be used in rare-event simulation, an unfamiliar topic in the field of dependability. Finally, an application to an industrial case illustrates the feasibility and potential of our methodology.
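For readers unfamiliar with rare-event simulation, the sketch below contrasts crude Monte Carlo with a generic importance-sampling estimator for a very small failure probability. It is a textbook baseline, not the conditioning, dissociation, or truncated-fixed-effort methods of the thesis, and the failure rate and mission time are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t = 1e-4, 0.1                      # failure rate and mission time (illustrative)
p_true = 1.0 - np.exp(-lam * t)         # about 1e-5
N = 100_000

# Crude Monte Carlo: almost no sample ever sees the rare failure
T = rng.exponential(1.0 / lam, N)
p_mc = np.mean(T <= t)

# Importance sampling: draw from a much faster exponential and reweight
lam_is = 1.0 / t
T_is = rng.exponential(1.0 / lam_is, N)
w = (lam * np.exp(-lam * T_is)) / (lam_is * np.exp(-lam_is * T_is))
hits = (T_is <= t) * w
p_is, se_is = hits.mean(), hits.std(ddof=1) / np.sqrt(N)

print(f"true {p_true:.3e}  crude MC {p_mc:.3e}  IS {p_is:.3e} +/- {se_is:.1e}")
```

With the same budget N, the importance-sampling estimator typically has a standard error several orders of magnitude smaller than crude Monte Carlo, which is exactly the gap the acceleration methods above aim to close in the Petri-net setting.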
83

可加性模型與拔靴法在臺灣地區小型商用車市場需求之應用研究 / An Applied Study of Additive Models and the Bootstrap for Small Commercial Vehicle Market Demand in Taiwan

呂明哲, Lu, Ming Che Unknown Date (has links)
This thesis uses additive models to build a demand model for the small commercial vehicle market in Taiwan, and introduces a Box-Jenkins time-series model to handle autocorrelated error terms, so that the bootstrap inference can bootstrap the white noise, i.e., resample from the empirical distribution of the white-noise innovations. In this study, apart from fitting the Box-Jenkins time-series model, all analysis steps are fully automatic and require no assumption-making or testing, which reduces the estimation bias traditionally caused by analysts' subjective judgment errors. Additive models remove the restriction of traditional regression models that a functional form must be assumed in advance; in the empirical analysis of commercial vehicles, smooth functions are fitted directly from the data, which demonstrates their appropriateness. The bootstrap frees the inference procedure from the traditional requirement that the random disturbances be normally or asymptotically normally distributed, instead simulating the behavior of the disturbances from the empirical distribution of the residuals, and it also yields good results in inference for the commercial vehicle market.
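A rough sketch of the "bootstrap the white noise" idea described above, with a simple moving-average smoother standing in for a full additive-model fit and an AR(1) error model standing in for a general Box-Jenkins model; the series, window width, and AR coefficient are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

def smooth(y, window=5):
    """Crude stand-in for an additive-model fit: centered moving average."""
    kernel = np.ones(window) / window
    pad = window // 2
    ypad = np.r_[np.full(pad, y[0]), y, np.full(pad, y[-1])]
    return np.convolve(ypad, kernel, mode="valid")

# Illustrative monthly demand series with a trend and AR(1) noise
n = 120
trend = 50 + 0.3 * np.arange(n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.6 * eps[t - 1] + rng.normal(0, 2)
y = trend + eps

fit = smooth(y)                                    # "additive model" fit
resid = y - fit
phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]     # crude AR(1) estimate
white = resid[1:] - phi * resid[:-1]               # estimated innovations

# Bootstrap the white noise, rebuild AR(1) errors, and refit each series
B = 500
boot_fits = np.empty((B, n))
for b in range(B):
    w = rng.choice(white - white.mean(), size=n, replace=True)
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = phi * e[t - 1] + w[t]
    boot_fits[b] = smooth(fit + e)

lower, upper = np.percentile(boot_fits, [2.5, 97.5], axis=0)
print(lower[60], fit[60], upper[60])   # pointwise 95% band at one time point
```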
84

Efron’s Method on Large Scale Correlated Data and Its Refinements

Ghoshal, Asmita 11 August 2023 (has links)
No description available.
85

Application Of The Empirical Likelihood Method In Proportional Hazards Model

He, Bin 01 January 2006 (has links)
In survival analysis, proportional hazards models are the most commonly used, and the Cox model is the most popular among them. These models were developed to facilitate statistical analyses frequently encountered in medical research and reliability studies. In analyzing real data sets, checking the validity of the model assumptions is a key component. However, the presence of complicated types of censoring, such as double censoring and partly interval-censoring, makes model assessment difficult, and the existing goodness-of-fit tests do not extend directly to these complicated types of censored data. In this work, we use the empirical likelihood approach (Owen, 1988) to construct goodness-of-fit tests and provide estimates for the Cox model with various types of censored data. Specifically, the problems under consideration are the two-sample Cox model and the stratified Cox model with right-censored data, doubly censored data, and partly interval-censored data. Related computational issues are discussed, and some simulation results are presented. The procedures developed in this work are applied to several real data sets with some discussion.
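As background for the approach cited above, here is a minimal sketch of Owen's (1988) empirical likelihood ratio for a population mean — the uncensored building block, not the censored-data Cox-model extensions developed in the thesis; the sample and the grid are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def neg2_log_elr(x, mu):
    """-2 log empirical likelihood ratio for the mean mu (Owen, 1988)."""
    d = np.asarray(x, float) - mu
    if d.min() >= 0.0 or d.max() <= 0.0:
        return np.inf                      # mu outside the data's convex hull
    # Lagrange multiplier: sum d_i / (1 + lam * d_i) = 0 on the valid interval
    lo = -1.0 / d.max() * (1 - 1e-8)
    hi = -1.0 / d.min() * (1 - 1e-8)
    lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * d))

rng = np.random.default_rng(3)
x = rng.exponential(2.0, size=80)          # illustrative skewed sample

# 95% EL confidence interval: all mu with -2 log R(mu) <= chi-square(1) quantile
cut = chi2.ppf(0.95, df=1)
grid = np.linspace(x.min(), x.max(), 2000)
inside = [m for m in grid if neg2_log_elr(x, m) <= cut]
print(min(inside), max(inside))            # EL interval for the mean
```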
86

Some Contributions to Inferential Issues of Censored Exponential Failure Data

Han, Donghoon 06 1900 (has links)
In this thesis, we investigate several inferential issues regarding lifetime data from the exponential distribution under different censoring schemes. For reasons of time constraints and cost reduction, censored sampling is commonly employed in practice, especially in reliability engineering. Among various censoring schemes, progressive Type-I censoring provides not only the practical advantage of a known termination time but also greater flexibility to the experimenter in the design stage by allowing the removal of test units at non-terminal time points. Hence, we first consider the inference for a progressively Type-I censored life-testing experiment with k uniformly spaced intervals. For small to moderate sample sizes, a practical modification is proposed to the censoring scheme in order to guarantee a feasible life-test under progressive Type-I censoring. Under this setup, we obtain the maximum likelihood estimator (MLE) of the unknown mean parameter and derive the exact sampling distribution of the MLE through the use of the conditional moment generating function, conditional on the existence of the MLE. Using the exact distribution of the MLE as well as its asymptotic distribution and the parametric bootstrap method, we discuss the construction of confidence intervals for the mean parameter, and their performance is then assessed through Monte Carlo simulations. Next, we consider a special class of accelerated life tests, known as step-stress tests, in reliability testing. In a step-stress test, the stress levels increase discretely at pre-fixed time points, which allows the experimenter to obtain information on the parameters of the lifetime distributions more quickly than under normal operating conditions. Here, we consider a k-step-stress accelerated life-testing experiment with an equal step duration τ. In particular, the case of progressively Type-I censored data with a single stress variable is investigated. For small to moderate sample sizes, we introduce another practical modification to the model for a feasible k-step-stress test under progressive censoring, and the optimal τ is determined using the modified model. Next, we seek the optimal τ under the condition that the step-stress test proceeds to the k-th stress level, and the efficiency of this conditional inference is compared with that of the preceding models. In all cases, censoring is allowed at each stress-change point iτ, i = 1, 2, ..., k, and the problem of selecting the optimal τ is discussed using C-optimality, D-optimality, and A-optimality criteria. Moreover, when a test unit fails, there is often more than one fatal cause of failure, such as mechanical or electrical. Thus, we also consider the simple step-stress models under Type-I and Type-II censoring when the lifetime distributions corresponding to the different risk factors are independently exponentially distributed. Under this setup, we derive the MLEs of the unknown mean parameters of the different causes under the assumption of a cumulative exposure model. The exact distributions of the MLEs are then derived through the use of conditional moment generating functions. Using these exact distributions as well as the asymptotic distributions and the parametric bootstrap method, we discuss the construction of confidence intervals for the parameters and assess their performance through Monte Carlo simulations. / Thesis / Doctor of Philosophy (PhD)
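A stripped-down illustration of one calculation mentioned above: the MLE of an exponential mean under ordinary (non-progressive, non-step-stress) Type-I censoring at a fixed time, with a parametric-bootstrap confidence interval. The parameter values and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def mle_exp_type1(lifetimes, tau):
    """MLE of the exponential mean under Type-I censoring at time tau.

    Total time on test divided by the number of observed failures;
    the MLE does not exist when no failure is observed.
    """
    obs = np.minimum(lifetimes, tau)
    d = np.sum(lifetimes <= tau)
    return obs.sum() / d if d > 0 else np.nan

theta, tau, n = 10.0, 8.0, 60          # illustrative mean, censoring time, size
data = rng.exponential(theta, n)
theta_hat = mle_exp_type1(data, tau)

# Parametric bootstrap confidence interval for the mean
B = 2000
boot = np.array([mle_exp_type1(rng.exponential(theta_hat, n), tau)
                 for _ in range(B)])
boot = boot[~np.isnan(boot)]           # drop resamples with no failures
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"MLE {theta_hat:.2f}, 95% bootstrap CI ({lo:.2f}, {hi:.2f})")
```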
87

壽險公司責任準備金涉險值之估計 / The Estimation of Value at Risk for the Reserve of Life/Health Insurance Company

詹志清, Chihching Chan Unknown Date (has links)
In this thesis, we estimate the Value at Risk (VaR) of a life insurer's terminal reserve for the first policy year from simulated risk factors over the following twenty years, including mortality risk, interest rate risk, lapse rate risk, and parameter estimation risk. Although only an endowment policy is analyzed here, the approach carries over readily to other product lines and, once the asset side is incorporated, to the VaR of an insurer's surplus, which could serve as a solvency monitoring system. We find that the VaR under interest rate risk is much larger than that under mortality risk, because the interest rate is modeled as a stochastic process while the mortality rate is not, so its dispersion is greater. In addition, the VaR decreases considerably once lapses are taken into account, because the duration of the reserve is reduced; neglecting lapses therefore leads to a significant overestimate of the VaR. The features of this thesis are as follows. First, we provide an approach to measuring the VaR of a life insurer's reserve that is rather different from traditional short-horizon VaR. Second, we use a mortality table to estimate the VaR arising from mortality risk. Third, we use a stochastic interest rate model to capture the effect of random interest rates on the reserve's VaR. Fourth, our lapse model relates the lapse rate, and hence the future cash outflows, to the interest rate, producing a more reasonable VaR estimator. Fifth, we consider the effect of parameter estimation errors on the VaR. Finally, we compute nonparametric confidence intervals for the VaR estimates, which are particularly useful for deciding how many simulations are required. This thesis consists of six sections. The first section is an introduction. In the second section, we present the method used to estimate the variance of the mortality rate and then estimate the VaR of the reserves from these variances. In the third section, we explore how to use a stochastic interest rate model to estimate the reserve's VaR and the VaR associated with the parameter risk of the interest rate model. In the fourth section, we analyze the contributions of the lapse rate risk and the parameter risk of the lapse rate model to the reserve's VaR, and we compare the relative significance of the interest rate risk, the lapse rate risk, and the mortality rate risk in terms of their marginal contributions. In the fifth section, we calculate the confidence intervals of the VaR estimates discussed in the previous sections. The last section contains our conclusions and discussions of potential future research.
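To make the last two features concrete, the sketch below computes a simulation-based VaR as an empirical quantile and attaches a distribution-free confidence interval built from order statistics; the lognormal "loss" model is only a placeholder for the reserve simulation of the thesis, and all numbers are illustrative assumptions.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(5)

# Placeholder "loss" simulation: the present value of the liability under
# random interest, lapse, and decrement scenarios would go here.
n = 20_000
losses = rng.lognormal(mean=3.0, sigma=0.6, size=n)

p = 0.99                                  # VaR confidence level
x = np.sort(losses)
var_hat = x[int(np.ceil(n * p)) - 1]      # empirical p-quantile as the VaR

# Distribution-free 95% CI for the p-quantile via binomial order statistics:
# the number of losses below the true quantile is Binomial(n, p).
r = int(binom.ppf(0.025, n, p))           # lower order-statistic index
s = int(binom.ppf(0.975, n, p)) + 1       # upper order-statistic index
coverage = binom.cdf(s - 1, n, p) - binom.cdf(r - 1, n, p)
print(f"VaR {var_hat:.2f}, 95% CI ({x[r - 1]:.2f}, {x[s - 1]:.2f}), "
      f"exact coverage {coverage:.3f}")
```

The width of such an interval shrinks with the number of simulated scenarios, which is why it is a natural tool for deciding how many simulations are enough.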
88

Sur les familles des lois de fonction de hasard unimodale : applications en fiabilité et analyse de survie / On families of distributions with a unimodal hazard function: applications in reliability and survival analysis

Saaidia, Noureddine 24 June 2013 (has links)
In reliability and survival analysis, not many distributions have a unimodal (∩-shaped) hazard rate function; they include the inverse Gaussian, log-normal, log-logistic, Birnbaum-Saunders, exponential Weibull, and power generalized Weibull distributions. In this thesis, we develop modified chi-squared goodness-of-fit tests for these distributions and present a comparative study, supported by simulations, between the inverse Gaussian distribution and the others. We also construct an AFT model based on the inverse Gaussian distribution and study redundant systems based on distributions with a unimodal hazard rate function.
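The sketch below simply evaluates the hazard rate h(t) = f(t)/S(t) for two of the families named above, illustrating the unimodal shape that motivates the thesis; the parameter values and grid are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import invgauss, lognorm

t = np.linspace(0.01, 6.0, 400)

def hazard(dist):
    """Hazard rate h(t) = f(t) / S(t) evaluated on the grid t."""
    return dist.pdf(t) / dist.sf(t)

# Arbitrary parameter choices that give clearly unimodal hazards
h_ig = hazard(invgauss(mu=1.5))          # inverse Gaussian
h_ln = hazard(lognorm(s=0.8))            # log-normal

for name, h in [("inverse Gaussian", h_ig), ("log-normal", h_ln)]:
    peak = t[np.argmax(h)]
    print(f"{name:>16}: hazard rises to a peak near t = {peak:.2f}, then falls")
```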
89

Statistical Inference

Chou, Pei-Hsin 26 June 2008 (has links)
In this paper, we investigate the important properties of the three major parts of statistical inference: point estimation, interval estimation, and hypothesis testing. For point estimation, we consider two methods of finding estimators, moment estimators and maximum likelihood estimators, and three methods of evaluating estimators: mean squared error, best unbiased estimators, and sufficiency and unbiasedness. For interval estimation, we consider the general confidence interval, confidence intervals in one sample, confidence intervals in two samples, sample sizes, and finite population correction factors. In hypothesis testing, we consider the theory of testing hypotheses, testing in one sample, testing in two samples, and three methods of finding tests: the uniformly most powerful test, the likelihood ratio test, and the goodness-of-fit test. Many examples are used to illustrate their applications.
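As a small companion to the point-estimation discussion, the snippet below contrasts moment estimators with maximum likelihood estimators on a gamma sample; the choice of distribution, parameter values, and sample size is an illustrative assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
shape_true, scale_true = 3.0, 2.0
x = rng.gamma(shape_true, scale_true, size=500)

# Method of moments: match the sample mean and variance to the gamma's
m, v = x.mean(), x.var(ddof=1)
shape_mm = m**2 / v
scale_mm = v / m

# Maximum likelihood via scipy, with the location parameter fixed at 0
shape_ml, _, scale_ml = stats.gamma.fit(x, floc=0)

print(f"moments: shape {shape_mm:.2f}, scale {scale_mm:.2f}")
print(f"MLE    : shape {shape_ml:.2f}, scale {scale_ml:.2f}")
```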
