111

Cure Rate and Destructive Cure Rate Models under Proportional Odds Lifetime Distributions

Feng, Tian January 2019 (has links)
Cure rate models, introduced by Boag (1949), are commonly used to model lifetime data involving long-term survivors. Applications of cure rate models can be seen in biomedical science, industrial reliability, finance, manufacturing, demography and criminology. In this thesis, cure rate models are discussed under a competing cause scenario, with the assumption of proportional odds (PO) lifetime distributions for the susceptibles, and statistical inferential methods are then developed based on right-censored data. In Chapter 2, a flexible cure rate model is discussed by assuming that the number of competing causes for the event of interest follows the Conway-Maxwell-Poisson (COM-Poisson) distribution and that the lifetimes of the non-cured (susceptible) individuals are described by a PO model. This provides a natural extension of the work of Gu et al. (2011), who had considered a geometric number of competing causes. Under right censoring, maximum likelihood estimators (MLEs) are obtained by the use of an expectation-maximization (EM) algorithm. An extensive Monte Carlo simulation study is carried out for various scenarios, and model discrimination among some well-known cure rate models, such as the geometric, Poisson and Bernoulli, is also examined. Goodness-of-fit and model diagnostics are also discussed. A cutaneous melanoma dataset is used to illustrate the models as well as the inferential methods. Next, in Chapter 3, the destructive cure rate models, introduced by Rodrigues et al. (2011), are discussed under the PO assumption. Here, the initial number of competing causes is modelled by a weighted Poisson distribution, with special focus on the exponentially weighted Poisson, length-biased Poisson and negative binomial distributions. Then, a damage distribution is introduced for the number of initial causes that do not get destroyed. An EM-type algorithm for computing the MLEs is developed. An extensive simulation study is carried out for various scenarios, and model discrimination among the three weighted Poisson distributions is also examined. All the models and estimation methods are evaluated through this simulation study. A cutaneous melanoma dataset is again used to illustrate the models as well as the inferential methods. In Chapter 4, frailty cure rate models are discussed under a gamma frailty, wherein the initial number of competing causes is described by a COM-Poisson distribution and the lifetimes of non-cured individuals are described by a PO model. The detailed steps of the EM algorithm are then developed for this model, and an extensive simulation study is carried out to evaluate the performance of the proposed model and the estimation method. A cutaneous melanoma dataset as well as simulated data are used for illustrative purposes. Finally, Chapter 5 outlines the work carried out in the thesis and also suggests some problems of further research interest. / Thesis / Doctor of Philosophy (PhD)
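To make the model structure concrete, the following is a minimal sketch (illustrative only; the parameter names and the log-logistic baseline are assumptions, not the thesis code) of the population survival function of a COM-Poisson cure rate model with PO lifetimes for the susceptibles, where the cure fraction equals 1/Z(phi, nu):

```python
# Sketch of a COM-Poisson cure rate model with proportional odds (PO) lifetimes.
# Parameter names (phi, nu, beta) and the log-logistic baseline are illustrative assumptions.
import numpy as np

def com_poisson_norm(phi, nu, j_max=200):
    """Normalizing constant Z(phi, nu) = sum_j phi^j / (j!)^nu, truncated at j_max."""
    j = np.arange(j_max + 1)
    log_fact = np.cumsum(np.concatenate(([0.0], np.log(np.arange(1, j_max + 1)))))
    return np.exp(j * np.log(phi) - nu * log_fact).sum()

def po_survival(t, x, beta, baseline_cdf):
    """PO model: the odds of failure scale by exp(x'beta) relative to the baseline odds."""
    F0 = baseline_cdf(t)
    S0 = 1.0 - F0
    return S0 / (S0 + np.exp(x @ beta) * F0)

def population_survival(t, x, beta, phi, nu, baseline_cdf):
    """S_pop(t) = Z(phi * S(t), nu) / Z(phi, nu); the cure fraction is 1 / Z(phi, nu)."""
    S = po_survival(t, x, beta, baseline_cdf)
    return com_poisson_norm(phi * S, nu) / com_poisson_norm(phi, nu)

# Toy usage: log-logistic baseline, a single binary covariate
baseline = lambda t: t**2 / (1.0 + t**2)
print(population_survival(t=1.5, x=np.array([1.0]), beta=np.array([0.3]),
                          phi=1.2, nu=0.8, baseline_cdf=baseline))
```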
112

Inference for Gamma Frailty Models based on One-shot Device Data

Yu, Chenxi January 2024 (has links)
A one-shot device is a device whose operation involves an irreversible chemical reaction or physical destruction, so that it can no longer function properly after performing its intended function. One-shot device test data differ from typical data obtained by measuring lifetimes in standard life-tests. Due to the very nature of one-shot devices, the actual lifetimes of the devices under test cannot be observed; they are either left- or right-censored. In addition, a one-shot device often has multiple components that could cause its failure. The components are coupled together in the manufacturing or assembly process, so that the failure modes possess latent heterogeneity and dependence. Frailty models enable us to describe the influence of common but unobservable covariates on the hazard function as a random effect in a model, and also provide an easily understandable interpretation. In this thesis, we develop some inferential results for one-shot device testing data under a gamma frailty model. We first develop an efficient expectation-maximization (EM) algorithm for determining the maximum likelihood estimates of the parameters of a gamma frailty model with exponential lifetime distributions for the components, based on one-shot device test data with multiple failure modes, wherein the data are obtained from a constant-stress accelerated life-test. The maximum likelihood estimate of the mean lifetime of $k$-out-of-$M$ structured one-shot devices under normal operating conditions is also presented. In addition, the asymptotic variance–covariance matrix of the maximum likelihood estimates is derived, which is then used to construct asymptotic confidence intervals for the model parameters. The performance of the proposed inferential methods is evaluated through Monte Carlo simulations and then illustrated with a numerical example. A gamma frailty model with Weibull baseline hazards is considered next for fitting one-shot device testing data. The Weibull baseline hazards enable us to analyze time-varying failure rates more accurately, allowing for a deeper understanding of the dynamic nature of a system's reliability. We develop an EM algorithm for estimating the model parameters utilizing the complete likelihood function. A detailed simulation study compares the performance of the Weibull baseline hazard model with that of the exponential baseline hazard model. The introduction of shape parameters in the components' lifetime distributions within the Weibull baseline hazard model offers enhanced flexibility in model fitting. Finally, Bayesian inference is developed for the gamma frailty model with exponential baseline hazards for one-shot device testing data. We introduce a Bayesian estimation procedure using the Markov chain Monte Carlo (MCMC) technique for estimating the model parameters as well as for developing credible intervals for those parameters. The performance of the proposed method is evaluated in a simulation study. Model comparison between the independence model and the frailty model is made using a Bayesian model selection criterion. / Thesis / Candidate in Philosophy
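For a concrete sense of how the gamma frailty enters the likelihood for one-shot device data, here is a rough sketch (illustrative assumptions only; the parameter names, series structure and toy data are not from the thesis) of the marginal log-likelihood under a mean-one gamma frailty shared by exponential component lifetimes:

```python
# Marginal log-likelihood sketch for one-shot device data under a shared gamma frailty
# (mean 1, variance theta) acting on exponential component lifetimes of a series device.
# Each device i is inspected once at time tau_i; only a failed/working indicator is recorded.
import numpy as np

def marginal_survival(tau, lambdas, theta):
    """S(tau) = (1 + theta * H(tau))^(-1/theta), with H(tau) = sum_j lambda_j * tau
    (series system of exponential components; gamma frailty Laplace transform)."""
    H = np.sum(lambdas) * tau
    return (1.0 + theta * H) ** (-1.0 / theta)

def one_shot_loglik(taus, deltas, lambdas, theta):
    """Failed at inspection (delta=1) -> left-censored; still working (delta=0) -> right-censored."""
    S = np.array([marginal_survival(t, lambdas, theta) for t in taus])
    return np.sum(deltas * np.log(1.0 - S) + (1 - deltas) * np.log(S))

# Toy usage: two failure modes, inspection times and outcomes made up for illustration
taus = np.array([10.0, 10.0, 20.0, 20.0])
deltas = np.array([0, 1, 1, 1])          # 1 = device found failed at inspection
print(one_shot_loglik(taus, deltas, lambdas=np.array([0.02, 0.05]), theta=0.5))
```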
113

Confiabilidade em sistemas coerentes: um modelo bayesiano Weibull / Reliability in coherent systems: a Bayesian Weibull model

Bhering, Felipe Lunardi 28 June 2013 (has links)
The main purpose of this work is to introduce a general hierarchical Bayesian Weibull model for censored data that estimates the reliability function of each component in coherent reliability systems. More robust estimation procedures are introduced, avoiding the insertion of point (plug-in) estimates into the reliability functions. With this model, examples in the reliability area are presented and solved, including series systems, parallel systems, k-out-of-n systems, bridge systems, and a clinical study with interval-censored data. The solutions allow the components to follow different distributions, a case for which the bridge system previously had no solution in the literature. The model is general and can be used for any coherent system, and not only for reliability data but also for survival data, among others. Several simulations, with components having different censoring proportions, distinct means, three types of distributions and various sample sizes, were carried out for all the systems to evaluate the effectiveness of the model.
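As a rough illustration of the structure functions involved (not the thesis code, which propagates full posterior samples through a hierarchical Weibull model rather than point reliabilities), the following sketch combines component reliabilities into system reliability for the coherent structures mentioned above, and shows the posterior-sample style of evaluation that avoids plug-in estimates:

```python
# Structure functions of common coherent systems; component reliabilities are plain numbers here.
from math import comb
import numpy as np

def series(p):            # all components must work
    return float(np.prod(p))

def parallel(p):          # at least one component must work
    return 1.0 - float(np.prod(1.0 - np.asarray(p)))

def k_out_of_n(k, n, r):  # at least k of n identical components, each with reliability r
    return sum(comb(n, j) * r**j * (1 - r)**(n - j) for j in range(k, n + 1))

def bridge(p):            # 5-component bridge, p[4] is the crossover component
    p1, p2, p3, p4, p5 = p
    # condition on the state of the bridge component (law of total probability)
    works = p5 * parallel([p1, p2]) * parallel([p3, p4])
    fails = (1 - p5) * parallel([p1 * p3, p2 * p4])
    return works + fails

# Posterior-style usage: apply the structure function to each posterior draw of the
# component reliabilities, then summarize, rather than plugging in posterior means.
draws = np.random.default_rng(1).beta(8, 2, size=(1000, 5))
print(np.mean([bridge(d) for d in draws]))
```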
114

Introduction to Probability Theory

Chen, Yong-Yuan 25 May 2010 (has links)
In this paper, we first present the basic principles of set theory and combinatorial analysis, which are the most useful tools for computing probabilities. Then, we show some important properties derived from the axioms of probability. Conditional probabilities come into play not only when some partial information is available, but also as a tool to compute probabilities more easily, even when partial information is unavailable. Then, the concept of a random variable and some of its related properties are introduced. For univariate random variables, we introduce the basic properties of some common discrete and continuous distributions. The important properties of jointly distributed random variables are also considered. Some inequalities, the law of large numbers and the central limit theorem are discussed. Finally, we introduce an additional topic, the Poisson process.
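As a small illustration of two of the topics surveyed (not part of the paper itself), the following simulation demonstrates the central limit theorem for standardized sample means and constructs a homogeneous Poisson process from exponential inter-arrival times:

```python
# Illustrative simulation: CLT for standardized means and a homogeneous Poisson process.
import numpy as np

rng = np.random.default_rng(0)

# CLT: standardized sample means of Exp(1) draws approach N(0, 1)
n, reps = 1000, 5000
means = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
z = (means - 1.0) / (1.0 / np.sqrt(n))               # Exp(1) has mean 1 and variance 1
print("mean of z:", z.mean(), "var of z:", z.var())   # approximately 0 and 1

# Poisson process with rate lam on [0, T]: arrival times are cumulative Exp(1/lam) gaps
lam, T = 2.0, 10.0
gaps = rng.exponential(1.0 / lam, size=100)
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals <= T]
print("N(T) =", len(arrivals), "expected:", lam * T)  # N(T) ~ Poisson(lam * T)
```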
116

Schädigungsprognose mittels Homogenisierung und mikromechanischer Materialcharakterisierung / Damage prediction by means of homogenization and micromechanical material characterization

Goldmann, Joseph 01 October 2018 (has links)
This thesis investigates whether effective properties of composites can still be computed by numerical homogenization methods after the onset of strain localization due to strain-softening material behavior, a use case whose feasibility has been questioned in the literature. To this end, all the subtasks that must be solved for such a homogenization scheme are treated systematically. The first task is the characterization of the individual constituents of the composite. To demonstrate an experimentally based characterization, a glass-fiber-reinforced epoxy is chosen as the example material. Besides the description of the fiber and matrix materials, particular attention is paid to the characterization of the interface between the two. The strength measurements available for the glass fibers do not comply with the weakest-link hypothesis. Therefore, numerous generalizations of the Weibull distribution are investigated to account for interfering effects. Finally, probability distributions are derived that incorporate fiber breaks within the clamped region. These distributions represent the measured data well; in addition, their use renders unnecessary the cumbersome sorting out and repeating of those experiments in which the fiber fails inside the clamps. For modelling the interface, a cohesive zone model is developed, and its parameters are identified from pullout and single-fiber fragmentation tests. The interface strengths and energy release rates obtained from the two test types agree very well; the parameter identification is carried out with finite element models rather than the frequently used simplified analytical models, which typically achieve poorer agreement. Once strain localization occurs, not only the material modelling but also the homogenization scheme itself must be generalized. Such a scheme comprises the generation of representative volume elements, boundary conditions (BC) and an averaging operator. Based on the current state of the literature, the boundary conditions are identified as a significant weak point of homogenization methods. Therefore, periodic BC, linear displacement BC and minimal kinematic BC are investigated, as well as two adaptive BC, namely percolation-path-aligned BC and generalized periodic BC. A further type of adaptive BC, termed tessellation BC, is proposed. First, all three adaptive BC are proven to fulfill the Hill-Mandel condition. Second, by modifying the Hough transformation, a systematic bias in determining the direction of localization zones is eliminated. Third, the properties of all the BC are demonstrated in several examples, which show that tessellation BC are the only type that allows arbitrary directions of localization zones while excluding spurious localization in the corners of representative volume elements. Altogether, fundamental objections raised in the literature against the application of numerical homogenization in the presence of strain localization are rebutted, so homogenization methods remain applicable even for strain-softening material behavior.
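The weakest-link hypothesis mentioned above corresponds to the classical length-scaled Weibull strength model, F(s) = 1 - exp(-(L/L0)(s/s0)^m). The following is a minimal sketch (synthetic data and assumed names; not the thesis code or its generalized distributions accounting for clamp breaks) of a maximum-likelihood fit of this two-parameter model to single-fiber strengths:

```python
# Maximum-likelihood fit of a length-scaled (weakest-link) Weibull strength model.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, strengths, L, L0=1.0):
    m, s0 = params
    if m <= 0 or s0 <= 0:
        return np.inf
    z = (strengths / s0) ** m
    # Weibull log-density with the length factor L / L0 scaling the hazard
    logf = (np.log(L / L0) + np.log(m) - np.log(strengths)
            + m * np.log(strengths / s0) - (L / L0) * z)
    return -np.sum(logf)

# Synthetic single-fiber strengths (GPa) at a 20 mm gauge length, for illustration only
rng = np.random.default_rng(3)
L = 20.0
data = 2.5 * rng.weibull(5.0, size=50) * (L / 1.0) ** (-1.0 / 5.0)

fit = minimize(neg_loglik, x0=np.array([4.0, 2.0]), args=(data, L), method="Nelder-Mead")
m_hat, s0_hat = fit.x
print("Weibull modulus m =", round(m_hat, 2), " scale s0 (per unit length) =", round(s0_hat, 3))
```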
