31

Predictive reliabilities for electronic components

Nagarur, Nagendra N. January 1988 (has links)
A reliability model is developed to study the behavior of an electronic component subject to several failure mechanisms. The mechanisms considered are of the degradation type, in which the number of defects for a mechanism increases with time, eventually causing the failure of the component. The failure pattern of a component subject to a single mechanism with given initial and final numbers of defects is modelled as a pure birth process. The failure time for this mechanism is expressed as the first passage time of the birth process from initial state l to state k. The first passage time distribution is derived for different forms of transition rates. When the initial and final states of the process are treated as random, the failure time is expressed as the mixture distribution obtained from the conditional first passage time distributions. The mixture distributions are well represented by a Weibull distribution. A computer program is developed to compute the parameters of the Weibull distribution iteratively by the method of matching moments. The approximation results are statistically validated. The results for a single mechanism are extended to the case of multiple mechanisms. Extreme-value theory and competing risk theory are applied to analyze the simultaneous effects of multiple mechanisms, and it is shown that under both theories the aggregate failure time distribution has a Weibull form. The model explains the influence of the physical and chemical properties of the component and of the operating conditions on failure times. It can be used for accelerated testing and for incorporating reliability at the product design stage. / Ph. D.
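The moment-matching step described in this abstract, fitting a Weibull distribution to a pair of moments, can be sketched as follows. This is an illustrative reconstruction, not the thesis's actual program: it exploits the fact that the Weibull coefficient of variation depends on the shape parameter alone, so the shape can be found by bisection and the scale recovered afterwards.

```python
import math

def weibull_from_moments(mean, var, tol=1e-10):
    """Recover the Weibull shape k and scale lam from a target mean and
    variance.  The squared coefficient of variation,
        CV^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1,
    depends only on k and is decreasing in k, so bisection applies."""
    target = var / mean ** 2  # squared coefficient of variation

    def cv2(k):
        return math.gamma(1 + 2.0 / k) / math.gamma(1 + 1.0 / k) ** 2 - 1.0

    lo, hi = 0.05, 50.0  # bracket for the shape parameter
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cv2(mid) > target:   # CV^2 too large -> shape must grow
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = mean / math.gamma(1 + 1.0 / k)  # scale from the mean equation
    return k, lam
```

Given the first two moments of the mixture of conditional first-passage-time distributions, this returns the matched Weibull parameters directly.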
32

Överlevnadsanalys i tjänsteverksamhet : Tidspåverkan i överklagandeprocessen på Migrationsverket / Survival analysis in service : Time-effect in the process of appeal at the Swedish Migration Board

Minya, Kristoffer January 2014 (has links)
The Swedish Migration Board is an agency that reviews applications from individuals who wish to seek shelter, obtain citizenship, study, or work in Sweden. Recently there has been a large increase in applications, and the time taken to reach a decision has grown. Each type of application (such as citizenship) is a process consisting of several stages, and how a decision moves through these stages is called its flow. The Swedish Migration Board would therefore like to increase its flow efficiency. When a decision is made and the person has received it but is not satisfied, he or she can appeal. This is one of the most complex processes at the Board. The aim is to analyze how long this process takes and which steps in the process affect the time. One step (later found to have a significant effect on time) is opinions: the court requests information on what the appellant has to say about why he or she is appealing. Two methods were relevant to this analysis: accelerated failure time (AFT) models and multi-state models (MSM). The former can predict time to event, while the latter can analyze the effect of each stage on the time spent in the flow. Opinions early in the process are crucial to how quickly an appeal receives a judgment, while the number of opinions increases the time enormously. Other factors affect the time, but not to the same degree as opinions. Since both early opinions and the number of opinions matter, flow efficiency can be increased by taking the time to write one informative opinion, so that the court does not need to request several.
33

A study of the robustness of Cox's proportional hazards model used in testing for covariate effects

Fei, Mingwei January 1900 (has links)
Master of Arts / Department of Statistics / Paul Nelson / There are two important classes of statistical models for multivariate survival analysis: proportional hazards (PH) models and accelerated failure time (AFT) models. PH analysis is the most commonly used multivariate approach for analysing survival time data. For example, in clinical investigations where several known quantities or covariates potentially affect patient prognosis, it is often desirable to investigate the effect of one factor while adjusting for the impact of the others. This report offers guidance on choosing an appropriate model for testing covariate effects under different conditions. In practice, sample sizes are often limited and censoring rates (subjects dropping out) are often high, which complicates statistical analysis. In this report, datasets are simulated 1000 times from each of three distributions (Weibull, lognormal and log-logistic) for each combination of sample size and censoring rate. Both models are then evaluated through hypothesis tests of the covariate effect on the simulated data, using power, type I error rate and convergence rate as criteria for each situation. We recommend the PH method when the sample size is small (n < 20) and the censoring rate is high (p > 0.8); in this case neither PH nor AFT analysis may be well suited for hypothesis testing, but PH analysis is more robust and consistent than AFT analysis. When the sample size is 20 or above and the censoring rate is 0.8 or below, AFT analysis has slightly higher convergence rates and power than PH, but shows little improvement in type I error rates when the sample size is large (n > 50) and the censoring rate is low (p < 0.3). Considering that PH analysis does not require knowledge of the underlying distribution, we conclude that PH analysis is robust in hypothesis testing for covariate effects using data generated from an AFT model.
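A simulation design of the kind this report describes, drawing event times from a parametric distribution with a target censoring rate, can be sketched as follows. The exponential censoring calibration is a rough assumption of ours (it is exact only when the event times are themselves exponential), and all parameter values are illustrative:

```python
import math
import random

def simulate_censored_sample(n, shape, scale, censor_rate_target, rng):
    """Draw n Weibull(shape, scale) event times with independent
    exponential censoring times; return (observed_time, event) pairs,
    where event = 1 if the failure was observed and 0 if censored.

    The censoring mean is tuned so that roughly `censor_rate_target`
    of the observations are censored (exact for shape == 1, i.e.
    exponential event times; approximate otherwise)."""
    weibull_mean = scale * math.gamma(1 + 1.0 / shape)
    censor_mean = (weibull_mean * (1 - censor_rate_target)
                   / max(censor_rate_target, 1e-9))
    data = []
    for _ in range(n):
        # Weibull and exponential draws via inverse-CDF sampling.
        t = scale * (-math.log(rng.random())) ** (1.0 / shape)
        c = -censor_mean * math.log(rng.random())
        data.append((min(t, c), 1 if t <= c else 0))
    return data
```

Repeating such draws 1000 times per (distribution, sample size, censoring rate) cell and fitting both PH and AFT models to each replicate yields the empirical power, type I error and convergence rates the report compares.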
34

Marginal Screening on Survival Data

Huang, Tzu Jung January 2017 (has links)
This work develops a marginal screening test to detect the presence of significant predictors for a right-censored time-to-event outcome under a high-dimensional accelerated failure time (AFT) model. Establishing a rigorous screening test in this setting is challenging, not only because of the right censoring, but also due to the post-selection inference. The oracle property in such situations fails to ensure adequate control of the family-wise error rate, and this raises questions about the applicability of standard inferential methods. McKeague and Qian (2015) constructed an adaptive resampling test to circumvent this problem under ordinary linear regression. To accommodate right censoring, we develop a test statistic based on a maximally selected Koul--Susarla--Van Ryzin estimator from a marginal AFT model. A regularized bootstrap method is used to calibrate the test. Our test is more powerful and less conservative than the Bonferroni correction and other competing methods. This proposed method is evaluated in simulation studies and applied to two real data sets.
35

Survival analysis of listed firms in Hong Kong.

January 2007 (has links)
Li, Li. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2007. / Includes bibliographical references (leaves 34-36). / Abstracts in English and Chinese. / Contents: Chapter One, Introduction (p.1); Chapter Two, Methodology (p.5); Chapter Three, Data (p.9), with 3.1 Data Description (p.9) and 3.2 Selection of Covariates (p.13); Chapter Four, Empirical Analysis (p.20), with 4.1 General Survival Analysis by the Cox PH Model (p.20), 4.2 Competing Risk Analysis of Listed Firms (p.24) and 4.3 Robustness Check (p.28); Chapter Five, Conclusion (p.30); Appendix I (p.32); Appendix II (p.33); References (p.34); Tables (p.37); Figures (p.58).
36

Statistical inference in high dimensional linear and AFT models

Chai, Hao 01 July 2014 (has links)
Variable selection procedures for high-dimensional data have been proposed and studied in a large body of literature over the last few years. Most previous research focuses on selection properties as well as point estimation properties. In this paper, our goal is to construct confidence intervals for some low-dimensional parameters in the high-dimensional setting. The models we study are partially penalized linear and accelerated failure time models in the high-dimensional setting. In our setup, all variables are split into two groups. The first group consists of a relatively small number of variables of primary interest. The second group consists of a large number of variables that are potentially correlated with the response variable. We propose an approach that selects variables from the second group and produces confidence intervals for the parameters in the first group. We show the sign consistency of the selection procedure and give a bound on the estimation error. Based on this result, we provide sufficient conditions for the asymptotic normality of the low-dimensional parameters. High-dimensional selection consistency and low-dimensional asymptotic normality are developed for both linear and AFT models with high-dimensional data.
37

Diagnostic modeling and diagnosability evaluation of mechanical systems

Clark, Garrett E. 23 November 1993 (has links)
Consideration of diagnosability in product design promises to increase product quality by reducing maintenance time without increasing cost or decreasing reliability. Methods for investigating the diagnosability of mechanical and electro-mechanical systems are described and are applied to the Bleed Air Control System (BACS) on the Boeing 747-400. The BACS is described and a diagnostic model is developed using information from the system Failure Modes and Effects Analysis. Emphasis is placed on the relationships between the system's functions and its components. Two metrics for the evaluation of system diagnosability and two metrics for the evaluation of component diagnosability are defined. These metrics emphasize diagnostic ambiguity and are combined with the probability of different system failures to weight the effects of each failure. Three modified systems are produced by reassigning functions from one component to another. The resulting effects on the system and component diagnosability are evaluated. We show that by changing these relationships system diagnosability can be improved without adding sensors or other components. / Graduation date: 1994
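The idea of diagnostic ambiguity weighted by failure probabilities can be illustrated with a small sketch. The metric below is a simplified stand-in of ours, not the thesis's actual metrics: components whose failures produce identical observable symptom sets form an ambiguity group, and the system-level score is the expected number of suspect components when a failure occurs.

```python
from collections import defaultdict

def ambiguity_groups(signatures):
    """Group components whose failure signatures (sets of observable
    symptoms) are identical: failures within a group cannot be told
    apart by observation alone."""
    groups = defaultdict(list)
    for comp, symptoms in signatures.items():
        groups[frozenset(symptoms)].append(comp)
    return list(groups.values())

def expected_ambiguity(signatures, failure_prob):
    """Failure-probability-weighted mean ambiguity group size: the
    expected number of components suspected when a random failure
    occurs.  1.0 means every failure is uniquely diagnosable."""
    size_of = {}
    for group in ambiguity_groups(signatures):
        for comp in group:
            size_of[comp] = len(group)
    total = sum(failure_prob.values())
    return sum(failure_prob[c] * size_of[c] for c in signatures) / total
```

Reassigning a function from one component to another changes the signatures, so the effect of such a design change on diagnosability can be scored without adding sensors, which mirrors the comparison the thesis performs on the modified systems.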
38

A generalization of rank tests based on interval-censored failure time data and its application to AIDS studies.

Kuo, Yu-Yu 11 July 2000 (has links)
In this paper we propose a generalized rank test based on discrete interval-censored failure time data to determine whether two lifetime populations come from the same distribution. It reduces to the log-rank test or the Wilcoxon test when the data are exact or right-censored. Simulation shows that the proposed test performs satisfactorily. An example is presented to demonstrate how the proposed test can be applied in an AIDS study.
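The log-rank test that the proposed statistic reduces to under right censoring can be sketched in a few lines. This is the textbook two-sample version, not the paper's generalized statistic for interval-censored data:

```python
def logrank_statistic(times, events, groups):
    """Two-sample log-rank chi-square statistic (1 df under the null of
    equal survival curves) for right-censored data.

    times:  observed times; events: 1 = event, 0 = censored;
    groups: 0/1 group labels, aligned with times."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    times = [times[i] for i in order]
    events = [events[i] for i in order]
    groups = [groups[i] for i in order]
    n = len(times)
    observed = expected = variance = 0.0
    i = 0
    while i < n:
        t = times[i]
        at_risk = n - i                                   # risk set at t-
        at_risk1 = sum(1 for j in range(i, n) if groups[j] == 1)
        d = d1 = 0                                        # deaths at t
        while i < n and times[i] == t:
            if events[i]:
                d += 1
                d1 += groups[i]
            i += 1
        if d and at_risk > 1:
            frac1 = at_risk1 / at_risk
            observed += d1
            expected += d * frac1                          # hypergeometric mean
            variance += d * frac1 * (1 - frac1) * (at_risk - d) / (at_risk - 1)
    return (observed - expected) ** 2 / variance if variance > 0 else 0.0
```

The statistic is compared against a chi-square(1) critical value (3.84 at the 5% level); the paper's contribution is extending this kind of rank comparison to interval-censored observations.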
39

Generalized rank tests for univariate and bivariate interval-censored failure time data

Sun, De-Yu 20 June 2003 (has links)
In Part 1 of this paper, we adapt Turnbull's algorithm to estimate the distribution function of univariate interval-censored and truncated failure time data. We also propose four non-parametric tests of whether two groups of data come from the same distribution. The powers of the proposed test statistics are compared by simulation under different distributions. The proposed tests are then used to analyze an AIDS study. In Part 2, for bivariate interval-censored data, we propose models for how the data are generated and several methods to measure the correlation between the two variates. We also propose several nonparametric tests of whether the two variates are mutually independent and of whether they have the same distribution. We demonstrate the performance of these tests by simulation and give an application to an AIDS study (ACTG 181).
40

The estimation of the truncation ratio and an algorithm for the parameter estimation in the random interval truncation model.

Zhu, Huang-Xu 01 August 2003 (has links)
For interval-censored and truncated failure time data, the truncation ratio is unknown. In this paper, we propose an algorithm, similar to Turnbull's, to estimate the parameters. The truncation ratio for interval-censored and truncated failure time data can also be estimated from the convergence result of the algorithm. A simulation study is conducted to compare the algorithm with Turnbull (1976); ours appears to give better results.
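A Turnbull-style self-consistency iteration, of the kind the proposed algorithm resembles, can be sketched as follows. This is the classical interval-censored EM step of Turnbull (1976) without the truncation-ratio extension the paper adds, and the support-point handling is simplified for illustration:

```python
def turnbull_probabilities(intervals, support, max_iter=1000, tol=1e-8):
    """Self-consistency (EM) iteration for interval-censored data.

    intervals: list of (l, r) pairs, each known to contain a failure time;
    support:   candidate mass points for the distribution.
    Returns the estimated probability mass at each support point."""
    n, m = len(intervals), len(support)
    # alpha[i][j] = 1 if support point j is consistent with interval i.
    alpha = [[1 if l <= s <= r else 0 for s in support]
             for (l, r) in intervals]
    p = [1.0 / m] * m                       # start from a uniform guess
    for _ in range(max_iter):
        new_p = [0.0] * m
        for i in range(n):
            # E-step: spread observation i over its consistent points.
            denom = sum(alpha[i][j] * p[j] for j in range(m))
            for j in range(m):
                if alpha[i][j]:
                    new_p[j] += p[j] / denom
        new_p = [q / n for q in new_p]       # M-step: renormalize
        if max(abs(a - b) for a, b in zip(new_p, p)) < tol:
            p = new_p
            break
        p = new_p
    return p
```

With degenerate intervals (exact data) the iteration reproduces the empirical distribution in one step; the paper's variant additionally reads the truncation ratio off the algorithm's convergence behavior.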
