1

Bayesian nonparametric survival analysis via Markov processes

Nieto-Barajas, Luis E. January 2001
No description available.
2

Empirical Likelihood Confidence Intervals for the Ratio and Difference of Two Hazard Functions

Zhao, Meng 21 July 2008
In biomedical research and lifetime data analysis, the comparison of two hazard functions plays an important role in practice. In this thesis, we consider the standard independent two-sample framework under right censoring and construct confidence intervals for the ratio and difference of two hazard functions using smoothed empirical likelihood methods. The empirical log-likelihood ratio is derived and shown to be asymptotically chi-squared distributed. The proposed method can also be applied to medical diagnosis research. Simulation studies show that the proposed EL confidence intervals outperform the traditional normal approximation method in both coverage accuracy and average length. Finally, the methods are illustrated with real clinical trial data, where the empirical likelihood methods again provide better inferential outcomes.
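As a hedged illustration of the quantity being compared in this abstract (not the thesis's smoothed empirical-likelihood procedure), a minimal Nelson-Aalen estimate of the cumulative hazard from right-censored data, assuming no tied event times, might look like:

```python
def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard from right-censored
    data. events[i] is 1 for an observed event, 0 for a censored time.
    Assumes no tied event times (a simplifying assumption for this sketch)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    cum_hazard, points = 0.0, []
    for i in order:
        if events[i]:
            cum_hazard += 1.0 / n_at_risk   # increment by 1/n_i at each event
            points.append((times[i], cum_hazard))
        n_at_risk -= 1                      # subject leaves the risk set either way
    return points
```

Ratios or differences of two such estimates are the raw ingredients; the thesis's contribution is the smoothing and the empirical-likelihood calibration of the resulting intervals.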
4

Statistical analysis of lifetime data using new modified Weibull distributions

Al-Malki, Saad Jamaan January 2014
The Weibull distribution is a popular and widely used distribution in reliability and lifetime data analysis. Since 1958 it has been modified by many researchers to allow for non-monotonic hazard functions. Many modifications of the Weibull distribution have achieved this purpose, but at a cost: the number of parameters has increased, the forms of the survival and hazard functions have become more complicated, and estimation problems have arisen. This thesis provides an extensive review of some discrete and continuous modifications of the Weibull distribution, which could serve as an important reference and encourage further modifications. Four new modifications of the Weibull distribution are proposed to address some of the above problems using different techniques. The first model, with five parameters, is constructed by considering a two-component series system with one component following a Weibull distribution and the other a modified Weibull distribution. A new method is proposed to reduce the number of parameters of this new modified Weibull distribution from five to three, simplifying the distribution and easing the estimation problems. The reduced version retains the desirable properties of the original distribution despite having two fewer parameters, and can serve as an alternative to other modified Weibull distributions with bathtub-shaped hazard rate functions. To deal with unimodal hazard rates, a third model with four parameters, named the exponentiated reduced modified Weibull distribution, is introduced. This model is flexible, has a nice physical interpretation, and can capture monotonically increasing, unimodal, and bathtub-shaped hazard rates; it is a generalization of the reduced modified Weibull distribution. The proposed distribution gives the best fit compared with other modifications of the Weibull distribution, including those with similar properties. Finally, a three-parameter discrete distribution is introduced based on the reduced distribution; it is one of only three discrete distributions allowing for bathtub-shaped hazard rate functions. Four real data sets are fitted with this distribution, and the new distribution outperforms at least three other models, including ones allowing for bathtub-shaped hazard rates. The new models are flexible and model different kinds of real data sets better than other modified versions of the Weibull distribution, including those with the same number of parameters. The mathematical properties and statistical inferences of the new models are studied, and a simulation study assesses the performance of the MLEs of each model with respect to sample size n. We find no evidence that the generalized modified Weibull distribution provides a better fit than the exponentiated Weibull distribution for data sets exhibiting the modified unimodal hazard function.
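The two-component series-system construction mentioned in this abstract is the standard way to obtain a bathtub-shaped hazard: a series system fails when its first component fails, so its hazard is the sum of the component hazards, and a decreasing-hazard Weibull (shape < 1) plus an increasing-hazard Weibull (shape > 1) gives a bathtub shape. The sketch below uses two plain Weibull components as an illustrative assumption; it is not the thesis's specific five-parameter model.

```python
import math

def weibull_hazard(t, shape, scale):
    """Hazard of a Weibull(shape, scale): h(t) = (k/lam) * (t/lam)**(k-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def series_system_hazard(t, components):
    """A series (competing-risks) system fails at the first component failure,
    so the system hazard is the sum of the component hazards."""
    return sum(weibull_hazard(t, shape, scale) for shape, scale in components)

# shape < 1: decreasing "infant mortality" hazard; shape > 1: increasing
# "wear-out" hazard. Their sum is bathtub shaped.
params = [(0.5, 1.0), (3.0, 10.0)]
```

Evaluating the sum at small, middle, and large times shows the decreasing-then-increasing pattern.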
5

A Computer Program for Survival Comparisons to a Standard Population

Moon, Steven Y., Woolson, Robert F., Bean, Judy A. 01 January 1979
PROPHAZ is a computer program created for the analysis of survival data using the general proportional hazards model. It was designed specifically for the situation in which the underlying hazard function may be estimated from the mortality experience of a large reference population, but may be used for other problems as well. Input for the program includes the variables of interest as well as the information necessary for estimating the hazard function (demographic and mortality data). Regression coefficients for the variables of interest are obtained iteratively using the Newton-Raphson method. Utilizing large sample asymptotic theory, χ2 statistics are derived which may be used to test hypotheses of the form Cβ = 0. Input format is completely flexible for the variables of interest as well as the mortality data.
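PROPHAZ itself fits multivariate regression coefficients against a population-based baseline hazard; as a hedged one-dimensional illustration of the Newton-Raphson iteration it relies on, here is the MLE of a constant hazard rate from right-censored data, where the closed-form answer d/T makes the iteration easy to check:

```python
def exp_rate_mle(times, events, lam=1.0, tol=1e-8, max_iter=100):
    """Newton-Raphson MLE of a constant hazard rate from right-censored data.
    Log-likelihood: l(lam) = d*log(lam) - lam*T, with d observed events and
    total follow-up time T; the maximizer is d/T."""
    d = float(sum(events))
    T = float(sum(times))
    for _ in range(max_iter):
        score = d / lam - T          # l'(lam)
        info = d / lam ** 2          # observed information, -l''(lam)
        new = lam + score / info     # Newton-Raphson step
        if new <= 0:
            new = lam / 2.0          # step-halving safeguard keeps lam > 0
        if abs(new - lam) < tol:
            return new
        lam = new
    return lam
```

PROPHAZ's vector-valued version replaces score and information by the gradient and Hessian of the partial likelihood, but the update has the same shape.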
6

A comparison of some methods of modeling baseline hazard function in discrete survival models

Mashabela, Mahlageng Retang 20 September 2019
MSc (Statistics) / Department of Statistics / The baseline parameter vector in a discrete-time survival model is determined by the number of time points: the larger the number of time points, the higher the dimension of the baseline parameter vector, which often leads to biased maximum likelihood estimates. One way to overcome this problem is to use a simpler parametrization that contains fewer parameters. A simulation approach was used to compare the accuracy of three variants of penalised regression spline methods in smoothing the baseline hazard function. Root mean squared error (RMSE) analysis suggests that all the smoothing methods generally performed better than the model with a discrete baseline hazard function, but no single smoothing method outperformed the others. These methods were also applied to data on age at first alcohol intake in Thohoyandou. The results from the real data application suggest no significant differences amongst the estimated models. Consumption of other drugs, having a parent who drinks, being male, and having been abused are associated with a high chance of drinking alcohol very early in life. / NRF
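For context, the raw discrete-time baseline hazard this thesis smooths is the life-table estimate h_j = d_j / n_j per interval; with many intervals, each n_j shrinks and the estimates get noisy, which motivates the spline smoothing. A minimal sketch of the raw estimate, with illustrative names, assuming integer exit intervals:

```python
def discrete_hazards(durations, events, n_intervals):
    """Raw life-table discrete-time hazards h_j = d_j / n_j: the fraction of
    subjects still at risk in interval j (1-based) whose event occurs there.
    durations[i] is the interval in which subject i exits (event or censoring)."""
    at_risk = [0] * n_intervals
    failed = [0] * n_intervals
    for t, e in zip(durations, events):
        for j in range(min(t, n_intervals)):
            at_risk[j] += 1              # subject observed through interval j+1
        if e and t <= n_intervals:
            failed[t - 1] += 1           # event occurred in interval t
    return [f / n if n else 0.0 for f, n in zip(failed, at_risk)]
```

A penalised regression spline then replaces these noisy per-interval values with a smooth function of interval index.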
7

New statistical methods to derive functional connectivity from multiple spike trains

Masud, Mohammad Shahed January 2011
Analysis of the functional connectivity of simultaneously recorded multiple spike trains is one of the major issues in neuroscience, yet progress on statistical methods for such analysis has been relatively slow. In this thesis two statistical techniques for analysing the functional connectivity of multiple spike trains are presented. The first, the modified correlation grid (MCG), is based on calculating the cross-correlation function of every possible pair of spike trains. The second, the Cox method, is based on the modulated renewal process (MRP). The original application of the Cox method to neuroscience data (Borisyuk et al., 1985) analysed only pairs and triplets of spike trains; the method is further developed in this thesis to support any set of simultaneously recorded spike trains. A probabilistic model, itself based on the MRP, is developed to test the Cox method; the common probabilistic basis makes it a convenient testing technique. A new technique based on pair-wise application of the Cox method, the Cox metric, is presented to find groups of coupled spike trains, and another new technique, motif analysis, based on triplet-wise application of the Cox method, is introduced to identify interconnections among spike trains. All these methods are applied to several sets of spike trains generated by the Enhanced Leaky Integrate-and-Fire (ELIF) model. The results suggest that the methods are successful in analysing the functional connectivity of simultaneously recorded multiple spike trains. The methods are also applied to experimental data recorded from cat visual cortex, and the connection matrix derived from these data by the Cox method is further analysed with graph-theoretical methods.
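A minimal sketch of the pair-wise cross-correlation computation that underlies a correlation-grid entry (a hypothetical simplification; the thesis's MCG involves further processing of these histograms):

```python
def cross_correlogram(train_a, train_b, window, bin_width):
    """Histogram of spike-time differences (t_b - t_a) falling in
    [-window, window), binned with the given width. Peaks away from zero
    suggest a delayed influence of one train on the other."""
    n_bins = int(2 * window / bin_width)
    counts = [0] * n_bins
    for ta in train_a:
        for tb in train_b:
            lag = tb - ta
            if -window <= lag < window:
                counts[int((lag + window) / bin_width)] += 1
    return counts
```

Computing this for all pairs of recorded trains fills the grid; flat correlograms indicate no detectable pair-wise coupling.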
8

Náhodné procesy v analýze spolehlivosti / Random Processes in Reliability Analysis

Chovanec, Kamil January 2011
Title: Random Processes in Reliability Analysis Author: Kamil Chovanec Department: Department of Probability and Mathematical Statistics Supervisor: Doc. Petr Volf, CSc. Supervisor's e-mail address: volf@utia.cas.cz Abstract: The thesis is aimed at reliability analysis, with special emphasis on the Aalen additive model. The result of hypothesis testing in reliability analysis is often a process that converges to a Gaussian martingale under the null hypothesis. The variance of the martingale can be estimated with a uniformly consistent estimator, turning the original hypothesis into a new hypothesis about the resulting process. There are several ways to test this hypothesis; the thesis presents some of these tests and compares their power for various models and sample sizes using Monte Carlo simulations. In a special case, a point is derived that maximizes the asymptotic power of two of the tests. Keywords: Martingale, Aalen's additive model, hazard function
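A rough sketch of the kind of Monte Carlo comparison described: simulate the limiting Gaussian martingale (standard Brownian motion under the null, with drift under an alternative) and evaluate candidate test statistics on each path, for instance a supremum-type and an endpoint-type statistic. The function names and the drift parametrization here are illustrative assumptions, not the thesis's exact tests.

```python
import math
import random

def brownian_path(n_steps, drift=0.0):
    """Simulate a path of Brownian motion on [0, 1] at n_steps grid points;
    drift=0 plays the role of the null hypothesis, drift != 0 an alternative."""
    dt = 1.0 / n_steps
    w, path = 0.0, []
    for _ in range(n_steps):
        w += random.gauss(drift * dt, math.sqrt(dt))
        path.append(w)
    return path

def sup_stat(path):
    """Supremum-type (Kolmogorov-Smirnov-like) statistic."""
    return max(abs(w) for w in path)

def end_stat(path):
    """Endpoint (normal-deviate) statistic."""
    return abs(path[-1])
```

Repeating this over many simulated paths, with critical values calibrated under drift = 0, gives the empirical power curves the thesis compares.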
9

Survival Model and Estimation for Lung Cancer Patients.

Yuan, Xingchen 07 May 2005
Lung cancer is the most frequent fatal cancer in the United States. Following the approach of actuarial analysis, we assume an exponential form for the baseline hazard function and combine it with Cox proportional hazards regression for the survival study of a group of lung cancer patients. The covariates in the hazard function are estimated by maximum likelihood following the proportional hazards regression analysis. Although the proportional hazards model does not give an explicit baseline hazard function, the baseline hazard can be estimated by fitting the data with a non-linear least squares technique. The survival model is then examined by a neural network simulation: the network learns the survival pattern from available hospital data and gives survival predictions for random covariate combinations. The simulation results support the covariate estimation in the survival model.
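Under the model form this abstract describes, a constant (exponential) baseline hazard lam combined with a proportional-hazards covariate effect, the survival function has a closed form, S(t | x) = exp(-lam * t * exp(beta . x)). A hedged sketch with illustrative parameter names:

```python
import math

def survival_prob(t, x, lam, beta):
    """Survival probability under h(t | x) = lam * exp(beta . x):
    S(t | x) = exp(-lam * t * exp(beta . x)). lam is the constant baseline
    hazard rate; beta are the proportional-hazards coefficients."""
    risk_score = math.exp(sum(b * xi for b, xi in zip(beta, x)))
    return math.exp(-lam * t * risk_score)
```

In the thesis's setup, lam comes from the non-linear least squares baseline fit and beta from the Cox regression; the neural network is then checked against predictions like this one.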
10

On the Impacts of Telecommuting over Daily Activity/Travel Behavior: A Comprehensive Investigation through Different Telecommuting Patterns

Asgari, Hamidreza 16 June 2015
The interest in telecommuting stems from its potential benefits in alleviating traffic congestion, decreasing vehicle miles traveled (VMT), and improving air quality by reducing the necessity for travel between home and the workplace. Despite the potential economic, environmental, and social benefits, telecommuting has not been widely adopted, and there is little consensus on its actual impacts. One of the major hurdles is the lack of a sound instrument for quantifying the impacts of telecommuting on individuals' travel behavior. As a result, the telecommuting phenomenon has received little attention in most transportation planning and investment decisions, if it is not ignored completely. This dissertation addresses the knowledge gap in telecommuting studies by examining several factors. First, it proposes a comprehensive outline to reveal and represent the complexity in telecommuting patterns. There are various types of telecommuting engagement, with different impacts on travel outcomes, and it is necessary to distinguish between those people for whom telecommuting substitutes for work travel and those for whom telecommuting is an ancillary activity. Second, it enhances the current modeling framework by supplementing the choice/frequency approach with daily telework dimensions, since the traditional approach fails to recognize the randomness of telecommuting engagement in a daily context. A multi-stage modeling structure is developed that incorporates choice, frequency, engagement, and commute as the fundamental dimensions of telecommuting activity. One pioneering aspect of this methodology is that it identifies non-regular telecommuters, who represent a significant share of daily telecommuters. Lastly, advanced statistical modeling techniques are employed to measure the actual impacts of each telecommuting arrangement on travelers' daily activity-travel behavior, focusing on time-use analysis and work trip departure times.

This research provides a systematic and sound instrument that advances the understanding of the benefits and potentials of telecommuting and its impacts on travel outcomes. It is expected to give policy and decision makers greater accuracy and to contribute to the better design and analysis of transportation investment decisions.
