121 |
A New Jackknife Empirical Likelihood Method for U-Statistics. Ma, Zhengbo, 25 April 2011.
U-statistics generalize the mean of independent identically distributed (i.i.d.) random variables and are widely used in many estimation and testing problems. The standard empirical likelihood (EL) for U-statistics is computationally expensive because of its nonlinear constraint. The jackknife empirical likelihood method largely relieves this computational burden by circumventing the construction of the nonlinear constraint. In this thesis, we adopt a new jackknife empirical likelihood method to make inference for the general volume under the ROC surface (VUS), which is a typical U-statistic. Monte Carlo simulations show that the EL confidence intervals perform well in terms of coverage probability and average length for various sample sizes.
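As a rough illustration of the jackknife empirical likelihood idea, the sketch below applies it to a simple one-sample U-statistic (the Gini mean difference) rather than the VUS studied in the thesis: jackknife pseudo-values of the U-statistic are formed, and an ordinary empirical likelihood ratio is evaluated for their mean. The kernel, sample size and reference value are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def u_stat_gini(x):
    """Gini mean difference: a degree-2 U-statistic with kernel |x_i - x_j|."""
    n = len(x)
    diffs = np.abs(x[:, None] - x[None, :])
    return diffs[np.triu_indices(n, k=1)].mean()

def jackknife_pseudo_values(x, u_stat):
    """V_i = n*U_n - (n-1)*U_{n-1}^(-i); these behave approximately like i.i.d. terms."""
    n = len(x)
    u_full = u_stat(x)
    return np.array([n * u_full - (n - 1) * u_stat(np.delete(x, i)) for i in range(n)])

def neg2_log_el(v, theta):
    """-2 log empirical likelihood ratio for the mean of v at value theta."""
    z = v - theta
    if z.min() >= 0 or z.max() <= 0:
        return np.inf                      # theta outside the convex hull of the pseudo-values
    g = lambda lam: np.sum(z / (1.0 + lam * z))
    eps = 1e-10
    lam = brentq(g, -1.0 / z.max() + eps, -1.0 / z.min() - eps)  # solve the Lagrange equation
    return 2.0 * np.sum(np.log1p(lam * z))

rng = np.random.default_rng(0)
x = rng.normal(size=50)
v = jackknife_pseudo_values(x, u_stat_gini)
theta0 = 2.0 / np.sqrt(np.pi)              # true Gini mean difference for N(0, 1)
stat = neg2_log_el(v, theta0)
print(stat, stat < chi2.ppf(0.95, df=1))   # accept at 95% if below the chi-square(1) cutoff
```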
|
122 |
Local Likelihood for Interval-censored and Aggregated Point Process Data. Fan, Chun-Po Steve, 03 March 2010.
The use of the local likelihood method (Tibshirani and Hastie, 1987; Loader, 1996) in the presence of interval-censored or aggregated data leads to a natural consideration of an EM-type strategy, or rather a local EM algorithm. In this thesis, we consider local EM to analyze point process data that are either interval-censored or aggregated into regional counts. We specifically formulate local EM algorithms for density, intensity and risk estimation and implement the algorithms using a piecewise constant function. We demonstrate that the use of the piecewise constant function at the E-step results in an iteration that involves an expectation, maximization and smoothing step, that is, the EMS algorithm considered in Silverman, Jones, Wilson and Nychka (1990). Consequently, we reveal a previously unknown connection between local EM and the EMS algorithm.
From a theoretical perspective, local EM and the EMS algorithm complement each other. Although the statistical methodology literature often characterizes EMS methods as ad hoc, local likelihood suggests otherwise, as the EMS algorithm arises naturally from a local likelihood consideration in the context of point processes. Moreover, the EMS algorithm not only serves as a convenient implementation of the local EM algorithm but also provides a set of theoretical tools to better understand the role of local EM. In particular, we present results that reinforce the suggestion that the pair of local EM and penalized likelihood is analogous to that of EM and likelihood. Applications include the analysis of bivariate interval-censored data as well as disease mapping for a rare disease, lupus, in the Greater Toronto Area.
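The EMS cycle referred to above can be made concrete. Below is a minimal Python sketch of an expectation, maximization and smoothing iteration for a piecewise-constant density estimated from interval-censored observations; the bin grid, smoothing weights and simulated data are assumptions made for illustration and are not the thesis's implementation.

```python
import numpy as np

def ems_interval_censored(intervals, edges, n_iter=200, weights=(0.25, 0.5, 0.25)):
    """EMS estimate of piecewise-constant bin probabilities from interval-censored data.

    intervals : (n, 2) array of [left, right] censoring intervals
    edges     : bin edges of the piecewise-constant estimate
    weights   : smoothing kernel applied after each EM update (the 'S' step)
    """
    n_bins = len(edges) - 1
    lo, hi = intervals[:, 0:1], intervals[:, 1:2]
    # A[i, j] = 1 if bin j overlaps observation i's censoring interval
    A = ((edges[None, :-1] < hi) & (edges[None, 1:] > lo)).astype(float)
    p = np.full(n_bins, 1.0 / n_bins)                 # initial bin probabilities
    for _ in range(n_iter):
        num = A * p                                   # E-step: expected share of each
        resp = num / num.sum(axis=1, keepdims=True)   #         observation in each bin
        p = resp.mean(axis=0)                         # M-step: bin probabilities from counts
        p = np.convolve(p, weights, mode="same")      # S-step: local smoothing
        p /= p.sum()
    return p

rng = np.random.default_rng(1)
t = rng.gamma(shape=2.0, scale=1.0, size=300)          # latent event times
left = np.floor(t)                                     # observed only as unit-wide intervals
intervals = np.column_stack([left, left + 1.0])
edges = np.arange(0.0, np.ceil(t.max()) + 1.5, 0.5)    # grid covering all intervals
print(np.round(ems_interval_censored(intervals, edges), 3))
```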
|
124 |
Copula Models for Multi-type Life History Processes. Diao, Liqun, January 2013.
This thesis considers statistical issues in the analysis of data from studies of chronic diseases, in which dependencies between life history processes are modeled using copula functions.
Many disease processes feature recurrent events arising from an underlying chronic condition; these are often modeled as point processes.
In addition, a random variable is often realized upon the occurrence of each event; such a variable is called a mark, and the combined process is called a marked point process. A novel copula model for the marked point process is described here which uses copula functions to govern the association between marks and event times. Specifically, a copula function is used to link each mark with the next event time following the realization of that mark, reflecting the pattern in the data wherein larger marks are often followed by longer times to the next event.
The extent of organ damage in an individual can often be characterized by ordered states, and interest frequently lies in modeling the rates at which individuals progress through these states. Risk factors can be studied and the effect of therapeutic interventions can be assessed based on relevant multistate models. When chronic diseases affect multiple organ systems, joint modeling of progression in several organ systems is also important.
In contrast to common intensity-based or frailty-based approaches to modelling, this thesis considers a copula-based framework for modeling and analysis. Through decomposition of the density and by use of conditional independence assumptions, an appealing joint model is obtained by assuming that the joint survival function of absorption transition times is governed by a multivariate copula function. Different approaches to estimation and inference are discussed and compared including composite likelihood and two-stage estimation methods. Special attention is paid to the case of interval-censored data arising from intermittent assessment.
Attention is also directed to the use of copula models in more general scenarios, with a focus on semiparametric two-stage estimation procedures. In this approach, nonparametric or semiparametric estimates of the marginal survivor functions are obtained in the first stage and estimates of the association parameters are obtained in the second stage. Bivariate failure time models are considered for data under right-censoring and current status observation schemes, as well as right-censored multistate models. A new expression for the asymptotic variance of the second-stage estimator of the association parameter, along with a way of estimating it in finite samples, is presented under these models and observation schemes.
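As a rough, uncensored illustration of the two-stage idea described above, the sketch below estimates the association parameter of an assumed Clayton copula: rank-based pseudo-observations stand in for the first-stage marginal estimates, and the second stage maximizes the copula likelihood in the association parameter. The copula family, margins and sample size are illustrative assumptions; the thesis's interval-censored and multistate settings are not addressed here.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

def clayton_loglik(theta, u, v):
    """Log-likelihood of the Clayton copula density at association parameter theta > 0."""
    s = u ** (-theta) + v ** (-theta) - 1.0
    return np.sum(np.log1p(theta)
                  - (1.0 + theta) * (np.log(u) + np.log(v))
                  - (2.0 + 1.0 / theta) * np.log(s))

def two_stage_clayton(x, y):
    """Stage 1: rank-based (nonparametric) margins; stage 2: maximize the copula likelihood."""
    n = len(x)
    u = rankdata(x) / (n + 1.0)        # pseudo-observations play the role of the estimated margins
    v = rankdata(y) / (n + 1.0)
    res = minimize_scalar(lambda t: -clayton_loglik(t, u, v),
                          bounds=(1e-3, 20.0), method="bounded")
    return res.x

# Simulate a Clayton-dependent pair with continuous margins (theta = 2 gives Kendall's tau = 0.5)
rng = np.random.default_rng(2)
theta_true = 2.0
u = rng.uniform(size=500)
w = rng.uniform(size=500)
v = (u ** (-theta_true) * (w ** (-theta_true / (1.0 + theta_true)) - 1.0) + 1.0) ** (-1.0 / theta_true)
x, y = -np.log(1.0 - u), 0.5 * -np.log(1.0 - v)     # arbitrary continuous margins
print(two_stage_clayton(x, y))                      # should land near theta_true
```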
|
125 |
The extended empirical likelihood. Wu, Fan, 04 May 2015.
The empirical likelihood method introduced by Owen (1988, 1990) is a powerful nonparametric method for statistical inference. It has been one of the most researched methods in statistics in the last twenty-five years and remains a very active area of research today. There is now a large body of literature on the empirical likelihood method which covers its applications in many areas of statistics (Owen, 2001). One important problem affecting the empirical likelihood method is its poor accuracy, especially in small-sample and/or high-dimensional applications. The poor accuracy can be alleviated by using high-order empirical likelihood methods such as the Bartlett-corrected empirical likelihood, but it cannot be completely resolved by high-order asymptotic methods alone. Since the work of Tsao (2004), the impact of the convex hull constraint in the formulation of the empirical likelihood on its finite-sample accuracy has been better understood, and methods have been developed to break this constraint in order to improve the accuracy. Three important methods along this direction are [1] the penalized empirical likelihood of Bartolucci (2007) and Lahiri and Mukhopadhyay (2012), [2] the adjusted empirical likelihood of Chen, Variyath and Abraham (2008), Emerson and Owen (2009), Liu and Chen (2010) and Chen and Huang (2012), and [3] the extended empirical likelihood of Tsao (2013) and Tsao and Wu (2013). The latter is particularly attractive in that it retains not only the asymptotic properties of the original empirical likelihood but also its important geometric characteristics. In this thesis, we generalize the extended empirical likelihood of Tsao and Wu (2013) to handle inference in two large classes of one-sample and two-sample problems.
In Chapter 2, we generalize the extended empirical likelihood to handle inference for the large class of parameters defined by one-sample estimating equations, which includes the mean as a special case. In Chapters 3 and 4, we generalize the extended empirical likelihood to handle two-sample problems: in Chapter 3, we study the extended empirical likelihood for the difference between two p-dimensional means; in Chapter 4, we consider the extended empirical likelihood for the difference between two p-dimensional parameters defined by estimating equations. In all cases, we give both the first- and second-order extended empirical likelihood methods and compare these methods with existing methods. Technically, the two-sample mean problem in Chapter 3 is a special case of the general two-sample problem in Chapter 4. We single out the mean case to form Chapter 3 not only because it is a standalone published work, but also because it naturally leads up to the more difficult two-sample estimating equations problem in Chapter 4.
We note that Chapter 2 is the published paper Tsao and Wu (2014) and Chapter 3 is the published paper Wu and Tsao (2014). To comply with the University of Victoria policy regarding the use of published work in a thesis, and in accordance with copyright agreements between authors and journal publishers, details of these published works are acknowledged at the beginning of those chapters. Chapter 4 is another joint paper, Tsao and Wu (2015), which has been submitted for publication.
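To illustrate the convex hull constraint discussed above and one of the cited remedies, the sketch below computes the ordinary empirical likelihood ratio for a univariate mean, which is infinite outside the convex hull of the data, and the adjusted empirical likelihood of Chen, Variyath and Abraham (2008), which adds a single pseudo-observation so the ratio is finite everywhere. This is not the extended empirical likelihood of Tsao and Wu; the data and evaluation point are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

def neg2_log_el(z):
    """-2 log EL ratio for mean zero of the univariate terms z; +inf outside the convex hull."""
    if z.min() >= 0 or z.max() <= 0:
        return np.inf
    g = lambda lam: np.sum(z / (1.0 + lam * z))
    eps = 1e-10
    lam = brentq(g, -1.0 / z.max() + eps, -1.0 / z.min() - eps)
    return 2.0 * np.sum(np.log1p(lam * z))

def neg2_log_ael(x, mu):
    """Adjusted EL of Chen, Variyath and Abraham (2008): add one pseudo-point so that zero
    always lies inside the convex hull of the augmented estimating terms."""
    z = x - mu
    a_n = max(1.0, 0.5 * np.log(len(x)))
    z_adj = np.append(z, -a_n * z.mean())
    return neg2_log_el(z_adj)

rng = np.random.default_rng(3)
x = rng.exponential(size=15)
mu_outside = x.max() + 1.0                    # a value outside the convex hull of the data
print(neg2_log_el(x - mu_outside))            # inf: ordinary EL is undefined here
print(neg2_log_ael(x, mu_outside))            # finite: the adjustment removes the constraint
```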
|
126 |
Likelihood-based classification of single trees in hemi-boreal forests. Vallin, Simon, January 2015.
Determining the species of individual trees is important for forest management. In this thesis we investigate whether it is possible to discriminate between Norway spruce, Scots pine and deciduous trees from airborne laser scanning data by using a probability density function estimated separately for each species. We estimate the probability density functions in three different ways: by fitting a beta distribution, by histogram density estimation and by kernel density estimation. All of these methods classify single laser returns (and not segments of laser returns). The resulting classification is compared with a reference method based on features extracted from airborne laser scanning data. We measure how well a method performs by its overall accuracy, that is, the proportion of correctly predicted trees. The highest overall accuracy obtained by the methods developed in this thesis, 83.4 percent, is achieved with histogram density estimation. This can be compared with the best result from the reference method, an overall accuracy of 84.1 percent. The fact that we achieve a high proportion of correctly classified trees indicates that methods of this type are usable for identification of tree species.
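As a toy illustration of per-return classification by species-specific density estimates, the sketch below fits a histogram density per species on an assumed [0, 1] feature (for example, relative return height within the crown) and labels each laser return with the species whose estimated density is highest. The feature, training distributions and bin grid are invented for illustration and do not come from the thesis data.

```python
import numpy as np
from collections import Counter

def fit_histogram_density(values, edges):
    """Histogram density estimate on fixed bin edges, with a small floor so no bin is zero."""
    counts, _ = np.histogram(values, bins=edges)
    smoothed = counts + 0.5
    return smoothed / (smoothed.sum() * np.diff(edges))

def classify_returns(returns, densities, edges):
    """Label each laser return with the species whose estimated density is highest there."""
    idx = np.clip(np.digitize(returns, edges) - 1, 0, len(edges) - 2)
    species = list(densities)
    scores = np.vstack([densities[sp][idx] for sp in species])   # (n_species, n_returns)
    return [species[k] for k in scores.argmax(axis=0)]

rng = np.random.default_rng(4)
edges = np.linspace(0.0, 1.0, 21)
# Hypothetical training returns per species on the invented feature scale
train = {"spruce": rng.beta(5, 2, 2000), "pine": rng.beta(2, 2, 2000), "deciduous": rng.beta(3, 5, 2000)}
densities = {sp: fit_histogram_density(v, edges) for sp, v in train.items()}
new_tree = rng.beta(5, 2, 80)                       # 80 returns from an unknown, spruce-like tree
labels = classify_returns(new_tree, densities, edges)
print(Counter(labels))                              # per-return votes; a majority vote labels the tree
```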
|
127 |
Nonparametric Estimation and Inference for the Copula Parameter in Conditional Copulas. Acar, Elif Fidan, 14 January 2011.
The primary aim of this thesis is the elucidation of covariate effects on the dependence structure of random variables in bivariate or multivariate models. We develop a unified approach via a conditional copula model in which the copula is parametric and its parameter varies with the covariate. We propose a nonparametric procedure based on local likelihood to estimate the functional relationship between the copula parameter and the covariate, derive the asymptotic properties of the proposed estimator and outline the construction of pointwise confidence intervals. We also contribute a novel conditional copula selection method based on cross-validated prediction errors and a generalized likelihood ratio-type test to determine whether the copula parameter varies significantly. We derive the asymptotic null distribution of the formal test. Using subsets of the Matched Multiple Birth and Framingham Heart Study datasets, we demonstrate the performance of these procedures via analyses of gestational age-specific twin birth weights and the impact of change in body mass index on the dependence between two consecutive pulse pressures taken from the same subject.
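A minimal version of the local likelihood idea can be sketched as follows: assuming a Clayton copula whose parameter varies with a scalar covariate, a local-constant estimate at a point x0 maximizes a kernel-weighted copula log-likelihood on the log scale. The copula family, Gaussian kernel, bandwidth and simulated data are illustrative assumptions; the thesis's local polynomial formulation, copula selection and testing procedures are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def clayton_logpdf(u, v, theta):
    """Pointwise log-density of the Clayton copula (theta > 0)."""
    s = u ** (-theta) + v ** (-theta) - 1.0
    return (np.log1p(theta) - (1.0 + theta) * (np.log(u) + np.log(v))
            - (2.0 + 1.0 / theta) * np.log(s))

def local_theta(u, v, x, x0, bandwidth):
    """Local-constant estimate of theta(x0): maximize a kernel-weighted copula log-likelihood
    over eta = log(theta), then back-transform to the theta scale."""
    weights = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)     # Gaussian kernel weights
    obj = lambda eta: -np.sum(weights * clayton_logpdf(u, v, np.exp(eta)))
    res = minimize_scalar(obj, bounds=(-4.0, 4.0), method="bounded")
    return np.exp(res.x)

# Simulate uniforms whose Clayton parameter increases with the covariate: theta(x) = 1 + 4x
rng = np.random.default_rng(5)
n = 800
x = rng.uniform(size=n)
theta = 1.0 + 4.0 * x
u = rng.uniform(size=n)
q = rng.uniform(size=n)
v = (u ** (-theta) * (q ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
for x0 in (0.2, 0.5, 0.8):
    print(x0, round(local_theta(u, v, x, x0, bandwidth=0.1), 2))   # compare with 1 + 4 * x0
```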
|
128 |
Statistical method in a comparative study in which the standard treatment is superior to others. Ikeda, Mitsuru (池田 充); Shimamoto, Kazuhiro; Ishigaki, Takeo; Yamauchi, Kazunobu (山内 一信), 11 1900.
No description available.
|
129 |
Towards smooth particle filters for likelihood estimation with multivariate latent variables. Lee, Anthony, 11 1900.
In parametrized continuous state-space models, one can obtain estimates of the likelihood of the data for fixed parameters via the Sequential Monte Carlo methodology. Unfortunately, even if the likelihood is continuous in the parameters, the estimates produced by practical particle filters are not, even when common random numbers are used for each filter. This is because the same resampling step which drastically reduces the variance of the estimates also introduces discontinuities in the particles that are selected across filters when the parameters change.
When the state variables are univariate, a method exists that gives an estimator of the log-likelihood that is continuous in the parameters. We present a non-trivial generalization of this method using tree-based o(N²) (and as low as O(N log N)) resampling schemes that induce significant correlation amongst the selected particles across filters. In turn, this reduces the variance of the difference between likelihood estimates evaluated at different values of the parameters, and the resulting estimator is considerably smoother than that obtained by naively running the filters with common random numbers.
Importantly, in practice our methods require only a change to the resampling operation in the SMC framework, without the addition of any extra parameters, and can therefore be used in any application in which particle filters are already used. In addition, apart from the optional use of interpolation in the schemes, there are no regularity conditions for their use, although certain conditions make them more advantageous.
In this thesis, we first introduce the relevant aspects of the SMC methodology to the task of likelihood estimation in continuous state-space models and present an overview of work related to the task of smooth likelihood estimation. Following this, we introduce theoretically correct resampling schemes that cannot be implemented and the practical tree-based resampling schemes that were developed instead. After presenting the performance of our schemes in various applications, we show that two of the schemes are asymptotically consistent with the theoretically correct but unimplementable methods introduced earlier. Finally, we conclude the thesis with a discussion.
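To make the discontinuity issue concrete, the following is a minimal Python sketch (not the thesis's tree-based schemes) of a bootstrap particle filter for a toy linear-Gaussian model. With common random numbers, the log-likelihood estimate can still jump as the parameter varies because resampling selects different particle indices; sorting the particles before systematic resampling, used here as a crude stand-in for the univariate smooth method mentioned in the abstract, typically makes the sweep less abrupt. The model, parameter values and the sorting device are illustrative assumptions.

```python
import numpy as np

def pf_loglik(y, phi, sigma_x, sigma_y, n_particles=200, seed=0, sort_particles=False):
    """Bootstrap particle filter log-likelihood estimate for the toy model
    X_t = phi * X_{t-1} + N(0, sigma_x^2), Y_t = X_t + N(0, sigma_y^2).
    A fixed seed gives common random numbers across calls with different phi."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma_x, size=n_particles)               # rough initial particle cloud
    loglik = 0.0
    for t in range(len(y)):
        x = phi * x + rng.normal(0.0, sigma_x, size=n_particles)          # propagate
        logw = -0.5 * ((y[t] - x) / sigma_y) ** 2 - np.log(sigma_y * np.sqrt(2.0 * np.pi))
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())                                    # likelihood increment
        p = w / w.sum()
        if sort_particles:
            order = np.argsort(x)            # sorting + systematic resampling makes the selected
            x, p = x[order], p[order]        # particles vary less abruptly with phi
        u = (rng.uniform() + np.arange(n_particles)) / n_particles        # systematic resampling
        idx = np.minimum(np.searchsorted(np.cumsum(p), u), n_particles - 1)
        x = x[idx]
    return loglik

# Simulate data, then sweep phi with common random numbers and compare the two variants
rng = np.random.default_rng(6)
x_t, y = 0.0, []
for _ in range(100):
    x_t = 0.7 * x_t + rng.normal()
    y.append(x_t + rng.normal())
y = np.array(y)
for sort_particles in (False, True):
    print([round(pf_loglik(y, phi, 1.0, 1.0, seed=7, sort_particles=sort_particles), 2)
           for phi in np.linspace(0.6, 0.8, 5)])
```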
|
130 |
Modeling a non-homogeneous Markov process via time transformation. Hubbard, Rebecca Allana, January 2007.
Thesis (Ph.D.), University of Washington, 2007. Includes vita and bibliographical references (p. 177-191).
|