351
Parameter estimation for discretely observed continuous-time Markov chains. Cramer, Roxy D. January 2001.
This thesis develops a method for estimating the parameters of continuous-time Markov chains discretely observed by Poisson sampling. The inference problem in this context is usually simplified by assuming that the process is time-homogeneous and that it can be observed continuously over some observation period. But many real problems are not homogeneous; moreover, in practice it is often difficult to observe random processes continuously. In this work, the Dynkin Identity motivates a martingale estimating equation which is no more complicated a function of the parameters than the infinitesimal generator of the chain. The time-dependent generators of inhomogeneous chains therefore present no new obstacles. The Dynkin Martingale estimating equation derived here applies to processes discretely observed according to an independent Poisson process. Random observation of this kind alleviates the so-called aliasing problem, which can arise when continuous-time processes are observed discretely. Theoretical arguments exploit the martingale structure to obtain conditions ensuring strong consistency and asymptotic normality of the estimators. Simulation studies of a single-server Markov queue with sinusoidal arrivals test the performance of the estimators under different sampling schemes and against the benchmark maximum likelihood estimators based on continuous observation.
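The sampling scheme in the simulation study can be made concrete with a short sketch. The code below is only an illustration, not the thesis estimator: it simulates a single-server Markov queue whose arrival rate varies sinusoidally, using uniformization (thinning against a dominating rate), and then records the queue length only at the points of an independent Poisson observation process. All rates, the horizon, and the function names are assumptions chosen for the example.

```python
# Hypothetical sketch (not the thesis estimator): simulate an M(t)/M/1 queue
# with sinusoidal arrival rate by uniformization, then observe it at the
# points of an independent Poisson sampling process.
import numpy as np

rng = np.random.default_rng(0)

def simulate_queue(T, a=1.0, b=0.5, omega=2 * np.pi / 10, mu=2.0):
    """Simulate the queue length on [0, T]; lambda(t) = a + b*sin(omega*t) <= a + b."""
    Lambda = a + b + mu                      # dominating total event rate
    t, n = 0.0, 0
    times, states = [0.0], [0]
    while True:
        t += rng.exponential(1.0 / Lambda)
        if t > T:
            break
        u = rng.uniform(0.0, Lambda)
        lam_t = a + b * np.sin(omega * t)
        if u < lam_t:                                   # accepted arrival
            n += 1
        elif u < lam_t + (mu if n > 0 else 0.0):        # accepted departure
            n -= 1
        else:                                           # phantom (thinned) event
            continue
        times.append(t)
        states.append(n)
    return np.array(times), np.array(states)

def poisson_sample(times, states, T, rho=0.5):
    """Observe the queue only at the points of a rate-rho Poisson process on [0, T]."""
    obs_times = np.cumsum(rng.exponential(1.0 / rho, size=int(3 * rho * T)))
    obs_times = obs_times[obs_times <= T]
    idx = np.searchsorted(times, obs_times, side="right") - 1   # last event before each observation
    return obs_times, states[idx]

times, states = simulate_queue(T=200.0)
obs_t, obs_x = poisson_sample(times, states, T=200.0)
```

The pairs (obs_t, obs_x) are the kind of discretely observed data to which a martingale estimating equation of the sort described above would then be applied.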
352
On the operating characteristics of some non-parametric methodologies for the classification of distributions by tail behavior. Ott, Richard Charles. January 2005.
New methods for classifying tails of probability distributions based on data are proposed. Some methods apply the nonparametric theories of Rojo [35] and Schuster [36] and differ from classical extreme value theory and other well-established methods. All the methods use the extreme spacing of the data, the difference between the largest and second largest values. The results are then compared, on the basis of power properties, with the classical technique of a Points Over Threshold model based on the Generalized Pareto Distribution (GPD).
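As a rough illustration of the two quantities being compared, the sketch below computes the extreme spacing of a sample and, for reference, fits a GPD to exceedances over a high threshold by maximum likelihood. This is not the thesis procedure; the sample, threshold, and distribution are assumptions.

```python
# Illustrative sketch: extreme spacing ES = X(n) - X(n-1) for a sample, plus a
# classical Points Over Threshold GPD fit to exceedances for comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=500)          # e.g. an Exp(1) sample

# Extreme spacing: difference between the two largest order statistics.
x_sorted = np.sort(x)
extreme_spacing = x_sorted[-1] - x_sorted[-2]

# Points Over Threshold: GPD fit to exceedances over the 90th percentile.
u = np.quantile(x, 0.90)
exceedances = x[x > u] - u
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

print(f"ES = {extreme_spacing:.3f}, GPD shape estimate = {shape:.3f}")
```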
The following topics are the foundation of this thesis:
Chapter 1. Review of classical extreme value theory and discussion on the class of medium-tailed distributions.
Chapter 2. Review of the tail classification schemes of Parzen, Schuster, and Rojo; the latter two suggest the Extreme Spacing (ES) as a possible classifying instrument. Additional subcategorizations are also provided for the schemes of Schuster and Rojo.
Chapter 3. Review of estimation methods for the Points Over Threshold GPD parameters for classification purposes. A Monte Carlo study classifying tails of many common distributions using the GPD by way of maximum likelihood is also provided.
Chapter 4. Three classification tests based on the ES are provided. The first is a test to decide whether a sample originates from a completely specified distribution such as Exp(1). The second classifies whether data originated from an exponential distribution with unknown parameter. The third classifies an underlying distribution as short-, medium-, or long-tailed. Also discussed is the potential benefit of blocking the data before applying the above-mentioned tests.
Chapter 5. Classifying specific data sets by way of the new methods. Some of the new ES methods may be applicable to the data when classical methods are inapplicable, for example when the GPD maximum likelihood numerical algorithm does not converge to yield a shape parameter estimate or when the variance of the shape parameter cannot be estimated since the parameter estimate is close to a parameter space endpoint. Even when classical methods are applicable, these tests can give a more thorough understanding of the tail behavior of the underlying distribution.
353
Aspects of functional data inference and its applications. Lee, Jong Soo. January 2006.
We consider selected topics in estimation and testing of functional data. In many applications of functional data analysis, we aim to compare sample functional data from two or more populations. However, the raw functional data often contain noise, some of which can be large outliers. Hence, we must first smooth and estimate the functional data, but the existing methods for robust smoothing parameter selection are unsatisfactory. We present an efficient way to compute a smoothing parameter which can be applied generally to most robust smoothers. Then, we propose a procedure for testing pointwise differences of functional data in the two-sample framework. Our proposed method is a generalization of Hotelling's T^2 test, and we utilize the adaptive truncation technique of Fan and Lin (1998) for dimension reduction and development of the test statistic. We show that our method performs well when compared with existing testing procedures. Furthermore, we propose a method to detect the significantly different regions between curves. Once we determine that the sample curves from the two or more populations are significantly different overall, we want to look at the local regions of the curves and see where the differences occur. We present a modification of the multiple testing procedure of Westfall and Young (1993) for this purpose. Finally, we apply our proposed methods to data from a fluorescence spectroscopic device. The fluorescence spectroscopic device is a medical device designed for early detection of cervical cancer, and the output from the device is functional data, which makes the analysis challenging. The problems posed by this application have motivated the development of the methodologies in the present work, and we demonstrate that our methods work well in this application.
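The flavor of the pointwise two-sample comparison can be conveyed with a simplified sketch. This is not the thesis method (it omits the robust smoothing step and the adaptive truncation of Fan and Lin); it only shows pointwise test statistics between two groups of curves on a common grid combined with a Westfall-Young style max-statistic permutation adjustment. Group sizes, the grid, and the number of permutations are assumptions.

```python
# Simplified sketch: pointwise two-sample t statistics between groups of curves,
# with a max-statistic permutation adjustment to flag locally different regions.
import numpy as np

rng = np.random.default_rng(2)
t_grid = np.linspace(0, 1, 100)
group1 = np.sin(2 * np.pi * t_grid) + rng.normal(0, 0.3, (20, 100))
group2 = np.sin(2 * np.pi * t_grid) + 0.4 * t_grid + rng.normal(0, 0.3, (25, 100))

def pointwise_t(a, b):
    """Two-sample t statistic at every grid point (unequal-variance form)."""
    va, vb = a.var(axis=0, ddof=1), b.var(axis=0, ddof=1)
    se = np.sqrt(va / a.shape[0] + vb / b.shape[0])
    return (a.mean(axis=0) - b.mean(axis=0)) / se

t_obs = np.abs(pointwise_t(group1, group2))

# Westfall-Young style: permute group labels, record the max |t| over the grid.
pooled = np.vstack([group1, group2])
n1 = group1.shape[0]
n_perm = 999
max_null = np.empty(n_perm)
for i in range(n_perm):
    idx = rng.permutation(pooled.shape[0])
    max_null[i] = np.abs(pointwise_t(pooled[idx[:n1]], pooled[idx[n1:]])).max()

# Adjusted p-value at each grid point; small values mark significantly different regions.
p_adj = (1 + (max_null[None, :] >= t_obs[:, None]).sum(axis=1)) / (n_perm + 1)
significant_region = t_grid[p_adj < 0.05]
```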
354
Galactic noise and the distance to Centaurus A. Shopbell, Patrick Lynn. January 1993.
Deep B and R band photographic plates of the giant radio elliptical NGC 5128 (Centaurus A), obtained under photometric conditions at the f/3.3 prime focus of the AAT 3.9m telescope, clearly show surface brightness fluctuations from the luminous stellar population. The variance of such fluctuations has been used by Tonry and collaborators to determine distances to nearby galaxies from CCD imagery. This thesis exploits the distinct statistical properties of photographic plates to perform a similar analysis employing this inherently nonlinear detector. The effects of threshold and saturation are discussed in detail. The final derived power spectra can be understood in terms of the noise components previously identified plus an additional component arising from fluctuations in the grain density. The analysis derives a distance to Centaurus A of 3.8 ± 1.1 Mpc, in close agreement with recent determinations. As a statistically independent test, this work provides strong support for the fluctuation distance method.
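A rough sketch of the kind of fluctuation measurement involved is given below, with a synthetic image standing in for a scanned plate. It is only an outline of the general approach (subtract a smooth galaxy model, normalize the residuals, and examine their azimuthally averaged power spectrum), not the photographic analysis developed in the thesis; every numerical choice in it is an assumption.

```python
# Rough sketch: residual image after a smooth galaxy model, and its azimuthally
# averaged power spectrum, the quantity whose components are modeled above.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
image = 100.0 + rng.normal(0, 1.0, (256, 256))      # stand-in for plate data

galaxy_model = gaussian_filter(image, sigma=16)     # smooth mean surface brightness
residual = (image - galaxy_model) / np.sqrt(galaxy_model)

power = np.abs(np.fft.fftshift(np.fft.fft2(residual))) ** 2 / residual.size

# Azimuthal average of the 2-D power spectrum as a function of wavenumber.
ny, nx = power.shape
y, x = np.indices(power.shape)
r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
counts_r = np.bincount(r.ravel())
radial_profile = np.bincount(r.ravel(), weights=power.ravel()) / np.maximum(counts_r, 1)
```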
355
A STUDY OF ESTIMATION PROBLEMS INVOLVING MULTIPLE TRACES. HOARD, ROBERT EARL. January 1973.
No description available.
356
NONPARAMETRIC PROBABILITY DENSITY ESTIMATION BY OPTIMIZATION THEORETIC TECHNIQUES. SCOTT, DAVID WARREN. January 1976.
No description available.
357
AN ADAPTIVE ORTHOGONAL-SERIES ESTIMATOR FOR PROBABILITY DENSITY FUNCTIONS. ANDERSON, G. LEIGH. January 1978.
No description available.
358
SOME RESULTS FOR ESTIMATING BIVARIATE DENSITIES USING KERNEL, ORTHOGONAL SERIES AND PENALIZED LIKELIHOOD PROCEDURES. NEZAMES, DONNA DALANGAUSKAS. January 1980.
In this work, three extensions of univariate nonparametric probability density estimators into two dimensions are analyzed in terms of statistical and numerical properties. The asymptotically optimal smoothing parameters of the bivariate kernel estimate are derived. Since the optimal smoothing parameters depend on prior knowledge of the underlying density function, an objective method based on functional iteration is investigated.
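For orientation, the sketch below shows a bivariate product Gaussian kernel estimate with per-coordinate normal-reference starting bandwidths; the data-based functional iteration described above would refine such starting values. The kernel choice, rule-of-thumb constants, and data are assumptions made for illustration.

```python
# Minimal sketch: bivariate product Gaussian kernel density estimate with
# normal-reference starting bandwidths (the pilot values an iteration would refine).
import numpy as np

rng = np.random.default_rng(4)
data = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], size=400)
n = data.shape[0]

# Normal-reference bandwidths, h_j ~ sigma_j * n^(-1/6) for two dimensions.
h = data.std(axis=0, ddof=1) * n ** (-1.0 / 6.0)

def kde2(points, data, h):
    """Evaluate the bivariate product Gaussian kernel estimate at `points`."""
    diffs = (points[:, None, :] - data[None, :, :]) / h        # shape (m, n, 2)
    kernels = np.exp(-0.5 * (diffs ** 2).sum(axis=2)) / (2 * np.pi * h.prod())
    return kernels.mean(axis=1)

grid = np.array([[0.0, 0.0], [1.0, 1.0]])
print(kde2(grid, data, h))
```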
The second nonparametric estimator is formulated from a weighted orthogonal series. A data-based procedure that selects the smoothing parameters by minimizing an approximate expression for the integrated mean square error is investigated. The estimate is proven to be consistent, with an asymptotic rate of convergence equivalent to that of the kernel method. A hybrid procedure is proposed that estimates the asymptotically optimal smoothing parameters in the kernel estimate by using the orthogonal series estimate in place of the density function.
The third estimate, a discrete maximum penalized-likelihood estimate, is proven to exist, to be unique, and to be pointwise consistent almost surely. A procedure to implement the scheme numerically is presented. A preliminary simulation study compares its integrated mean square error with that of the kernel method.
359
NONPARAMETRIC MODE ESTIMATION FOR HIGHER DIMENSIONAL DENSITIES. BOSWELL, STEVEN BLAKE. January 1984.
In this study a family of estimators is developed for local maxima, or modes, of a multivariate probability density function. The mode estimators are computationally feasible iterative optimization procedures utilizing nonparametric techniques of probability density estimation which generalize easily to sample spaces of arbitrary dimension. The estimators are proven to be strongly consistent for any distribution possessing mild continuity properties.
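One familiar member of this general class, shown here only as a hedged illustration rather than as the thesis estimators, is a mean-shift style iteration on a Gaussian kernel density estimate: the kernel-weighted mean is iterated until it converges to a nearby local mode, and the scheme extends directly to any dimension. The bandwidth, sample, and starting points below are assumptions.

```python
# Hedged sketch: mean-shift iteration toward a local mode of a kernel density estimate.
import numpy as np

rng = np.random.default_rng(5)
# Bimodal sample in R^2.
data = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])

def mean_shift_mode(x0, data, h=0.8, tol=1e-6, max_iter=500):
    """Iterate the kernel-weighted mean until convergence to a local mode."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        w = np.exp(-0.5 * ((data - x) ** 2).sum(axis=1) / h ** 2)
        x_new = (w[:, None] * data).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

# Start searches from several sample points to look for multiple modes.
modes = np.array([mean_shift_mode(p, data) for p in data[::50]])
print(np.round(modes, 2))
```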
Three specific mode estimators are evaluated by extensive Monte Carlo testing upon samples from both classical unimodal and nonstandard unimodal and bimodal distributions. Detection of the presence of multiple modes is a matter of special concern in many investigations. Thus a global strategy is developed and tested to demonstrate the potential of the estimators for complete characterization of sample modality.
360
HISTOGRAM ESTIMATORS OF BIVARIATE DENSITIES (MULTIVARIATE, STATISTICS). HUSEMANN, JOYCE ANN STEVENS. January 1986.
One-dimensional fixed-interval histogram estimators of univariate probability density functions are less efficient than the analogous variable-interval estimators which are constructed from intervals whose lengths are determined by the criterion of integrated mean squared error (IMSE) minimization. Similarly, two-dimensional fixed-cell-size histogram estimators of bivariate probability density functions are less efficient than variable-cell-size estimators whose cell sizes are determined from IMSE minimization. Only estimators whose cell sides are parallel to the coordinate axes are examined.
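The fixed-cell-size baseline against which the variable-cell meshes are compared can be sketched directly (the optimal variable meshes themselves require the IMSE-minimizing partitions developed in this work). The sketch below builds a bivariate histogram density estimate with per-coordinate bin widths from a normal-reference rule; the constants and data are assumptions.

```python
# Sketch of the fixed-cell-size baseline: bivariate histogram density estimate
# with per-coordinate bin widths from a normal-reference rule.
import numpy as np

rng = np.random.default_rng(6)
data = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 0.5]], size=1000)
n = data.shape[0]

# Normal-reference bin widths for a bivariate histogram, h_j ~ 3.5 * sigma_j * n^(-1/4).
h = 3.5 * data.std(axis=0, ddof=1) * n ** (-0.25)
edges = [np.arange(data[:, j].min(), data[:, j].max() + h[j], h[j]) for j in range(2)]

counts, xedges, yedges = np.histogram2d(data[:, 0], data[:, 1], bins=edges)
density = counts / (n * h[0] * h[1])     # density estimate on each fixed-size cell
```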
The estimators are classified according to the functional dependence of their cell dimensions upon x and y: each cell dimension of the Minimally Restricted Mesh depends upon both x and y; one cell dimension of the Semi-fixed-dimension Mesh is fixed, and the other depends upon either x alone or y alone; one cell dimension of the Variable-dimension Mesh I depends upon x and the other upon y; one cell dimension of the Variable-dimension Mesh II depends upon x alone or y alone and the other depends upon both x and y. The Minimally Restricted Mesh results in the smallest IMSE of the four types, but is not implementable. The other meshes are implementable and are listed above in order of decreasing IMSE. Random vectors from Dirichlet, mixed bivariate normal and elliptical bivariate normal distributions were generated and used to construct optimal histograms. The Variable-dimension Mesh II produced histograms having IMSEs from 20 to 90 percent smaller than those from histograms based upon optimal fixed-dimension meshes. The most substantial improvements were observed for mixed bivariate normal densities having strongly unequal variances. Modest improvements (20%) were observed for skewed densities and slightly elliptical densities, but no improvements were observed in cases of highly elliptical densities whose axes were rotated 45 degrees from the coordinate axes.