71 |
A combination procedure of universal kriging and logistic regression : a thesis presented to the faculty of the Graduate School, Tennessee Technological University / Wu, Songfei. January 2008 (has links)
Thesis (M.S.)--Tennessee Technological University, 2008. / Title from title page screen (viewed on Aug. 26, 2009). Bibliography: leaves 24-26.
|
72 |
On sliced methods in dimension reduction / Li, Yingxing. January 2005 (has links)
Thesis (M. Phil.)--University of Hong Kong, 2005. / Title proper from title frame. Also available in printed format.
|
73 |
Nonparametric regression / Whitney, Paul David. January 1984 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1984. / Typescript. Vita. Includes bibliographical references (leaves 92-98).
|
74 |
Regression with correlated errors / Esan, E. O. January 1981 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1981. / Typescript. Vita. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references (leaves 69-71).
|
75 |
Asymptotic distributions of Buckley-James estimator / Kong, Fanhui. January 2005 (has links)
Thesis (Ph. D.)--State University of New York at Binghamton, Department of Mathematical Sciences, 2005. / Includes bibliographical references.
|
76 |
A novel spectropolarimeter for determination of sucrose and other optically active samples / Calleja-Amador, Carlos Enrique. Busch, Kenneth W. Busch, Marianna A. January 2006 (has links)
Thesis (M.S.)--Baylor University, 2006. / Includes bibliographical references (p. 77-80).
|
77 |
Analysis of clustered data : a combined estimating equations approach / Stoner, Julie Ann. January 2000 (has links)
Thesis (Ph. D.)--University of Washington, 2000. / Vita. Includes bibliographical references (leaves 147-153).
|
78 |
A model selection approach to partially linear regression / Bunea, Florentina. January 2000 (has links)
Thesis (Ph. D.)--University of Washington, 2000. / Vita. Includes bibliographical references (p. 140-145).
|
79 |
Variable selection in high dimensional semi-varying coefficient models / Chen, Chi. 06 September 2013 (has links)
With the development of computing and sampling technologies, high dimensionality has become an important characteristic of commonly used scientific data, such as data from bioinformatics, information engineering, and the social sciences. The varying coefficient model is a flexible and powerful statistical model for exploring dynamic patterns in many scientific areas. It is a natural extension of classical parametric models with good interpretability, and it is becoming increasingly popular in data analysis. The main objective of this thesis is to apply the varying coefficient model to the analysis of high dimensional data, and to investigate the properties of regularization methods for high-dimensional varying coefficient models. We first discuss how to apply local polynomial smoothing and the smoothly clipped absolute deviation (SCAD) penalized method to estimate varying coefficient models when the dimension of the model diverges with the sample size. Based on the nonconcave penalized method and local polynomial smoothing, we propose a regularization method that selects significant variables from the model and estimates the corresponding coefficient functions simultaneously. Importantly, our proposed method can also identify constant coefficients at the same time. We investigate the asymptotic properties of our proposed method and show that it has the so-called "oracle property." We then apply the nonparametric independence screening (NIS) method to varying coefficient models with ultra-high-dimensional data. Based on the marginal varying coefficient model estimation, we establish the sure independent screening property under some regularity conditions for our proposed sure screening method. Combined with our proposed regularization method, we can systematically handle high-dimensional or ultra-high-dimensional data using varying coefficient models. The nonconcave penalized method is a very effective variable selection method.
However, maximizing such a penalized likelihood function is computationally challenging, because the objective functions are nondifferentiable and nonconcave. The local linear approximation (LLA) and local quadratic approximation (LQA) are two popular algorithms for such optimization problems. In this thesis, we revisit these two algorithms. We investigate the convergence rate of LLA and show that the rate is linear. We also study the statistical properties of the one-step estimate based on LLA under a generalized statistical model with a diverging number of dimensions. We propose a modified version of LQA to overcome its drawbacks in high-dimensional models; our method avoids computing the inverse of the Hessian matrix in the modified Newton-Raphson algorithm based on LQA. The proposed methods are evaluated through numerical studies and a real-data case study in Chapter 5.
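The abstract above turns on the SCAD penalty and its local linear approximation. As a minimal sketch (not the thesis's own code), the standard SCAD penalty and its derivative can be written as below; the defaults lam=1.0 and a=3.7 follow the common convention in the penalized-likelihood literature, and the function names are illustrative only:

```python
def scad_penalty(t, lam=1.0, a=3.7):
    """SCAD penalty p_lambda(|t|): linear near zero, quadratic blend,
    then constant, so large coefficients are not shrunk."""
    t = abs(t)
    if t <= lam:
        return lam * t
    elif t <= a * lam:
        return -(t * t - 2 * a * lam * t + lam * lam) / (2 * (a - 1))
    else:
        return (a + 1) * lam * lam / 2

def scad_derivative(t, lam=1.0, a=3.7):
    """p'_lambda(|t|): the weight each coefficient receives in one LLA step."""
    t = abs(t)
    if t <= lam:
        return lam
    elif t <= a * lam:
        return (a * lam - t) / (a - 1)
    else:
        return 0.0

# LLA replaces p_lambda(|b|) by p'_lambda(|b0|) * |b| around a current
# estimate b0, so each iteration reduces to a weighted-lasso problem;
# coefficients with |b0| > a*lam get weight 0 and are left unpenalized,
# which is the mechanism behind the oracle property mentioned above.
weights = [scad_derivative(b0) for b0 in (0.2, 1.5, 5.0)]
```

This shows why the objective is nonconcave (the penalty flattens out for large coefficients) and why LLA is attractive: the linearized subproblem is convex and needs no Hessian inversion, which is the LQA drawback the abstract refers to.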
|
80 |
Variable selection for high dimensional transformation model / Lee, Wai Hong. 01 January 2010 (has links)
No description available.
|