11
Robust inferential procedures applied to regression /Agard, David B., January 1990 (has links)
Thesis (Ph. D.)--Virginia Polytechnic Institute and State University, 1990. / Vita. Abstract. Includes bibliographical references (leaves 159-161). Also available via the Internet.
12
Comparison and evaluation of the effect of outliers on ordinary least squares and Theil nonparametric regression with the evaluation of standard error estimates for the Theil nonparametric regression method /Wasser, Thomas E. January 1998 (has links)
Thesis (Ph. D.)--Lehigh University, 1999. / Includes vita. Bibliography: leaves 68-69.
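The contrast studied in the record above, the behavior of ordinary least squares versus Theil's nonparametric slope estimator under outliers, can be sketched in a few lines. This is a minimal illustration using SciPy's `theilslopes`; the data, seed, and outlier placement are invented for demonstration and are not from the thesis:

```python
import numpy as np
from scipy.stats import theilslopes

# Simulated line y = 2x + 1 with small noise, then one gross outlier.
rng = np.random.default_rng(0)
x = np.arange(20.0)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, size=20)
y[-1] = 100.0  # single contaminated observation

# Ordinary least squares slope (pulled toward the outlier).
ols_slope = np.polyfit(x, y, 1)[0]

# Theil's estimator: median of all pairwise slopes, resistant to outliers.
theil_slope, theil_intercept, lo, hi = theilslopes(y, x)

print(ols_slope, theil_slope)  # OLS drifts well above 2; Theil stays near 2
print((lo, hi))                # confidence interval on the Theil slope
```

A single bad point among twenty is enough to shift the OLS slope substantially, while the median of pairwise slopes is essentially unchanged.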
13
Linear mixed models with equivalent predictors /Möls, Märt, January 2004 (has links) (PDF)
Thesis (doctoral)--Tartu University, 2004. / Vitae. Includes bibliographical references.
14
Multidimensional spectral estimation using iterative methods /Wester, Roderick C. January 1990 (has links) (PDF)
Thesis (M.S. in Electrical Engineering)--Naval Postgraduate School, June 1990. / Thesis Advisor(s): Therrien, Charles W. ; Tummala, Murali. "June 1990." Description based on title screen as viewed on October 15, 2009. DTIC Identifier(s): Iterations, Covariance, Regression Analysis, Estimates. Author(s) subject terms: Autoregressive Spectral Estimation, Covariance Method. Includes bibliographical references (p. 35). Also available in print.
15
Some aspects of interval estimation in linear regression models /Koschat, Martin Anselm. January 1984 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1984. / Typescript. Vita. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references (leaves 107-108).
16
Predictor selection in linear regression: L1 regularization of a subset of parameters, and comparison of L1 regularization and stepwise selection /Hu, Qing. January 2007 (has links)
Thesis (M.S.)--Worcester Polytechnic Institute. / Keywords: L1 regularization; Lasso; Feature selection; Covariate selection. Includes bibliographical references (leaves 21-22).
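The keywords in the record above (L1 regularization, Lasso, covariate selection) refer to the standard property of the L1 penalty: it shrinks some coefficients exactly to zero and so performs predictor selection. A minimal sketch with scikit-learn follows; the data, penalty strength, and seed are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Ten candidate predictors; only the first two actually drive the response.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0.0, 0.5, size=200)

# L1-penalized least squares: minimize ||y - Xb||^2 / (2n) + alpha * ||b||_1
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(np.abs(model.coef_) > 1e-8)
print(model.coef_)
print(selected)  # the L1 penalty zeroes out the irrelevant coefficients
```

Stepwise selection, by contrast, adds or drops one predictor at a time based on a fit criterion; the Lasso performs selection and estimation jointly through a single convex optimization.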
17
Understanding and extending the Li-Duan theorem /Snow, Gregory L. January 2000 (has links)
Thesis (Ph. D.)--University of Washington, 2000. / Vita. Includes bibliographical references (p. 99-100).
18
A class of operator splitting methods for least absolute shrinkage and selection operator (LASSO) models /Mo, Lili 01 January 2012 (has links)
No description available.
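The record above carries no abstract, but the class of methods named in its title is well established: operator splitting treats the smooth least-squares term and the nonsmooth L1 penalty separately. The simplest instance is ISTA (the proximal gradient method), sketched below on invented toy data; this is a generic illustration of the technique, not the thesis's own algorithms:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """ISTA for the LASSO: minimize 0.5 * ||A x - b||^2 + lam * ||x||_1.

    Each iteration splits the objective: a gradient step on the smooth
    least-squares term, then the proximal (soft-thresholding) step for
    the L1 term.
    """
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy problem: sparse ground truth, mildly noisy observations.
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 20))
x_true = np.zeros(20)
x_true[[0, 5]] = [2.0, -3.0]
b = A @ x_true + rng.normal(0.0, 0.05, size=50)

x_hat = ista(A, b, lam=1.0)
print(np.flatnonzero(np.abs(x_hat) > 1e-3))  # roughly recovers the true support
```

Faster variants (FISTA, ADMM) follow the same splitting idea with acceleration or a different decomposition of the two terms.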
19
Model checking for general parametric regression models /Li, Lingzhu 19 August 2019 (has links)
Model checking for regressions has drawn considerable attention in the last three decades. Compared with global smoothing tests, local smoothing tests, which are more sensitive to high-frequency alternatives, can only detect local alternatives distinct from the null model at a much slower rate when the dimension of the predictor is high. When the number of covariates is large, the nonparametric estimation used in local smoothing tests lacks efficiency, and the corresponding tests then have trouble maintaining the significance level and detecting the alternatives. To tackle this issue, we propose two methods under a high- but fixed-dimension framework. Further, we investigate a model checking test under divergent dimension, where the numbers of covariates and unknown parameters diverge with the sample size n.

The first proposed test is built upon a typical kernel-based local smoothing test via a projection method. Through projection and integration, the resulting test statistic has a closed form that depends only on the residuals and the distances between sample points. A merit of the developed test is that the distances are easy to compute compared with kernel estimation, especially when the dimension is high. Moreover, the test inherits some features of local smoothing tests owing to its construction. Although it is ultimately similar in spirit to the Integrated Conditional Moment test, its weight function collects more information from the sample than the Integrated Conditional Moment test does. Simulations and real data analysis demonstrate the power of the test.

The second test, a synthesis of local and global smoothing tests, aims to overcome the slow convergence rate caused by nonparametric estimation in local smoothing tests. A significant feature of this approach is that it allows nonparametric estimation-based tests, under the alternatives, to also share the merits of existing empirical process-based tests. The proposed hybrid test can detect local alternatives at the fastest possible rate, like the empirical process-based tests, while simultaneously retaining the sensitivity to high-frequency alternatives of the nonparametric estimation-based ones. This is achieved by utilizing an indicative dimension from the field of dimension reduction. As a by-product, we give a systematic study of a residual-related central subspace for model adaptation, showing when alternative models can be indicated and when they cannot. Numerical studies are conducted to verify its application.

Since data volumes are ever increasing, the numbers of predictors and unknown parameters may diverge as the sample size n goes to infinity. Model checking under divergent dimension, however, is almost uncharted in the literature. In this thesis, an adaptive-to-model test is proposed to handle divergent dimension, building on the two previously introduced tests. Theoretical results show that, to obtain asymptotic normality of the parameter estimator, the number of unknown parameters should be of order o(n^(1/3)). As a spinoff, we also establish the asymptotic properties of the estimators of the residual-related central subspace and the central mean subspace under different hypotheses.
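The kernel-based local smoothing tests the abstract builds on can be illustrated with a generic Zheng (1996)-type statistic, which averages products of residuals at nearby design points; it is near zero when the parametric model holds and systematically positive when it does not. The sketch below is illustrative of the general construction only, not the thesis's actual statistics; the data, bandwidth, and seed are invented:

```python
import numpy as np

def zheng_statistic(x, resid, h):
    """Zheng-type local smoothing lack-of-fit statistic with a Gaussian kernel:
    T = sum_{i != j} K_h(x_i - x_j) e_i e_j / (n (n - 1) h).
    Large positive values indicate residuals that vary smoothly with x,
    i.e. a misspecified mean function."""
    n = len(x)
    d = (x[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * d ** 2)
    np.fill_diagonal(K, 0.0)  # exclude i == j terms
    return (resid @ K @ resid) / (n * (n - 1) * h)

def linear_residuals(x, y):
    """Residuals from fitting the null (linear) model by least squares."""
    b, a = np.polyfit(x, y, 1)
    return y - (a + b * x)

rng = np.random.default_rng(3)
x = rng.uniform(-2.0, 2.0, 300)
# Null: the data really are linear. Alternative: quadratic signal, linear fit.
y_null = 1.0 + 2.0 * x + rng.normal(0.0, 0.3, 300)
y_alt = 1.0 + 2.0 * x + x ** 2 + rng.normal(0.0, 0.3, 300)

t_null = zheng_statistic(x, linear_residuals(x, y_null), h=0.3)
t_alt = zheng_statistic(x, linear_residuals(x, y_alt), h=0.3)
print(t_null, t_alt)  # far larger under the misspecified fit
```

With multivariate predictors the kernel must smooth in many dimensions at once, which is exactly the efficiency loss the abstract describes and the motivation for its projection- and dimension-reduction-based constructions.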
20
Two problems involving regression analysis /Pratley, Kenneth G. (Kenneth George) January 1969 (has links)
No description available.