31.
Residual based selection of smoothing parameters. Meise, Monika. January 2004 (PDF)
Duisburg-Essen, University, Diss., 2004.
32.
Some results concerning arrangements of half spaces and relative loss bounds. Forster, Jürgen. January 2002 (PDF)
Bochum, University, Diss., 2002.
33.
Tests zur Modellspezifikation in der nichtlinearen Regression [Tests for model specification in nonlinear regression]. Bartels, Knut. January 2000 (PDF)
Potsdam, University, Diss., 2000.
34.
Klassische orthogonale Polynome und ihre Anwendung in der optimalen Versuchsplanung [Classical orthogonal polynomials and their application in optimal experimental design]. Biedermann, Stefanie. January 2002 (PDF)
Bochum, University, Diss., 2003.
35.
Predictor selection in linear regression: L1 regularization of a subset of parameters and comparison of L1 regularization and stepwise selection. Hu, Qing. January 2007
Thesis (M.S.), Worcester Polytechnic Institute. Keywords: L1 regularization; Lasso; Feature selection; Covariate selection. Includes bibliographical references (leaves 21-22).
36.
Understanding and extending the Li-Duan theorem. Snow, Gregory L. January 2000
Thesis (Ph.D.), University of Washington, 2000. Vita. Includes bibliographical references (p. 99-100).
37.
Ridge Estimation and its Modifications for Linear Regression with Deterministic or Stochastic Predictors. Younker, James. January 2012
A common problem in multiple regression analysis is managing the bias-variance trade-off to maximize the performance of a model. A number of methods with a variety of strengths and weaknesses have been developed over the years to deal with this problem, and among these the ridge estimator is one of the most commonly used. This paper examines the properties of the ridge estimator and several alternatives in both deterministic and stochastic environments. We find the ridge estimator to be effective when the sample size is small relative to the number of predictors. However, we also identify a few cases where some of the alternative estimators can outperform it, and we provide examples of applications where these cases may be relevant.
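The closed form of the ridge estimator makes the regime described in this abstract easy to illustrate. Below is a minimal sketch, not code from the thesis: the simulated design, the fixed penalty `lam`, and all names are assumptions for illustration only.

```python
# Minimal sketch (not from the thesis): ridge vs. OLS when the sample
# size is small relative to the number of predictors. The simulated
# design and the fixed penalty lam are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 30, 20, 5.0           # few observations relative to predictors
beta = rng.normal(size=p)         # true coefficients

X = rng.normal(size=(n, p))
y = X @ beta + rng.normal(scale=1.0, size=n)

# OLS solves (X'X) b = X'y; ridge solves (X'X + lam * I) b = X'y.
xtx, xty = X.T @ X, X.T @ y
b_ols = np.linalg.solve(xtx, xty)
b_ridge = np.linalg.solve(xtx + lam * np.eye(p), xty)

# Ridge accepts some bias in exchange for lower variance, which
# typically reduces overall estimation error in this regime.
print("OLS   error:", np.linalg.norm(b_ols - beta))
print("ridge error:", np.linalg.norm(b_ridge - beta))
```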
38.
A class of operator splitting methods for least absolute shrinkage and selection operator (LASSO) models. Mo, Lili. 01 January 2012
No description available.
39.
Model checking for general parametric regression models. Li, Lingzhu. 19 August 2019
Model checking for regressions has drawn considerable attention in the last three decades. Compared with global smoothing tests, local smoothing tests are more sensitive to high-frequency alternatives, but they can only detect local alternatives distinct from the null model at a much slower rate when the dimension of the predictor is high. When the number of covariates is large, the nonparametric estimation used in local smoothing tests lacks efficiency, and the corresponding tests then have trouble maintaining the significance level and detecting the alternatives. To tackle this issue, we propose two methods under a high but fixed dimension framework. Further, we investigate a model checking test under divergent dimension, where the numbers of covariates and unknown parameters diverge with the sample size n.

The first proposed test is constructed from a typical kernel-based local smoothing test via a projection method. Through projection and integration, the resulting test statistic has a closed form that depends only on the residuals and the distances between sample points. A merit of the developed test is that the distances are easy to compute compared with kernel estimation, especially when the dimension is high, and the construction lets the test inherit some features of local smoothing tests. Although it is ultimately similar in spirit to the Integrated Conditional Moment test, its weight function collects more information from the sample than the Integrated Conditional Moment test does. Simulations and real data analysis demonstrate the power of the test.

The second test, a synthesis of local and global smoothing tests, aims at solving the slow convergence rate caused by nonparametric estimation in local smoothing tests. A significant feature of this approach is that it allows nonparametric estimation-based tests, under the alternatives, to share the merits of existing empirical process-based tests. The proposed hybrid test can detect local alternatives at the fastest possible rate, like the empirical process-based tests, and simultaneously retains the sensitivity to high-frequency alternatives of the nonparametric estimation-based tests. This feature is achieved by utilizing an indicative dimension from the field of dimension reduction. As a by-product, we give a systematic study of a residual-related central subspace for model adaptation, showing when alternative models can be indicated and when they cannot. Numerical studies are conducted to verify its application.

Since data volumes are increasing, the numbers of predictors and unknown parameters may diverge as the sample size n goes to infinity. Model checking under divergent dimension, however, is almost uncharted in the literature. In this thesis, an adaptive-to-model test is proposed to handle the divergent dimension, building on the two previously introduced tests. Theoretical results show that, to obtain asymptotic normality of the parameter estimator, the number of unknown parameters should be of order o(n^{1/3}). As a spinoff, we also demonstrate the asymptotic properties of the estimators of the residual-related central subspace and the central mean subspace under different hypotheses.
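A statistic that "depends only on the residuals and the distances between sample points" is in the spirit of ICM-type tests. As a rough illustration only, the thesis's actual projection-based statistic, weight function, and calibration differ in detail, and every name below is an assumption, a distance-weighted residual statistic with wild-bootstrap calibration might look like this sketch:

```python
# Rough illustrative sketch (not the thesis's test): an ICM-style,
# distance-based model-checking statistic for a linear null model,
# calibrated by a wild bootstrap. Weight function and names assumed.
import numpy as np

def distance_statistic(X, resid):
    # T_n = (1/n) * sum_{i,j} e_i e_j exp(-||x_i - x_j||):
    # depends only on residuals and pairwise distances.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    return resid @ np.exp(-d) @ resid / len(resid)

def check_linear_model(X, y, n_boot=500, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    Z = np.column_stack([np.ones(n), X])          # null: linear mean
    beta = np.linalg.lstsq(Z, y, rcond=None)[0]
    resid = y - Z @ beta
    t_obs = distance_statistic(X, resid)
    # Wild bootstrap: perturb residuals with Rademacher signs,
    # refit under the null, and recompute the statistic.
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        y_star = Z @ beta + resid * rng.choice([-1.0, 1.0], size=n)
        r_star = y_star - Z @ np.linalg.lstsq(Z, y_star, rcond=None)[0]
        t_boot[b] = distance_statistic(X, r_star)
    return t_obs, np.mean(t_boot >= t_obs)        # statistic, p-value

# Example: a quadratic signal should be flagged under the linear null.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = X[:, 0] + 0.5 * X[:, 0] ** 2 + rng.normal(scale=0.5, size=100)
print(check_linear_model(X, y))
```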
40.
Two problems involving regression analysis. Pratley, Kenneth G. (Kenneth George). January 1969
No description available.