In obtaining a regression fit to a set of data, ordinary least squares regression depends directly on the parametric model formulated by the researcher. If this model is incorrect, a least squares analysis may be misleading. Alternatively, nonparametric regression (kernel or local polynomial regression, for example) has no dependence on an underlying parametric model, but instead depends entirely on the distances between regressor coordinates and the prediction point of interest. This procedure avoids the need for a reliable model but, in using no information from the researcher, may fit irregular patterns in the data. A proper combination of these two regression procedures can overcome their respective problems. The situation considered is one where the researcher has an idea of which model should explain the behavior of the data, but this model is not adequate throughout the entire range of the data. An extension of partial linear regression and two methods of model robust regression are developed and compared in this context. These methods involve parametric fits to the data and nonparametric fits to either the data or the residuals. The two fits are then combined in the most efficient proportions via a mixing parameter. Performance is evaluated in terms of bias and variance.
Degree | Ph. D. |
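The residual-based combination described in the abstract can be sketched in Python. This is a minimal illustration, not the dissertation's actual procedure: the function names, the Gaussian kernel, and the fixed mixing parameter `lam` and bandwidth `h` are all illustrative assumptions (the dissertation selects the mixing proportion data-drivenly for efficiency).

```python
import numpy as np

def parametric_fit(x, y):
    """Ordinary least squares fit of the researcher's model,
    here a simple linear model y = b0 + b1*x (an assumed example)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return lambda x0: beta[0] + beta[1] * x0

def kernel_fit(x, y, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel and
    bandwidth h: a weighted average driven only by distances from x0."""
    def predict(x0):
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)
        return np.sum(w * y) / np.sum(w)
    return predict

def mixed_predict(x, y, x0, lam=0.5, h=0.3):
    """Residual-based mixing: the parametric fit plus lam times a
    nonparametric fit to the parametric residuals. lam=0 recovers the
    pure parametric fit; lam=1 adds the full residual correction."""
    par = parametric_fit(x, y)
    resid = y - par(x)
    npar_resid = kernel_fit(x, resid, h)
    return par(x0) + lam * npar_resid(x0)
```

When the parametric model is adequate, the residuals carry little signal and small `lam` is efficient; where the model fails over part of the range, the residual fit supplies the correction, which is the trade-off the mixing parameter governs.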
Identifier | oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/49937 |
Date | January 1995 |
Creators | Mays, James Edward |
Contributors | Statistics, Birch, Jeffrey B., Myers, Raymond, Smith, Eric P., Reynolds, Marion R. Jr., Coakley, Clint W. |
Publisher | Virginia Polytechnic Institute and State University |
Source Sets | Virginia Tech Theses and Dissertation |
Language | English |
Detected Language | English |
Type | Dissertation, Text |
Format | x, 200 leaves, application/pdf, application/pdf |
Rights | In Copyright, http://rightsstatements.org/vocab/InC/1.0/ |
Relation | OCLC# 34347967 |