Master of Science / Department of Statistics / Paul I. Nelson

The principle of least squares applied to regression models estimates parameters by minimizing the mean of squared residuals. Least squares estimators are optimal under normality but can perform poorly in the presence of outliers. This well-known lack of robustness motivated the development of alternatives, such as least median of squares estimators, obtained by minimizing the median of squared residuals. This report uses simulation to compare the robustness, in terms of bias and mean squared error, of least median of squares and least squares estimators of the slope of a regression line through the origin under a variety of conditions in which outliers are generated by mixtures of normal and heavy-tailed distributions. It is found that least median of squares estimation performs almost as well as least squares estimation under normality and can be much better in the presence of outliers.
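To make the two criteria concrete, here is a minimal Python sketch, not the report's actual simulation design, comparing the least squares and least median of squares slope estimators for a line through the origin under a contaminated-normal error mixture. The sample size, contamination rate, outlier scale, slope grid, and replication count are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def ls_slope(x, y):
        # Closed-form least squares slope for regression through the origin:
        # minimizes the mean (equivalently the sum) of squared residuals.
        return np.sum(x * y) / np.sum(x * x)

    def lms_slope(x, y, grid=None):
        # Least median of squares: minimize the median of squared residuals.
        # The objective is non-smooth, so a coarse grid search is used here,
        # which is adequate for a single parameter in an illustration.
        if grid is None:
            grid = np.linspace(-5.0, 5.0, 1001)
        med = [np.median((y - b * x) ** 2) for b in grid]
        return grid[int(np.argmin(med))]

    def simulate(beta=1.0, n=50, eps=0.10, sigma_out=10.0, reps=500):
        # Errors: N(0, 1) with probability 1 - eps, N(0, sigma_out^2) with
        # probability eps, i.e. a normal mixture that plants outliers.
        ls_est, lms_est = [], []
        for _ in range(reps):
            x = rng.uniform(1.0, 10.0, n)
            outlier = rng.random(n) < eps
            e = np.where(outlier, rng.normal(0.0, sigma_out, n),
                         rng.normal(0.0, 1.0, n))
            y = beta * x + e
            ls_est.append(ls_slope(x, y))
            lms_est.append(lms_slope(x, y))
        for name, est in (("LS ", np.array(ls_est)), ("LMS", np.array(lms_est))):
            print(f"{name}: bias = {est.mean() - beta:+.4f}, "
                  f"MSE = {np.mean((est - beta) ** 2):.4f}")

    simulate()

Setting eps = 0 recovers the pure-normal case, where least squares should show the smaller mean squared error; increasing eps or sigma_out illustrates the breakdown of least squares that motivates the report.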
Identifier | oai:union.ndltd.org:KSU/oai:krex.k-state.edu:2097/7045 |
Date | January 1900 |
Creators | Paranagama, Thilanka Dilruwani |
Publisher | Kansas State University |
Source Sets | K-State Research Exchange |
Language | en_US |
Detected Language | English |
Type | Report |