Consider a time series model that is stationary apart from a single shift in mean. If the time of the level shift is known, the least squares estimator of its magnitude is a minimum variance unbiased estimator; if the time is unknown, however, this estimator is biased. Here, we first carry out extensive simulation studies to determine how the bias depends on three parameters of our time series model: the true magnitude of the level shift, the true time point, and the autocorrelation of adjacent observations. Thereafter, we use two generalized additive models to generalize the simulation results. Finally, we examine to what extent the bias can be reduced by multiplying the least squares estimator by a shrinkage factor. Our results show that the bias of the estimated magnitude of the level shift can be reduced when the level shift does not occur close to the beginning or end of the time series. However, the bias could not be reduced simultaneously for all possible time points and magnitudes of the level shift.
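The bias described above can be illustrated with a minimal Monte Carlo sketch. This is not the thesis's actual simulation design; all parameter values (series length `n=100`, change point `tau=50`, shift magnitude `delta=0.5`, AR(1) coefficient `phi=0.5`) and helper names are illustrative assumptions. It compares the least squares estimate of the shift magnitude when the change point is known (difference of segment means at the true time) with the estimate obtained when the change point must first be located by minimizing the residual sum of squares:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=100, tau=50, delta=0.5, phi=0.5):
    # Stationary-apart-from-one-shift model: AR(1) noise plus a level
    # shift of magnitude delta starting at time tau (illustrative values).
    e = np.zeros(n)
    eps = rng.standard_normal(n)
    for t in range(1, n):
        e[t] = phi * e[t - 1] + eps[t]
    y = e.copy()
    y[tau:] += delta
    return y

def ls_magnitude(y, tau):
    # Least squares estimate of the shift magnitude for a GIVEN change point:
    # the difference between the post-shift and pre-shift segment means.
    return y[tau:].mean() - y[:tau].mean()

def ls_magnitude_unknown(y, trim=5):
    # When the change point is unknown, search candidate times and keep
    # the one minimizing the total within-segment sum of squares.
    n = len(y)
    best_tau, best_sse = trim, np.inf
    for tau in range(trim, n - trim):
        m1, m2 = y[:tau].mean(), y[tau:].mean()
        sse = ((y[:tau] - m1) ** 2).sum() + ((y[tau:] - m2) ** 2).sum()
        if sse < best_sse:
            best_sse, best_tau = sse, tau
    return ls_magnitude(y, best_tau)

reps, true_delta = 2000, 0.5
known = np.empty(reps)
unknown = np.empty(reps)
for i in range(reps):
    y = simulate(delta=true_delta)
    known[i] = ls_magnitude(y, 50)          # change point known
    unknown[i] = ls_magnitude_unknown(y)    # change point estimated

print(f"bias, change point known:   {known.mean() - true_delta:+.3f}")
print(f"bias, change point unknown: {unknown.mean() - true_delta:+.3f}")
```

With the change point known, the estimator is (up to Monte Carlo error) unbiased; with the change point estimated, the search step inflates the estimated magnitude, which is the selection bias the abstract refers to.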
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-84618 |
Date | January 2012 |
Creators | Liu, Wenjie |
Publisher | Linköpings universitet, Statistik, Linköpings universitet, Tekniska högskolan |
Source Sets | DiVA Archive at Upsalla University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |