Empirical Bayes methods in time series analysis
Khoshgoftaar, Taghi M. January 1982
In the case of repetitive experiments of a similar type, where the parameters vary randomly from experiment to experiment, the empirical Bayes method often leads to estimators that have smaller mean squared errors than the classical estimators.
Suppose there is an unobservable random variable θ with distribution G(θ), usually called the prior distribution. The Bayes estimator of θ cannot be obtained in general unless G(θ) is known. In the empirical Bayes method G(θ) is not assumed known; instead, the sequence of estimates from past experiments is used to estimate θ.
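A minimal sketch of this idea, for the common normal-normal special case, is given below. The function name eb_shrinkage, the assumption that each past experiment's sampling variance is known, and the method-of-moments fit of the prior are illustrative choices, not necessarily the construction used in the dissertation.

```python
import numpy as np

def eb_shrinkage(current_est, current_var, past_ests, past_vars):
    """Parametric empirical Bayes shrinkage (normal-normal sketch).

    Each past estimate is treated as theta_i plus noise with known
    sampling variance; the prior G is taken to be N(mu, tau^2), with
    its moments estimated from the past estimates (an assumption for
    illustration only).
    """
    past_ests = np.asarray(past_ests, dtype=float)
    past_vars = np.asarray(past_vars, dtype=float)

    # Method-of-moments estimates of the prior mean and variance.
    mu_hat = past_ests.mean()
    tau2_hat = max(past_ests.var(ddof=1) - past_vars.mean(), 0.0)

    # Posterior-mean (Bayes) form with the estimated prior plugged in:
    # shrink the current estimate toward the estimated prior mean.
    weight = tau2_hat / (tau2_hat + current_var)
    return weight * current_est + (1.0 - weight) * mu_hat
```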
This dissertation develops empirical Bayes estimates of the parameters of various time series models: the autoregressive model, the moving average model, the mixed autoregressive-moving average model, regression with time series errors, regression with unobservable variables, serial correlation, multiple time series, and the spectral density function. In each case, empirical Bayes estimators are obtained using the asymptotic distributions of the usual estimators.
By Monte Carlo simulation, the empirical Bayes estimator of the first-order autoregressive parameter, ρ, was shown to have smaller mean squared error than the conditional maximum likelihood estimator when 11 past experiences were used; a sketch of such a comparison follows below. / Doctor of Philosophy
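The following is a rough Monte Carlo sketch of this kind of comparison, not the dissertation's actual simulation. It assumes an AR(1) model with standard normal errors, a uniform prior on ρ, the asymptotic variance (1 − ρ²)/T for the conditional maximum likelihood estimator, and hypothetical helper names (simulate_ar1, cmle_ar1, eb_ar1); the series length, prior range, and replication count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(rho, T, rng):
    """Generate an AR(1) series x_t = rho * x_{t-1} + e_t with N(0, 1) errors."""
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

def cmle_ar1(x):
    """Conditional MLE of rho: least-squares regression of x_t on x_{t-1}."""
    return np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

def eb_ar1(current_est, past_ests, T):
    """Shrink the current CMLE toward the mean of past CMLEs, using the
    asymptotic variance (1 - rho^2)/T of the CMLE as the sampling variance."""
    past_ests = np.asarray(past_ests, dtype=float)
    samp_var = max((1.0 - current_est**2) / T, 1e-8)
    prior_var = max(past_ests.var(ddof=1) - samp_var, 0.0)
    w = prior_var / (prior_var + samp_var)
    return w * current_est + (1.0 - w) * past_ests.mean()

T, n_past, n_rep = 50, 11, 2000
mse_cmle = mse_eb = 0.0
for _ in range(n_rep):
    # rho varies randomly from experiment to experiment (uniform prior assumed).
    rhos = rng.uniform(0.2, 0.8, size=n_past + 1)
    ests = [cmle_ar1(simulate_ar1(r, T, rng)) for r in rhos]
    rho_true, current = rhos[-1], ests[-1]          # the "current" experiment
    mse_cmle += (current - rho_true) ** 2
    mse_eb += (eb_ar1(current, ests[:-1], T) - rho_true) ** 2

print("MSE (CMLE):", mse_cmle / n_rep)
print("MSE (EB):  ", mse_eb / n_rep)
```

With more past experiences the estimated prior improves and the shrinkage weight becomes more reliable, which is consistent with the mean-squared-error advantage reported above.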