Tempering spatial autocorrelation in the residuals of linear and generalized models by incorporating selected eigenvectors
Cervantes, Juan (01 August 2018)
To account for spatial correlation in the residuals of regression models for areal and lattice data, different disciplines have developed distinct approaches. Bayesian spatial statistics has typically placed a Gaussian conditional autoregressive (CAR) prior on random effects, while geographers use Moran's I statistic as a measure of spatial autocorrelation and as the basis for spatial models. Recent work in both fields has recognized and built on a common feature of the two approaches, namely the implicit or explicit incorporation into the linear predictor of eigenvectors of a matrix representing the spatial neighborhood structure. Including appropriate choices of these vectors effectively reduces the spatial autocorrelation found in the residuals.
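For concreteness, Moran's I for a residual vector e and spatial weights matrix W is I = (n / S0) * (zᵀ W z) / (zᵀ z), with z the deviations of e from their mean and S0 the sum of all weights. The Python sketch below is an illustration of this formula under the assumption of a dense numpy weights matrix; the function name is ours and the code is not taken from the dissertation.

    import numpy as np

    def morans_i(resid, W):
        """Moran's I for a residual vector, given a dense spatial weights matrix W."""
        z = resid - resid.mean()      # deviations from the mean
        s0 = W.sum()                  # sum of all spatial weights
        n = resid.size
        return (n / s0) * (z @ W @ z) / (z @ z)

Values near zero indicate little remaining spatial autocorrelation in the residuals, which is the target of the eigenvector-based methods discussed below.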
We begin with extensive simulation studies to compare Bayesian CAR models, Restricted Spatial Regression (RSR), Bayesian Spatial Filtering (BSF), and Eigenvector Spatial Filtering (ESF) with respect to estimation of fixed-effect coefficients, prediction, and reduction of residual spatial autocorrelation. The latter three models incorporate the neighborhood structure of the data through the eigenvectors of a Moran operator.
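As a rough illustration of where these eigenvectors come from: the Moran operator is commonly constructed as P W P, where P = I - X(XᵀX)⁻¹Xᵀ projects onto the orthogonal complement of the column space of the fixed-effects design matrix X, and its leading eigenvectors capture patterns of strong positive spatial autocorrelation. The sketch below is a generic construction under that definition, not code from the dissertation; the function name and the choice to keep the k leading eigenvectors are illustrative.

    import numpy as np

    def moran_eigenvectors(X, W, k):
        """Leading eigenvectors of the Moran operator P W P for design matrix X
        and spatial weights matrix W, ordered by decreasing eigenvalue."""
        n = X.shape[0]
        # Projection onto the orthogonal complement of the column space of X
        P = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
        M = P @ W @ P
        M = (M + M.T) / 2                       # symmetrize for numerical stability
        eigvals, eigvecs = np.linalg.eigh(M)    # eigenvalues in ascending order
        order = np.argsort(eigvals)[::-1]       # reorder to descending
        return eigvecs[:, order[:k]], eigvals[order[:k]]

In spatial-filtering approaches the selected columns typically enter the linear predictor as additional covariates, which is what absorbs the residual autocorrelation.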
We propose an alternative selection algorithm over all candidate predictors that avoids the ad hoc approach of RSR and selects on both model fit and reduction of autocorrelation in the residuals. The algorithm depends on the marginal posterior density of a quantity that measures the proportion of the total variance explained by measurement error. It selects candidate predictors that yield a high posterior probability that this quantity is large, in addition to having large marginal posterior inclusion probabilities (PIPs) reflecting model fit. Two methods are constructed: the first orthogonalizes all of the candidate predictors, while the second can be applied to the design matrix of candidate predictors without orthogonalization.
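A hypothetical sketch of such a two-criterion screen is shown below. It assumes posterior draws of the inclusion indicators and of the variance-proportion quantity are available from the sampler; the thresholds, argument names, and array layout are illustrative assumptions rather than the dissertation's implementation.

    import numpy as np

    def select_predictors(inclusion_draws, proportion_draws,
                          pip_cut=0.5, prop_cut=0.8, prob_cut=0.9):
        """Screen candidate predictors on two criteria:
        (1) marginal posterior inclusion probability above pip_cut, and
        (2) posterior probability that the variance proportion exceeds prop_cut
            is itself above prob_cut.

        inclusion_draws  : (n_draws, n_candidates) array of 0/1 inclusion indicators
        proportion_draws : (n_draws, n_candidates) array of draws of the variance
                           proportion associated with each candidate
        """
        pip = inclusion_draws.mean(axis=0)                     # marginal PIPs
        prob_large = (proportion_draws > prop_cut).mean(axis=0)
        keep = (pip > pip_cut) & (prob_large > prob_cut)
        return np.flatnonzero(keep)    # indices of selected candidate predictors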
We applied our algorithm to the same simulated data used to compare the RSR, BSF, and ESF models. Although it performs similarly to the established methods, the first of our selection methods shows an improvement in execution time. In addition, our approach is a statistically sound, fully Bayesian method.