
Small Area Estimation in a Survey of Governments

Small area composite estimators are weighted averages that attempt to balance the variability of the direct survey estimator against the bias of the synthetic estimator. Direct and synthetic estimators have competing properties, and finding an optimal weighted average can be challenging.

One example of a survey that uses small area estimation is the Annual Survey of Public Employment & Payroll (ASPEP), conducted by the U.S. Census Bureau to collect data on the number and pay of federal, state, and local government civilian employees. Estimates of local government totals are calculated for domains created by crossing state and government function. To calculate estimates at such a detailed level, the Census Bureau uses small area methods that take advantage of auxiliary information from the most recent Census of Governments (CoG). During ASPEP's 2009 sample design, a composite estimator was used, and it was observed that the direct estimator has the desirable property of being greater than the corresponding raw sum of the data, whereas the synthetic estimator has the desirable property of being close to the most recent CoG total.

In this research, the design-based properties of various estimators and quantities in the composite methodology are studied via a large Monte Carlo simulation using CoG data. New estimators are constructed based on the underlying ideas of limited translation and James-Stein shrinkage. The simulation provides estimates of the design-based variance and mean squared error of every estimator under consideration, and improved domain-level composite weights are calculated. Based on the simulation results, several limitations of the composite methodology are identified.

Explicit area-level models are developed that capture the spirit of the composite methodology and address its limitations in a unified and generalizable way. The models consist of hierarchical Bayesian extensions of the Fay-Herriot model and are characterized by novel combinations of components allowing for correlated sampling errors, multilevel structure, and t-distributed errors. Estimated variances and covariances from the Monte Carlo simulation are incorporated to take ASPEP's complex sample design into account. Posterior predictive checks and cross-validated posterior predictive checks based on selective discrepancy measures are used to help assess model fit.

It is observed that the t-normal models, which have t-distributed sampling errors, protect against unreasonable direct estimates and provide over-shrinkage towards the regression synthetic estimates. Also, the proportion of model estimates less than the corresponding raw sums is close to optimal. These empirical findings motivate a theoretical study of the shrinkage provided by the t-normal model. Another simulation is conducted to compare the shrinkage properties of this model and the Fay-Herriot model.

The methods in this research apply not just to ASPEP but also to other surveys of governments, surveys of business establishments, and surveys of agriculture, which are similar in terms of sample design and the availability of auxiliary data from a quinquennial census. Ideas for future research include investigating alternative data transformations and discrepancy measures and developing hierarchical Bayesian models for time series and count data.
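The two estimator combinations described above can be sketched in a few lines of code. This is an illustrative sketch with hypothetical numbers, not code or data from the dissertation: the composite estimator is a weighted average of the direct and synthetic estimators, with the MSE-minimizing weight (under the simplifying assumption of uncorrelated errors) giving more weight to whichever estimator has the smaller mean squared error, and the standard Fay-Herriot shrinkage factor plays the analogous role in the area-level model.

```python
def optimal_weight(mse_direct: float, mse_synthetic: float) -> float:
    """MSE-minimizing weight on the direct estimator in a composite
    estimate, assuming the two estimators' errors are uncorrelated."""
    return mse_synthetic / (mse_direct + mse_synthetic)

def composite(direct: float, synthetic: float, w: float) -> float:
    """Composite estimate: weighted average of direct and synthetic."""
    return w * direct + (1.0 - w) * synthetic

def fh_shrinkage(sigma2_v: float, d_i: float) -> float:
    """Standard Fay-Herriot shrinkage factor gamma_i for domain i:
    gamma_i = sigma_v^2 / (sigma_v^2 + D_i), where sigma_v^2 is the
    model variance and D_i the sampling variance of the direct estimate.
    The model estimate is gamma_i * direct + (1 - gamma_i) * synthetic."""
    return sigma2_v / (sigma2_v + d_i)

if __name__ == "__main__":
    # Hypothetical domain: the direct estimate is unstable (high MSE is
    # dominated by variance); the synthetic estimate is stable but biased.
    w = optimal_weight(mse_direct=4.0, mse_synthetic=12.0)
    print(w)                                             # 0.75
    print(composite(direct=100.0, synthetic=80.0, w=w))  # 95.0
    print(fh_shrinkage(sigma2_v=9.0, d_i=3.0))           # 0.75
```

With these hypothetical inputs, three quarters of the weight goes to the direct estimator because the synthetic estimator's MSE is three times larger; the dissertation's point is that estimating such weights well at the domain level is itself difficult, which motivates the model-based alternatives.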

Identifier: oai:union.ndltd.org:PROQUEST/oai:pqdtoai.proquest.com:10075469
Date: 01 April 2016
Creators: Dumbacher, Brian Arthur
Publisher: The George Washington University
Source Sets: ProQuest.com
Language: English
Detected Language: English
Type: thesis
