1. Convergent Validity of Variables Residualized by a Single Covariate: The Role of Correlated Error in Populations and Samples. Nimon, Kim.
This study examined the bias and precision of four residualized variable validity estimates (C0, C1, C2, C3) across a number of study conditions. Validity estimates that considered measurement error, correlations among error scores, and correlations between error scores and true scores (C3) performed best, yielding no estimates that were practically significantly different from their respective population parameters across study conditions. Validity estimates that considered measurement error and correlations among error scores (C2) did a good job of yielding unbiased, valid, and precise results; only in a select number of study conditions could C2 estimates not be computed, or did they produce results with sufficient variance to affect the interpretation of results. Validity estimates based on observed scores (C0) fared well in producing valid, precise, and unbiased results. Validity estimates based on observed scores corrected only for measurement error (C1) performed the worst. Not only did C1 fail to produce estimates reliably even when the level of modeled correlated error was low, it also produced values higher than the theoretical limit of 1.0 across a number of study conditions. Estimates based on C1 also differed practically significantly from their population parameters in the greatest number of conditions.
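To make the distinction concrete, here is a minimal Python sketch of the classic correction for attenuation, the kind of measurement-error-only adjustment that a C1-style estimate represents. The reliabilities and correlation are hypothetical and the function is not the study's actual estimator; it simply shows how ignoring correlated error can push a corrected validity coefficient past the theoretical limit of 1.0.

```python
import numpy as np

def corrected_validity(r_xy, rel_x, rel_y):
    """Classic correction for attenuation: the observed correlation divided by
    the square root of the product of the two reliabilities.

    This mirrors a C1-style estimate (measurement error only); because it
    ignores correlated error, it can exceed 1.0 when error scores are in
    fact correlated.
    """
    return r_xy / np.sqrt(rel_x * rel_y)

# Hypothetical values chosen for illustration only.
print(corrected_validity(r_xy=0.60, rel_x=0.70, rel_y=0.65))  # about 0.89
print(corrected_validity(r_xy=0.72, rel_x=0.70, rel_y=0.65))  # about 1.07, above 1.0
```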
2. Studies of a Quantum Scheduling Algorithm and on Quantum Error Correction. Lu, Feng. 01 January 2007.
Quantum computation has been a rich field of study for decades because it promises potentially spectacular advances, some of which may run counter to our classically rooted intuitions. At the same time, quantum computation is still in its infancy in both theoretical and practical areas: efficient quantum algorithms are very limited in number and scope, and no real breakthrough has yet been achieved in physical implementations. Grover's search algorithm can be applied to a wide range of problems; even problems not generally regarded as search problems can be reformulated to take advantage of quantum parallelism and entanglement, leading to algorithms that show a square-root speedup over their classical counterparts. This dissertation discusses a systematic way to formulate such problems and gives as an example a quantum scheduling algorithm for an R||C_max problem, showing that a quantum solution to such problems is not only feasible but in some cases advantageous. The complexity of the error correction circuitry forces us to design quantum error correction codes capable of correcting only a single error per error correction cycle. Yet time-correlated errors are common in physical implementations of quantum systems: an error corrected during one cycle may recur in a later cycle due to physical processes specific to each implementation of the qubits. This dissertation also discusses quantum error correction for a restricted class of time-correlated errors in a spin-boson model. The proposed algorithm allows the correction of two errors per error correction cycle, provided that one of them is time-correlated. It can be applied to any stabilizer code, perfect or non-perfect, and simplifies the circuit complexity significantly compared to classic quantum error correction codes.
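As a concrete illustration of the square-root speedup, the sketch below simulates the textbook Grover iteration with a plain state vector. It is not the dissertation's R||C_max scheduling algorithm; the problem size and marked index are arbitrary choices made for the example.

```python
import numpy as np

def grover_search(n_qubits, marked, n_iters=None):
    """Simulate the standard Grover oracle + diffusion loop on a state vector.

    Runs roughly (pi/4) * sqrt(N) iterations, which is where the square-root
    speedup over classical exhaustive search comes from.
    """
    N = 2 ** n_qubits
    if n_iters is None:
        n_iters = int(round(np.pi / 4 * np.sqrt(N)))
    state = np.full(N, 1 / np.sqrt(N))       # uniform superposition over N states
    for _ in range(n_iters):
        state[marked] *= -1                  # oracle: flip the marked amplitude
        mean = state.mean()
        state = 2 * mean - state             # diffusion: inversion about the mean
    return np.abs(state) ** 2                # measurement probabilities

probs = grover_search(n_qubits=10, marked=123)  # N = 1024, about 25 iterations
print(probs[123])                               # close to 1.0
```

With N = 1024 the marked item is found with near-certainty after about 25 iterations, versus roughly 512 probes on average for a classical unstructured search.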
3. Neoclassical theory versus new economic geography: Competing explanations of cross-regional variation in economic development. Fingleton, Bernard; Fischer, Manfred M.
This paper uses data for 255 NUTS-2 European regions over the period 1995-2003 to test the relative explanatory performance of two important rival theories seeking to explain variations in the level of economic development across regions, namely the neoclassical model originating from the work of Solow (1956) and the so-called Wage Equation, which is one of a set of simultaneous equations consistent with the short-run equilibrium of new economic geography (NEG) theory, as described by Fujita, Krugman and Venables (1999). The rivals are non-nested, so testing is accomplished both by fitting the reduced-form models individually and by combining the two rivals in a composite model in an attempt to identify the dominant theory. We use different estimators for the resulting panel data model to account variously for interregional heterogeneity, endogeneity, and temporal and spatial dependence, including maximum likelihood with and without fixed effects, two-stage least squares, and feasible generalised spatial two-stage least squares plus GMM; most of these models also embody a spatial autoregressive error process. The results show that the estimated NEG model parameters correspond to theoretical expectations, whereas the parameter estimates derived from the neoclassical model reduced form are sometimes insignificant or take on counterintuitive signs. This casts doubt on the appropriateness of neoclassical theory as a basis for explaining cross-regional variation in economic development in Europe, whereas NEG theory seems to hold in the face of competition from its rival. (authors' abstract)
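As a stylized illustration of the composite-model strategy for non-nested rivals, the sketch below fits each theory's reduced form and then a combined specification. The file name and column names are hypothetical placeholders, the regressors are only stand-ins for the two reduced forms, and plain OLS substitutes for the panel, spatial, and instrumental-variable estimators actually used in the paper.

```python
# Hypothetical NUTS-2 panel with placeholder column names; plain OLS is used
# here purely for illustration of the composite-model idea.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("regions_panel.csv")  # hypothetical regional panel, 1995-2003

solow = smf.ols("log_gva_per_worker ~ log_investment_rate + log_pop_growth", data=df).fit()
neg = smf.ols("log_gva_per_worker ~ log_market_potential", data=df).fit()
composite = smf.ols(
    "log_gva_per_worker ~ log_investment_rate + log_pop_growth + log_market_potential",
    data=df,
).fit()

# In the composite fit, insignificant or wrongly signed coefficients for one
# rival's regressors, alongside well-behaved coefficients for the other's,
# point to the dominant theory.
print(composite.summary())
```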