
Additivity of component regression equations when the underlying model is linear

This thesis is concerned with the theory of fitting models of the
form y = Xβ + ε, where some distributional assumptions are made on ε.
More specifically, suppose that y_j = Zβ_j + ε_j is a model for component
j (j = 1, 2, ..., k) and that one is interested in estimation and inference theory relating to y_T = Σ_{j=1}^{k} y_j = Xβ_T + ε_T.
The theory of estimation and inference relating to the fitting of y_T is considered within the framework of general linear model theory. The consequences of independence and of dependence of the y_j (j = 1, 2, ..., k) for estimation and inference are investigated. It is shown that, under the assumption of independence of the y_j, the estimate of the parameter vector of the total equation can be obtained simply by adding corresponding components of the estimates of the parameters of the component models. Under dependence, however, this additivity property seems to break down. Inference theory under dependence is much less tractable than under independence
and depends critically, of course, upon whether y_T is normal or not.
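The additivity of estimates described above can be illustrated numerically. The sketch below (names and simulated data are illustrative, and it assumes all component models share one design matrix) fits each component model by ordinary least squares and compares the sum of the component estimates with the estimate obtained by fitting the summed response directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Common design matrix X (n observations, p predictors) assumed shared
# by all k component models y_j = X b_j + e_j.
n, p, k = 50, 3, 4
X = rng.normal(size=(n, p))

# Simulate k component responses, each with its own coefficient vector.
betas = [rng.normal(size=p) for _ in range(k)]
ys = [X @ b + 0.1 * rng.normal(size=n) for b in betas]

def ols(X, y):
    """Ordinary least-squares estimate of b in y = Xb + e."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Fit each component model separately, then fit the total model.
beta_hats = [ols(X, y) for y in ys]
beta_total_hat = ols(X, sum(ys))

# The OLS estimator is linear in y, so the total-model estimate equals
# the sum of the component estimates (up to floating-point error).
print(np.allclose(beta_total_hat, sum(beta_hats)))  # True
```

Note that this numerical agreement is an algebraic property of the least-squares estimator; as the abstract indicates, it is the inference theory (standard errors, tests) that hinges on whether the y_j are independent.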
Finally, the theory of additivity is extended to the classificatory models encountered in designed experiments. It is shown, however, that additivity does not hold in general for nonlinear models. In those cases where additivity does hold, estimation and inference generally require no new computing subroutines.

Forestry, Faculty of / Graduate
Date: January 1983
Creators: Chiyenda, Simeon Sandaramu
Publisher: University of British Columbia
Source Sets: University of British Columbia
Detected Language: English
Type: Text, Thesis/Dissertation
Rights: For non-commercial purposes only, such as research, private study and education. Additional conditions apply; see Terms of Use.
