
THE THREE-VARIABLE MODEL: NOTATION AND ASSUMPTIONS


Generalizing the two-variable population regression function (PRF), we may write the three-variable PRF as

Yi = β1 + β2X2i + β3X3i + ui

where Y is the dependent variable, X2 and X3 the explanatory variables (or regressors), u the stochastic disturbance term, and i the ith observation; in case the data are time series, the subscript t will denote the tth observation. In this equation β1 is the intercept term. As usual, it gives the mean or average effect on Y of all the variables excluded from the model, although its mechanical interpretation is the average value of Y when X2 and X3 are set equal to zero. The coefficients β2 and β3 are called the partial regression coefficients, and their meaning will be explained shortly.

Continuing to operate within the framework of the classical linear regression model (CLRM), we assume the following:

• Zero mean value of ui, or E(ui | X2i, X3i) = 0 for each i
• No serial correlation, or cov(ui, uj) = 0 for i ≠ j
• Homoscedasticity, or var(ui) = σ²
• Zero covariance between ui and each X variable, or cov(ui, X2i) = cov(ui, X3i) = 0
• No specification bias, or the model is correctly specified
• No exact collinearity between the X variables, or no exact linear relationship between X2 and X3
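To make the setup concrete, here is a minimal simulation sketch of data generated by a three-variable PRF satisfying these assumptions. All parameter values (β1 = 4, β2 = 1.5, β3 = −0.8, σ = 2) are hypothetical choices for illustration, not values from the text:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 500

    # Hypothetical "true" PRF parameters: Yi = b1 + b2*X2i + b3*X3i + ui
    b1_true, b2_true, b3_true = 4.0, 1.5, -0.8
    sigma = 2.0  # constant disturbance standard deviation (homoscedasticity)

    # Regressors drawn independently, so there is no exact collinearity
    X2 = rng.uniform(0, 10, size=n)
    X3 = rng.uniform(0, 10, size=n)

    # Disturbances: zero mean, constant variance, uncorrelated across observations
    u = rng.normal(0.0, sigma, size=n)

    Y = b1_true + b2_true * X2 + b3_true * X3 + u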
INTERPRETATION OF MULTIPLE REGRESSION EQUATION

Given the assumptions of the classical regression model, it follows that, on taking the conditional expectation of Y on both sides, we obtain

E(Yi | X2i, X3i) = β1 + β2X2i + β3X3i

In words, this equation gives the conditional mean or expected value of Y conditional upon the given or fixed values of X2 and X3. Therefore, as in the two-variable case, multiple regression analysis is regression analysis conditional upon the fixed values of the regressors, and what we obtain is the average or mean value of Y, or the mean response of Y, for the given values of the regressors.

THE MEANING OF PARTIAL REGRESSION COEFFICIENTS

As mentioned earlier, the regression coefficients β2 and β3 are known as partial regression or partial slope coefficients. The meaning of a partial regression coefficient is as follows: β2 measures the change in the mean value of Y, E(Y), per unit change in X2, holding the value of X3 constant. Put differently, it gives the "direct" or "net" effect of a unit change in X2 on the mean value of Y, net of any effect that X3 may have on mean Y. Likewise, β3 measures the change in the mean value of Y per unit change in X3, holding the value of X2 constant. That is, it gives the "direct" or "net" effect of a unit change in X3 on the mean value of Y, net of any effect that X2 may have on mean Y.
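The "holding X3 constant" interpretation can be checked numerically. The sketch below continues the simulated data above and uses the Frisch-Waugh-Lovell partialling-out idea (a standard result, not something stated in this text): the coefficient on X2 from the full regression equals the slope from regressing Y on the part of X2 that has been purged of X3.

    # Full regression: the coefficient on X2 is the partial effect of X2,
    # holding X3 constant
    X = np.column_stack([np.ones(n), X2, X3])
    b_full, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # Partial out X3: regress X2 on X3, keep the residuals (the part of X2
    # uncorrelated with X3), then regress Y on those residuals alone
    Z = np.column_stack([np.ones(n), X3])
    x2_purged = X2 - Z @ np.linalg.lstsq(Z, X2, rcond=None)[0]
    b2_partial = (x2_purged @ Y) / (x2_purged @ x2_purged)

    print(b_full[1], b2_partial)  # the two estimates of beta2 coincide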
OLS AND ML ESTIMATION OF THE PARTIAL REGRESSION COEFFICIENTS

OLS Estimators

To find the OLS estimators, let us first write the sample regression function (SRF) corresponding to the PRF as follows:

Yi = β̂1 + β̂2X2i + β̂3X3i + ûi

where ûi is the residual term, the sample counterpart of the stochastic disturbance term ui. As noted in Chapter 3, the OLS procedure consists in choosing the values of the unknown parameters so that the residual sum of squares (RSS), Σûi², is as small as possible. Symbolically, we minimize

Σûi² = Σ(Yi − β̂1 − β̂2X2i − β̂3X3i)²

where the expression for the RSS is obtained by simple algebraic manipulations. The most straightforward procedure to obtain the estimators that will minimize the RSS is to differentiate it with respect to the unknowns, set the resulting expressions to zero, and solve them simultaneously. As shown in Appendix 7A, Section 7A.1, this procedure gives the following normal equations:

Ȳ = β̂1 + β̂2X̄2 + β̂3X̄3
ΣYiX2i = β̂1ΣX2i + β̂2ΣX2i² + β̂3ΣX2iX3i
ΣYiX3i = β̂1ΣX3i + β̂2ΣX2iX3i + β̂3ΣX3i²
Solving these normal equations simultaneously gives (with lowercase letters denoting deviations from the sample means, e.g., x2i = X2i − X̄2):

β̂2 = [(Σyix2i)(Σx3i²) − (Σyix3i)(Σx2ix3i)] / [(Σx2i²)(Σx3i²) − (Σx2ix3i)²]
β̂3 = [(Σyix3i)(Σx2i²) − (Σyix2i)(Σx2ix3i)] / [(Σx2i²)(Σx3i²) − (Σx2ix3i)²]
β̂1 = Ȳ − β̂2X̄2 − β̂3X̄3

Note that the common denominator can also be written as (Σx2i²)(Σx3i²)(1 − r23²), where r23 is the sample coefficient of correlation between X2 and X3 as defined in Chapter 3.
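A short numerical sketch (reusing the simulated data above) that computes the estimators from these deviation-form formulas and cross-checks them against a linear-algebra solver:

    # Deviations from sample means (the lowercase letters in the formulas)
    y = Y - Y.mean()
    x2 = X2 - X2.mean()
    x3 = X3 - X3.mean()

    S22, S33, S23 = x2 @ x2, x3 @ x3, x2 @ x3
    den = S22 * S33 - S23**2

    b2 = ((y @ x2) * S33 - (y @ x3) * S23) / den
    b3 = ((y @ x3) * S22 - (y @ x2) * S23) / den
    b1 = Y.mean() - b2 * X2.mean() - b3 * X3.mean()

    # Cross-check: solve the least-squares problem directly
    b_ls, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), X2, X3]), Y, rcond=None)
    print((b1, b2, b3), b_ls)  # the two sets of estimates agree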

Variances and Standard Errors of OLS Estimators

Having obtained the OLS estimators of the partial regression coefficients, we can derive the variances and standard errors of these estimators in the manner indicated in Appendix 3A.3. As in the two-variable case, we need the standard errors for two main purposes: to establish confidence intervals and to test statistical hypotheses. For the slope coefficients the relevant formulas are as follows:

var(β̂2) = σ² / [Σx2i²(1 − r23²)]

or, equivalently,

var(β̂2) = [Σx3i² / ((Σx2i²)(Σx3i²) − (Σx2ix3i)²)] σ²

with se(β̂2) = +√var(β̂2); symmetrically,

var(β̂3) = σ² / [Σx3i²(1 − r23²)]    and    se(β̂3) = +√var(β̂3)

In practice the unknown σ² is replaced by its unbiased estimator σ̂² = Σûi² / (n − 3), since three parameters are estimated.
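Continuing the numerical sketch, the standard errors follow directly from these formulas:

    # Residuals and the unbiased estimate of sigma^2 (three parameters estimated)
    u_hat = Y - (b1 + b2 * X2 + b3 * X3)
    sigma2_hat = (u_hat @ u_hat) / (n - 3)

    r23 = S23 / np.sqrt(S22 * S33)  # sample correlation between X2 and X3
    se_b2 = np.sqrt(sigma2_hat / (S22 * (1 - r23**2)))
    se_b3 = np.sqrt(sigma2_hat / (S33 * (1 - r23**2)))
    print(se_b2, se_b3)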
Properties of OLS Estimators

• The properties of OLS estimators of the multiple regression model parallel those of the two-variable model. Specifically (a numerical check of several of these properties appears after this list):
• The three-variable regression line (surface) passes through the means Ȳ, X̄2, and X̄3, which is evident from the intercept formula β̂1 = Ȳ − β̂2X̄2 − β̂3X̄3 [the counterpart of the two-variable result]. This property holds generally. Thus in the k-variable linear regression model [a regressand and (k − 1) regressors] Yi = β1 + β2X2i + β3X3i + ··· + βkXki + ui, we have β̂1 = Ȳ − β̂2X̄2 − β̂3X̄3 − ··· − β̂kX̄k.
• The mean value of the estimated Yi (= Ŷi) is equal to the mean value of the actual Yi, which is easy to prove: Ŷi = β̂1 + β̂2X2i + β̂3X3i = (Ȳ − β̂2X̄2 − β̂3X̄3) + β̂2X2i + β̂3X3i (Why?) = Ȳ + β̂2(X2i − X̄2) + β̂3(X3i − X̄3) = Ȳ + β̂2x2i + β̂3x3i, where as usual small letters indicate values of the variables as deviations from their respective means. Summing both sides over the sample values and dividing through by the sample size n shows that the mean of the Ŷi equals Ȳ. (Note: Σx2i = Σx3i = 0. Why?) Notice that by virtue of this result we can write ŷi = β̂2x2i + β̂3x3i, where ŷi = (Ŷi − Ȳ). Therefore, the SRF can be expressed in deviation form as yi = ŷi + ûi = β̂2x2i + β̂3x3i + ûi.
• The residuals sum to zero, Σûi = 0, and hence so does their mean, which can be verified from the deviation-form SRF. [Hint: Sum both sides of yi = β̂2x2i + β̂3x3i + ûi over the sample values.]
• The residuals ûi are uncorrelated with X2i and X3i; that is, ΣûiX2i = ΣûiX3i = 0 (see Appendix 7A.1 for proof).
• The residuals ûi are uncorrelated with Ŷi; that is, ΣûiŶi = 0. Why? [Hint: Multiply both sides of ŷi = β̂2x2i + β̂3x3i by ûi and sum over the sample values.]
• From the variance formulas it is evident that as r23, the correlation coefficient between X2 and X3, increases toward 1, the variances of β̂2 and β̂3 increase for given values of σ² and Σx2i² or Σx3i². In the limit, when r23 = 1 (i.e., perfect collinearity), these variances become infinite. The implications of this will be explored fully in Chapter 10, but intuitively the reader can see that as r23 increases it becomes increasingly difficult to know what the true values of β2 and β3 are. (More on this in the next chapter.)
• It is also clear from the variance formulas that for given values of r23 and Σx2i² (or Σx3i²), the variances of the OLS estimators are directly proportional to σ²; that is, they increase as σ² increases. Similarly, for given values of σ² and r23, the variance of β̂2 is inversely proportional to Σx2i²; that is, the greater the variation in the sample values of X2, the smaller the variance of β̂2, and therefore β2 can be estimated more precisely. A similar statement can be made about the variance of β̂3.
• Given the assumptions of the classical linear regression model, which are spelled out in the first section, one can prove that the OLS estimators of the partial regression coefficients not only are linear and unbiased but also have minimum variance in the class of all linear unbiased estimators. In short, they are BLUE; put differently, they satisfy the Gauss-Markov theorem. (The proof parallels the two-variable case proved in Appendix 3A, Section 3A.6, and will be presented more compactly using matrix notation in Appendix C.)
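As promised above, a brief numerical check of several of these properties on the simulated data (a sketch; the final loop simply illustrates how the factor 1/(1 − r23²) in the variance formulas explodes as r23 approaches 1):

    Y_hat = b1 + b2 * X2 + b3 * X3
    u_hat = Y - Y_hat

    print(np.isclose(Y_hat.mean(), Y.mean()))  # mean of fitted Y = mean of actual Y
    print(np.isclose(u_hat.sum(), 0.0))        # residuals sum to zero
    print(np.isclose(u_hat @ X2, 0.0),
          np.isclose(u_hat @ X3, 0.0))         # residuals uncorrelated with regressors
    print(np.isclose(u_hat @ Y_hat, 0.0))      # residuals uncorrelated with fitted values

    # Variance inflation as the regressors become collinear
    for r in (0.0, 0.9, 0.99, 0.999):
        print(r, 1.0 / (1.0 - r**2))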
