
QUANTITATIVE METHODS

CORRELATION AND REGRESSION


Sample covariance = Cov(X,Y) = Σ (Xi - X̄)(Yi - Ȳ) / (n - 1), summed over i = 1 to n

where:
n = sample size
Xi = ith observation of Variable X
X̄ = mean of the observations of Variable X
Yi = ith observation of Variable Y
Ȳ = mean of the observations of Variable Y

Sample correlation coefficient = r = Cov(X,Y) / (sX sY)

Sample variance = sX² = Σ (Xi - X̄)² / (n - 1), summed over i = 1 to n

Sample standard deviation = sX = √(sX²)
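
A minimal Python sketch of these sample statistics (numpy assumed available; the X and Y values are hypothetical, for illustration only):

```python
import numpy as np

# Hypothetical paired observations of X and Y (illustration only)
X = np.array([1.2, 2.4, 3.1, 4.8, 5.5, 6.9])
Y = np.array([2.0, 2.9, 4.2, 5.1, 6.8, 7.4])

n = len(X)
cov_xy = np.sum((X - X.mean()) * (Y - Y.mean())) / (n - 1)   # sample covariance
var_x  = np.sum((X - X.mean()) ** 2) / (n - 1)               # sample variance of X
s_x    = np.sqrt(var_x)                                      # sample standard deviation of X
s_y    = np.sqrt(np.sum((Y - Y.mean()) ** 2) / (n - 1))
r      = cov_xy / (s_x * s_y)                                # sample correlation coefficient

print(cov_xy, r)   # matches np.cov(X, Y)[0, 1] and np.corrcoef(X, Y)[0, 1]
```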

Test statistic

Test-stat = t = r√(n - 2) / √(1 - r²), with n - 2 degrees of freedom

where:
n = Number of observations
r = Sample correlation
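
A short sketch of this significance test, assuming scipy is available for the critical t-value; the r and n inputs are hypothetical:

```python
import numpy as np
from scipy import stats

# Hypothetical sample correlation and sample size (illustration only)
r, n = 0.45, 30

t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)   # test statistic with n - 2 df
t_crit = stats.t.ppf(1 - 0.05 / 2, df=n - 2)         # two-tailed critical value at 5%

print(t_stat, t_crit, abs(t_stat) > t_crit)          # True -> reject H0: population correlation = 0
```
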
Linear Regression with One Independent Variable
Regression model equation: Yi = b0 + b1Xi + ei, i = 1, . . . , n

b1 and b0 are the regression coefficients.


b1 is the slope coefficient.
b0 is the intercept term.
e is the error term that represents the variation in the dependent variable that
is not explained by the independent variable.


Regression line equation: Ŷi = b̂0 + b̂1Xi, i = 1, . . . , n


Regression Residuals

The estimated regression line minimizes the sum of the squared residuals:

Σ [Yi - (b̂0 + b̂1Xi)]², summed over i = 1 to n

where:
Yi = Actual value of the dependent variable
b̂0 + b̂1Xi = Predicted value of the dependent variable
The Standard Error of Estimate

SEE = [ Σ (Yi - b̂0 - b̂1Xi)² / (n - 2) ]^(1/2) = [ Σ êi² / (n - 2) ]^(1/2) = [ SSE / (n - 2) ]^(1/2)

(summations over i = 1 to n)
The Coefficient of Determination

Total variation = Unexplained variation + Explained variation

R² = Explained variation / Total variation
   = (Total variation - Unexplained variation) / Total variation
   = 1 - Unexplained variation / Total variation
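
A small sketch of this variation decomposition, assuming numpy; the actual and predicted values are hypothetical:

```python
import numpy as np

# Hypothetical actual and predicted values from a fitted regression (illustration only)
Y     = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
Y_hat = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])

total_variation       = np.sum((Y - Y.mean()) ** 2)     # SST
unexplained_variation = np.sum((Y - Y_hat) ** 2)        # SSE
explained_variation   = total_variation - unexplained_variation

r_squared = 1 - unexplained_variation / total_variation
print(r_squared)
```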

Hypothesis Tests on Regression Coefficients

CAPM: RABC = RF + βABC(RM - RF)
RABC - RF = α + βABC(RM - RF) + ε

The intercept term for the regression, b0, is α.
The slope coefficient for the regression, b1, is βABC.

The regression sum of squares (RSS)

RSS = Σ (Ŷi - Ȳ)², summed over i = 1 to n  →  Explained variation

The sum of squared errors or residuals (SSE)

SSE = Σ (Yi - Ŷi)², summed over i = 1 to n  →  Unexplained variation


ANOVA Table

Source of Variation        Degrees of Freedom    Sum of Squares
Regression (explained)     k                     RSS
Error (unexplained)        n - (k + 1)           SSE
Total                      n - 1                 SST

k = the number of slope coefficients in the regression.
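
A sketch that builds the ANOVA quantities for a one-variable regression from hypothetical data (numpy assumed); for an OLS fit with an intercept, RSS + SSE equals SST:

```python
import numpy as np

# Hypothetical data and a one-variable regression fit (illustration only)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
Y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1, 7.8, 9.2])
n, k = len(X), 1                                   # k = number of slope coefficients

b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()
Y_hat = b0 + b1 * X

RSS = np.sum((Y_hat - Y.mean()) ** 2)              # explained, df = k
SSE = np.sum((Y - Y_hat) ** 2)                     # unexplained, df = n - (k + 1)
SST = np.sum((Y - Y.mean()) ** 2)                  # total, df = n - 1

print(RSS, SSE, SST, RSS + SSE)                    # RSS + SSE ~ SST for an OLS fit
```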

Prediction Intervals

sf² = s² [ 1 + 1/n + (X - X̄)² / ((n - 1) sx²) ]

where sf² is the estimated variance of the prediction error and s² is the squared standard error of estimate (SEE²).

Prediction interval for Y: Ŷ ± tc sf
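
A sketch of the prediction interval around a forecast Ŷ for a hypothetical new observation x_new, assuming numpy and scipy:

```python
import numpy as np
from scipy import stats

# Hypothetical fitted simple regression and a new value of X (illustration only)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
Y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1, 7.8, 9.2])
n = len(X)

b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()
s2 = np.sum((Y - (b0 + b1 * X)) ** 2) / (n - 2)       # squared standard error of estimate

x_new = 9.0
y_hat = b0 + b1 * x_new
s_x2  = np.sum((X - X.mean()) ** 2) / (n - 1)         # sample variance of X
sf    = np.sqrt(s2 * (1 + 1 / n + (x_new - X.mean()) ** 2 / ((n - 1) * s_x2)))

tc = stats.t.ppf(1 - 0.05 / 2, df=n - 2)              # 95% two-tailed critical value
print(y_hat - tc * sf, y_hat + tc * sf)               # prediction interval for Y
```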

Mean Sum of Squares

MSR = RSS / k
For a regression with one independent variable: MSR = RSS / 1 = RSS

MSE = SSE / [n - (k + 1)]
For a regression with one independent variable: MSE = SSE / (n - 2)


MULTIPLE REGRESSION AND ISSUES IN REGRESSION ANALYSIS


Multiple regression equation

Yi = b0 + b1X1i + b2X2i + . . . + bkXki + ei, i = 1, 2, . . . , n

where:
Yi = the ith observation of the dependent variable Y
Xji = the ith observation of the independent variable Xj, j = 1, 2, . . . , k
b0 = the intercept of the equation
b1, . . . , bk = the slope coefficients for each of the independent variables
ei = the error term for the ith observation
n = the number of observations

Residual Term

êi = Yi - Ŷi = Yi - (b̂0 + b̂1X1i + b̂2X2i + . . . + b̂kXki)
Confidence Intervals

b̂j ± (tc × sb̂j)
i.e., estimated regression coefficient ± (critical t-value)(coefficient standard error)
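
A minimal sketch of a coefficient confidence interval, with a hypothetical estimate and standard error and scipy used for the critical t-value:

```python
from scipy import stats

# Hypothetical estimate, standard error, and sample dimensions (illustration only)
b_hat, se_b = 0.76, 0.22        # estimated slope and its standard error
n, k = 60, 3                    # observations and slope coefficients

tc = stats.t.ppf(1 - 0.05 / 2, df=n - k - 1)    # critical t, df = n - (k + 1)
print(b_hat - tc * se_b, b_hat + tc * se_b)     # 95% confidence interval for bj
```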

F-statistic

F-stat = MSR / MSE = (RSS / k) / (SSE / [n - (k + 1)])
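
A sketch of the F-test for a multiple regression, using simulated (hypothetical) data and a plain numpy least-squares fit rather than a statistics library:

```python
import numpy as np
from scipy import stats

# Hypothetical multiple regression with k = 2 independent variables (illustration only)
rng = np.random.default_rng(0)
n, k = 40, 2
X = rng.normal(size=(n, k))
Y = 1.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n)

X_design = np.column_stack([np.ones(n), X])              # add intercept column
b_hat, *_ = np.linalg.lstsq(X_design, Y, rcond=None)     # OLS coefficients
Y_hat = X_design @ b_hat

RSS = np.sum((Y_hat - Y.mean()) ** 2)
SSE = np.sum((Y - Y_hat) ** 2)
F_stat = (RSS / k) / (SSE / (n - (k + 1)))               # MSR / MSE
F_crit = stats.f.ppf(0.95, dfn=k, dfd=n - (k + 1))

print(F_stat, F_crit, F_stat > F_crit)   # reject H0 (all slopes = 0) if F_stat > F_crit
```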

R² and Adjusted R²

R² = (Total variation - Unexplained variation) / Total variation = (SST - SSE) / SST = RSS / SST

Adjusted R² = R̄² = 1 - [(n - 1) / (n - k - 1)] × (1 - R²)
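
A short illustration of the adjusted R² formula with hypothetical values:

```python
# Hypothetical R-squared, sample size, and number of slope coefficients (illustration only)
r2, n, k = 0.65, 60, 4

adj_r2 = 1 - ((n - 1) / (n - k - 1)) * (1 - r2)
print(adj_r2)        # always <= r2, and can fall when a weak variable is added
```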

Testing for Heteroskedasticity - The Breusch-Pagan (BP) Test

χ² = nR², with k degrees of freedom

where:
n = Number of observations
R² = Coefficient of determination of the second regression (the regression in which the squared
residuals of the original regression are regressed on the independent variables)
k = Number of independent variables
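
A sketch of the BP test on simulated (hypothetical) data: fit the original regression, regress the squared residuals on the independent variables, and compare nR² with the chi-square critical value. Only numpy and scipy are assumed:

```python
import numpy as np
from scipy import stats

# Hypothetical original regression with k = 2 independent variables (illustration only)
rng = np.random.default_rng(1)
n, k = 100, 2
X = rng.normal(size=(n, k))
Y = 0.5 + 1.2 * X[:, 0] + 0.7 * X[:, 1] + rng.normal(size=n) * (1 + 0.5 * np.abs(X[:, 0]))

X_design = np.column_stack([np.ones(n), X])
b_hat, *_ = np.linalg.lstsq(X_design, Y, rcond=None)
resid_sq = (Y - X_design @ b_hat) ** 2                       # squared residuals

# Second regression: squared residuals on the independent variables
g_hat, *_ = np.linalg.lstsq(X_design, resid_sq, rcond=None)
fitted = X_design @ g_hat
r2_second = 1 - np.sum((resid_sq - fitted) ** 2) / np.sum((resid_sq - resid_sq.mean()) ** 2)

bp_stat = n * r2_second                                      # chi-square with k df
chi2_crit = stats.chi2.ppf(0.95, df=k)
print(bp_stat, chi2_crit, bp_stat > chi2_crit)               # True -> conditional heteroskedasticity
```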


Testing for Serial Correlation - The Durbin-Watson (DW) Test

DW ≈ 2(1 - r), where r is the sample correlation between the regression residuals from one period and
those from the previous period.
Value of the Durbin-Watson Statistic (H0: No serial correlation)

0 to dl:           Reject H0, conclude positive serial correlation
dl to du:          Inconclusive
du to 4 - du:      Do not reject H0
4 - du to 4 - dl:  Inconclusive
4 - dl to 4:       Reject H0, conclude negative serial correlation
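
A sketch computing the DW statistic from hypothetical residuals and applying the decision regions above; the dl and du values shown are placeholders, since the real critical values come from a Durbin-Watson table for the given n and k:

```python
import numpy as np

# Hypothetical regression residuals (illustration only)
rng = np.random.default_rng(2)
resid = rng.normal(size=80)

dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)    # Durbin-Watson statistic
r1 = np.corrcoef(resid[1:], resid[:-1])[0, 1]            # lag-1 residual correlation
print(dw, 2 * (1 - r1))                                  # DW is approximately 2(1 - r)

dl, du = 1.59, 1.69                                      # placeholder critical values
if dw < dl:
    print("Reject H0: positive serial correlation")
elif dw > 4 - dl:
    print("Reject H0: negative serial correlation")
elif dl <= dw <= du or 4 - du <= dw <= 4 - dl:
    print("Inconclusive")
else:
    print("Do not reject H0")
```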

Problems in Linear Regression and Solutions

Problem: Heteroskedasticity
Effect: Incorrect standard errors
Solution: Use robust standard errors (corrected for conditional heteroskedasticity)

Problem: Serial correlation
Effect: Incorrect standard errors (additional problems arise if a lagged value of the dependent
variable is used as an independent variable)
Solution: Use robust standard errors (corrected for serial correlation)

Problem: Multicollinearity
Effect: High R² and low t-statistics
Solution: Remove one or more independent variables; often there is no solution based in theory

Model Specification Errors

Yi = b0 + b1 ln X1i + b2X2i + ei


TIME-SERIES ANALYSIS
Linear Trend Models
yt = b0 + b1t + et,

t = 1, 2, . . . , T

where:
yt = the value of the time series at time t (value of the dependent variable)
b0 = the y-intercept term
b1 = the slope coefficient/ trend coefficient
t = time, the independent or explanatory variable
et = a random-error term
Log-Linear Trend Models
A series that grows exponentially can be described using the following equation:
yt = e^(b0 + b1t)
where:
yt = the value of the time series at time t (value of the dependent variable)
b0 = the y-intercept term
b1 = the slope coefficient
t = time = 1, 2, 3, . . . , T
We take the natural logarithm of both sides of the equation to arrive at the equation for the
log-linear model:
ln yt = b0 + b1t + et,  t = 1, 2, . . . , T
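
A sketch fitting both trend forms to a hypothetical exponentially growing series with numpy's polyfit; the log-linear forecast is converted back to levels with the exponential function:

```python
import numpy as np

# Hypothetical time series that grows roughly exponentially (illustration only)
T = 40
t = np.arange(1, T + 1)
rng = np.random.default_rng(3)
y = np.exp(0.5 + 0.03 * t + rng.normal(scale=0.02, size=T))

b1_lin, b0_lin = np.polyfit(t, y, 1)           # linear trend: y_t = b0 + b1*t + e_t
b1_log, b0_log = np.polyfit(t, np.log(y), 1)   # log-linear trend: ln(y_t) = b0 + b1*t + e_t

y_hat_next_linear = b0_lin + b1_lin * (T + 1)
y_hat_next_loglin = np.exp(b0_log + b1_log * (T + 1))
print(y_hat_next_linear, y_hat_next_loglin)
```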

AUTOREGRESSIVE (AR) TIME-SERIES MODELS

A first-order autoregressive model, AR(1), is represented as:
xt = b0 + b1xt-1 + et

A pth-order autoregressive model, AR(p), is represented as:
xt = b0 + b1xt-1 + b2xt-2 + . . . + bpxt-p + et
Detecting Serially Correlated Errors in an AR Model

t-stat = Residual autocorrelation for lag / Standard error of residual autocorrelation

where:
Standard error of residual autocorrelation = 1/√T
T = Number of observations in the time series
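
A sketch that fits an AR(1) model to a simulated (hypothetical) series by OLS and tests the lag-1 residual autocorrelation against a standard error of 1/√T:

```python
import numpy as np

# Hypothetical time series for an AR(1) fit (illustration only)
rng = np.random.default_rng(4)
T = 200
x = np.zeros(T)
for i in range(1, T):
    x[i] = 1.0 + 0.6 * x[i - 1] + rng.normal()

# Fit x_t = b0 + b1*x_{t-1} + e_t by OLS
x_lag, x_cur = x[:-1], x[1:]
b1, b0 = np.polyfit(x_lag, x_cur, 1)
resid = x_cur - (b0 + b1 * x_lag)

# t-test on the lag-1 residual autocorrelation; standard error = 1/sqrt(T)
n_resid = len(resid)
rho1 = np.corrcoef(resid[1:], resid[:-1])[0, 1]
t_stat = rho1 / (1 / np.sqrt(n_resid))
print(b0, b1, t_stat)   # a large |t_stat| would suggest the AR(1) model is misspecified
```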


Mean Reversion

Mean-reverting level: xt = b0 / (1 - b1)
Multiperiod Forecasts and the Chain Rule of Forecasting

x̂t+1 = b̂0 + b̂1xt
x̂t+2 = b̂0 + b̂1x̂t+1
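
A small sketch of the mean-reverting level and the chain rule of forecasting, using hypothetical AR(1) estimates:

```python
# Hypothetical AR(1) estimates and the latest observation (illustration only)
b0_hat, b1_hat, x_t = 1.0, 0.6, 3.2

mean_reverting_level = b0_hat / (1 - b1_hat)   # level the series tends toward when |b1| < 1

x_t1 = b0_hat + b1_hat * x_t                   # one-period-ahead forecast
x_t2 = b0_hat + b1_hat * x_t1                  # two-period-ahead forecast (chain rule)
print(mean_reverting_level, x_t1, x_t2)
```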
Random Walks

xt = xt-1 + et, E(et) = 0, E(et²) = σ², E(et es) = 0 if t ≠ s

The first difference of the random walk equation is given as:
yt = xt - xt-1 = xt-1 + et - xt-1 = et, E(et) = 0, E(et²) = σ², E(et es) = 0 for t ≠ s

Random Walk with a Drift

xt = b0 + b1xt-1 + et
b1 = 1, b0 ≠ 0, or
xt = b0 + xt-1 + et, E(et) = 0

The first difference of the random walk with a drift equation is given as:
yt = xt - xt-1, yt = b0 + et, b0 ≠ 0
The Unit Root Test of Nonstationarity

xt = b0 + b1xt-1 + et
xt - xt-1 = b0 + b1xt-1 - xt-1 + et
xt - xt-1 = b0 + (b1 - 1)xt-1 + et
xt - xt-1 = b0 + g1xt-1 + et  (g1 = b1 - 1)


Autoregressive Moving Average (ARMA) Models

xt = b0 + b1xt-1 + . . . + bpxt-p + et + θ1et-1 + . . . + θqet-q
E(et) = 0, E(et²) = σ², E(et es) = 0 for t ≠ s

Autoregressive Conditional Heteroskedasticity Models (ARCH Models)

ARCH(1): ε̂t² = a0 + a1ε̂t-1² + ut

The variance of the error in period t + 1 can then be predicted using the following formula:
σ̂t+1² = â0 + â1ε̂t²
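
A sketch of an ARCH(1) check on hypothetical residuals: regress the squared residuals on their lagged values, test the significance of a1, and use the fitted equation to predict next-period error variance. Only numpy and scipy are assumed:

```python
import numpy as np
from scipy import stats

# Hypothetical regression residuals to test for ARCH(1) effects (illustration only)
rng = np.random.default_rng(5)
resid = rng.normal(size=250) * np.concatenate([np.ones(125), 2 * np.ones(125)])

e2_cur, e2_lag = resid[1:] ** 2, resid[:-1] ** 2
a1, a0 = np.polyfit(e2_lag, e2_cur, 1)              # fit e_t^2 = a0 + a1*e_{t-1}^2 + u_t

# Simple significance check on a1 via the regression t-statistic
fitted = a0 + a1 * e2_lag
s2 = np.sum((e2_cur - fitted) ** 2) / (len(e2_cur) - 2)
se_a1 = np.sqrt(s2 / np.sum((e2_lag - e2_lag.mean()) ** 2))
t_a1 = a1 / se_a1
print(t_a1, abs(t_a1) > stats.t.ppf(0.975, df=len(e2_cur) - 2))   # True -> ARCH effects

# Predicted error variance for the next period
sigma2_next = a0 + a1 * resid[-1] ** 2
print(sigma2_next)
```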
