Sample Covariance

$$\mathrm{Cov}(X,Y) = \frac{\sum_{i=1}^{n}(X_i - \bar{X})(Y_i - \bar{Y})}{n-1}$$

where:
n = sample size
$X_i$ = ith observation of Variable X
$\bar{X}$ = mean observation of Variable X
$Y_i$ = ith observation of Variable Y
$\bar{Y}$ = mean observation of Variable Y

Sample Correlation Coefficient

$$r = \frac{\mathrm{Cov}(X,Y)}{s_X s_Y}$$

Sample variance: $s_X^2 = \dfrac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{n-1}$

Sample standard deviation: $s_X = \sqrt{s_X^2}$
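The covariance and correlation formulas above can be sketched in plain Python (no external libraries); the helper names are illustrative, not part of any standard API:

```python
# Minimal sketch of the sample covariance, standard deviation, and
# correlation formulas, with the n - 1 denominator used above.
def sample_covariance(xs, ys):
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    return sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / (n - 1)

def sample_std(xs):
    n = len(xs)
    x_bar = sum(xs) / n
    return (sum((x - x_bar) ** 2 for x in xs) / (n - 1)) ** 0.5

def sample_correlation(xs, ys):
    return sample_covariance(xs, ys) / (sample_std(xs) * sample_std(ys))

# A perfectly linear relationship gives r = 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
print(round(sample_correlation(xs, ys), 4))  # 1.0
```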
Test statistic

$$t = \frac{r\sqrt{n-2}}{\sqrt{1-r^2}}$$

Where:
n = Number of observations
r = Sample correlation
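A short sketch of the test statistic above; the critical value to compare against (from a t-table with n − 2 degrees of freedom) is supplied by the user, and the example numbers are hypothetical:

```python
# t-statistic for testing the significance of a sample correlation r
# computed from n observations (n - 2 degrees of freedom).
def correlation_t_stat(r, n):
    return r * ((n - 2) ** 0.5) / ((1 - r ** 2) ** 0.5)

# Example: r = 0.5 from n = 27 observations.
print(round(correlation_t_stat(0.5, 27), 3))  # 2.887
```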
Linear Regression with One Independent Variable
Regression model equation: $Y_i = b_0 + b_1 X_i + e_i,\quad i = 1, \dots, n$
QUANTITATIVE METHODS
The estimated regression coefficients minimize the sum of squared residuals:

$$\sum_{i=1}^{n}\left[Y_i - (\hat{b}_0 + \hat{b}_1 X_i)\right]^2$$

where:
$Y_i$ = Actual value of the dependent variable
$\hat{b}_0 + \hat{b}_1 X_i$ = Predicted value of the dependent variable
The Standard Error of Estimate

$$\mathrm{SEE} = \left[\frac{\sum_{i=1}^{n}\left(Y_i - \hat{b}_0 - \hat{b}_1 X_i\right)^2}{n-2}\right]^{1/2} = \left[\frac{\sum_{i=1}^{n}\hat{e}_i^{\,2}}{n-2}\right]^{1/2} = \left[\frac{\mathrm{SSE}}{n-2}\right]^{1/2}$$
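The standard error of estimate can be sketched directly from actual and fitted values; the function name and sample numbers are illustrative:

```python
# SEE = sqrt(SSE / (n - 2)), given actual values and fitted values
# from a regression with one independent variable.
def standard_error_of_estimate(actual, predicted):
    n = len(actual)
    sse = sum((y - y_hat) ** 2 for y, y_hat in zip(actual, predicted))
    return (sse / (n - 2)) ** 0.5

actual = [2.0, 4.1, 5.9, 8.0]
predicted = [2.0, 4.0, 6.0, 8.0]
print(round(standard_error_of_estimate(actual, predicted), 4))  # 0.1
```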
The Coefficient of Determination

$$R^2 = \frac{\text{Explained variation}}{\text{Total variation}} = 1 - \frac{\text{Unexplained variation}}{\text{Total variation}}$$

$$\mathrm{RSS} = \sum_{i=1}^{n}(\hat{Y}_i - \bar{Y})^2 \;\rightarrow\; \text{Explained variation}$$

$$\mathrm{SSE} = \sum_{i=1}^{n}(Y_i - \hat{Y}_i)^2 \;\rightarrow\; \text{Unexplained variation}$$
ANOVA Table

Source of Variation       Degrees of Freedom    Sum of Squares
Regression (explained)    k                     RSS
Error (unexplained)       n - (k + 1)           SSE
Total                     n - 1                 SST
Prediction Intervals

$$\hat{Y} \pm t_c s_f$$

$$s_f^2 = s^2\left[1 + \frac{1}{n} + \frac{(X - \bar{X})^2}{(n-1)\,s_x^2}\right]$$
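A minimal sketch of the prediction interval above; the inputs (squared standard error of estimate, critical t-value, forecast X, sample mean and variance of X) are supplied by the user, and the example numbers are hypothetical:

```python
# Prediction interval Y_hat +/- t_c * s_f, with
# s_f^2 = s^2 * [1 + 1/n + (x - x_bar)^2 / ((n - 1) * s2_x)].
def prediction_interval(y_hat, t_critical, s2, n, x, x_bar, s2_x):
    s2_f = s2 * (1 + 1 / n + (x - x_bar) ** 2 / ((n - 1) * s2_x))
    half_width = t_critical * s2_f ** 0.5
    return (y_hat - half_width, y_hat + half_width)

lo, hi = prediction_interval(y_hat=10.0, t_critical=2.0, s2=4.0,
                             n=25, x=5.0, x_bar=5.0, s2_x=2.0)
print(round(lo, 3), round(hi, 3))  # 5.921 14.079
```

Note that the interval widens as the forecast X moves away from the sample mean, through the $(X - \bar{X})^2$ term.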
Mean regression sum of squares:

$$\mathrm{MSR} = \frac{\mathrm{RSS}}{k}$$

For a regression with one independent variable (k = 1):

$$\mathrm{MSR} = \frac{\mathrm{RSS}}{1} = \mathrm{RSS}$$

Mean squared error:

$$\mathrm{MSE} = \frac{\mathrm{SSE}}{n-2}$$
Residual Term
$$\hat{e}_i = Y_i - \hat{Y}_i = Y_i - (\hat{b}_0 + \hat{b}_1 X_{1i} + \hat{b}_2 X_{2i} + \dots + \hat{b}_k X_{ki})$$
Confidence Intervals
$$\hat{b}_j \pm (t_c \times s_{\hat{b}_j})$$

estimated regression coefficient ± (critical t-value)(coefficient standard error)
F-statistic
$$F = \frac{\mathrm{MSR}}{\mathrm{MSE}} = \frac{\mathrm{RSS}/k}{\mathrm{SSE}/[n-(k+1)]}$$
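The F-statistic above follows directly from the ANOVA quantities; a small sketch with hypothetical numbers:

```python
# F = MSR / MSE = (RSS / k) / (SSE / (n - (k + 1))), where k is the
# number of independent variables and n the number of observations.
def f_statistic(rss, sse, n, k):
    msr = rss / k
    mse = sse / (n - (k + 1))
    return msr / mse

# Example: RSS = 80, SSE = 40, n = 22, k = 2.
print(f_statistic(80.0, 40.0, 22, 2))  # 19.0
```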
R2 and Adjusted R2
$$R^2 = \frac{\mathrm{SST} - \mathrm{SSE}}{\mathrm{SST}} = \frac{\mathrm{RSS}}{\mathrm{SST}}$$

$$\text{Adjusted } R^2 = \bar{R}^2 = 1 - \left(\frac{n-1}{n-k-1}\right)(1 - R^2)$$
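Both formulas above can be sketched in a few lines; the example values are hypothetical:

```python
# R^2 from the sums of squares, and adjusted R^2, which penalizes
# additional independent variables k.
def r_squared(sst, sse):
    return (sst - sse) / sst

def adjusted_r_squared(r2, n, k):
    return 1 - ((n - 1) / (n - k - 1)) * (1 - r2)

r2 = r_squared(sst=100.0, sse=25.0)                 # 0.75
print(round(adjusted_r_squared(r2, n=32, k=3), 4))  # 0.7232
```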
Durbin-Watson decision regions:

Value of DW statistic     Decision
0 to d_l                  Reject H0, conclude positive serial correlation
d_l to d_u                Inconclusive
d_u to 4 - d_u            Do not reject H0
4 - d_u to 4 - d_l        Inconclusive
4 - d_l to 4              Reject H0, conclude negative serial correlation
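The decision regions above map directly to a small lookup function; the critical values d_l and d_u come from a Durbin-Watson table and the example values are hypothetical:

```python
# Durbin-Watson decision rule given the lower (d_l) and upper (d_u)
# critical values; values near 2 indicate no serial correlation.
def dw_decision(dw, d_l, d_u):
    if dw < d_l:
        return "Reject H0: positive serial correlation"
    if dw < d_u:
        return "Inconclusive"
    if dw <= 4 - d_u:
        return "Do not reject H0"
    if dw <= 4 - d_l:
        return "Inconclusive"
    return "Reject H0: negative serial correlation"

print(dw_decision(2.0, d_l=1.24, d_u=1.43))  # Do not reject H0
```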
Problem               Effect                                     Solution
Heteroskedasticity    Incorrect standard errors                  Use White-corrected (robust) standard errors
Serial correlation    Incorrect standard errors                  Use Hansen-corrected standard errors
Multicollinearity     High R2 but insignificant t-statistics     Drop one of the correlated variables
TIME-SERIES ANALYSIS
Linear Trend Models
yt = b0 + b1t + et,
t = 1, 2, . . . , T
where:
yt = the value of the time series at time t (value of the dependent variable)
b0 = the y-intercept term
b1 = the slope coefficient/ trend coefficient
t = time, the independent or explanatory variable
et = a random-error term
Log-Linear Trend Models
A series that grows exponentially can be described using the following equation:
$$y_t = e^{b_0 + b_1 t}$$
where:
yt = the value of the time series at time t (value of the dependent variable)
b0 = the y-intercept term
b1 = the slope coefficient
t = time = 1, 2, 3 ... T
We take the natural logarithm of both sides of the equation to arrive at the equation for the log-linear model:

$$\ln y_t = b_0 + b_1 t + e_t,\quad t = 1, 2, \dots, T$$
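The log-linear model above can be fitted by ordinary least squares on ln(y_t) against t; a minimal sketch in plain Python, where the function name and the 5%-growth example are illustrative:

```python
import math

# Fit ln(y_t) = b0 + b1 * t by OLS; b1 is the continuously
# compounded growth rate per period.
def fit_log_linear_trend(ys):
    ts = list(range(1, len(ys) + 1))
    ln_ys = [math.log(y) for y in ys]
    n = len(ys)
    t_bar = sum(ts) / n
    l_bar = sum(ln_ys) / n
    b1 = (sum((t - t_bar) * (l - l_bar) for t, l in zip(ts, ln_ys))
          / sum((t - t_bar) ** 2 for t in ts))
    b0 = l_bar - b1 * t_bar
    return b0, b1

# A series growing exactly 5% per period recovers b1 = ln(1.05).
ys = [100 * 1.05 ** t for t in range(1, 9)]
b0, b1 = fit_log_linear_trend(ys)
print(round(b1, 5))  # 0.04879, i.e. ln(1.05)
```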
Standard error of residual autocorrelation = $1/\sqrt{T}$

where:
T = Number of observations in the time series
Mean Reversion
The mean-reverting level of an AR(1) time series $x_t = b_0 + b_1 x_{t-1} + e_t$ is:

$$x_t = \frac{b_0}{1 - b_1}$$
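A short sketch of the mean-reverting level above, with hypothetical coefficient values; note the level is undefined when b1 = 1 (a unit root):

```python
# Mean-reverting level b0 / (1 - b1) of an AR(1) series; at this level
# the expected value of the next observation equals the current value.
def mean_reverting_level(b0, b1):
    if b1 == 1:
        raise ValueError("b1 = 1: unit root, no mean-reverting level")
    return b0 / (1 - b1)

print(mean_reverting_level(1.0, 0.6))  # 2.5
```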