Alok Srivastava
Robinson College of Business,
Georgia State University
Module 3
Regression
[Figure: fitted regression line with the dependent variable (y) on the vertical axis and the independent variable (x) on the horizontal axis, showing the slope B1 = Δy/Δx, the y-intercept b0, and an observed point y.]
The regression function makes a prediction for each observed data point. The observation is denoted by y and the prediction by ŷ.
Simple Linear Regression
[Figure: regression plot showing the prediction error as the vertical distance between an observation y and its prediction ŷ on the fitted line.]
y = ŷ + e
Actual = Explained + Error
Simple Linear Regression
ŷ = b0 + b1x

where ŷ is the predicted value of the dependent variable for a given value of the independent variable x.
A least squares regression selects the line with the lowest total sum
of squared prediction errors.
This value is called the Sum of Squares of Error, or SSE.
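The least squares fit and SSE can be sketched numerically. The following is an illustrative example with made-up data (not from the slides), using the closed-form formulas for the slope and intercept:

```python
# Illustrative sketch: fitting a simple linear regression by least
# squares and computing the Sum of Squares of Error (SSE).
# The data values are made up for demonstration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Closed-form least squares estimates: b1 = slope, b0 = intercept
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
     sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x

# Predictions and the total of squared prediction errors (SSE)
y_hat = [b0 + b1 * xi for xi in x]
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
```

By construction, no other line through these points yields a smaller SSE than the one defined by b0 and b1.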
Calculating SSR
Mathematically,

R² = SSR / SST, where SST = SSR + SSE
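The decomposition of total variation into SSR and SSE can be checked directly. This is an illustrative sketch with made-up observations and fitted values (not from the slides):

```python
# Illustrative sketch: decomposing total variation (SST) into the
# explained part (SSR) and the error part (SSE), then computing R².
# Observations and fitted values are made up for demonstration.
y     = [2.1, 3.9, 6.2, 8.1, 9.8]        # actual values
y_hat = [2.10, 4.06, 6.02, 7.98, 9.94]   # fitted values from a regression

mean_y = sum(y) / len(y)

sst = sum((yi - mean_y) ** 2 for yi in y)               # total sum of squares
ssr = sum((yh - mean_y) ** 2 for yh in y_hat)           # regression (explained)
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))   # error

r_squared = ssr / sst   # equals 1 - sse / sst, since SST = SSR + SSE
```

An R² near 1 means the regression explains almost all of the variation in y; an R² near 0 means it explains almost none.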
Standard Error = √( SSE / (n − k) )
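As a small numerical sketch, assuming the slides' n − k denominator counts k as the number of estimated parameters (intercept plus slopes; some texts instead write n − k − 1 with k slopes only):

```python
import math

# Illustrative sketch of the standard error of the regression,
# sqrt(SSE / (n - k)). Values are made up for demonstration;
# k = 2 assumes a simple regression (intercept and one slope).
sse = 0.092   # sum of squared errors from a fitted model
n = 5         # number of observations
k = 2         # estimated parameters

standard_error = math.sqrt(sse / (n - k))
```

A smaller standard error means the observations sit closer to the fitted regression line.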
y = A + B1X1 + B2X2 + … + BkXk + e
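The multiple-regression model can be estimated the same way as the simple case, by least squares. This is an illustrative sketch using NumPy's least-squares solver with made-up data for two independent variables (not from the slides):

```python
import numpy as np

# Illustrative sketch: estimating y = A + B1*X1 + B2*X2 + e
# by least squares. Data values are made up for demonstration.
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y  = np.array([5.2, 5.9, 10.1, 10.8, 14.2])

# Design matrix: a column of ones for the intercept A,
# then one column per independent variable.
D = np.column_stack([np.ones_like(X1), X1, X2])
coeffs, _, rank, _ = np.linalg.lstsq(D, y, rcond=None)
intercept, b1, b2 = coeffs
```

Each slope coefficient measures the change in y for a one-unit change in its variable, holding the other variables constant.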