
Assumptions for multiple regression analysis include:

- The sample is representative of the population for which inference or prediction is intended.
- The error is a random variable with a mean of zero conditional on the explanatory variables.
- The independent variables are measured without error. (If this is not so, modeling may instead be done using errors-in-variables techniques.)
- The independent variables (predictors) are linearly independent, i.e. no predictor can be expressed as a linear combination of the others.
- The errors are uncorrelated; that is, the variance-covariance matrix of the errors is diagonal, with each diagonal element equal to the variance of the corresponding error.
- The variance of the error is constant across observations (homoscedasticity). If not, weighted least squares or other methods may be used instead.
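Two of these assumptions can be checked directly on fitted data. The following sketch, using hypothetical simulated data (the variable names and true coefficients are illustrative assumptions, not from the text), fits an ordinary least squares model with NumPy, verifies that the design matrix has full column rank (linear independence of predictors), and confirms that the residuals of a fit with an intercept average to approximately zero:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical data: two linearly independent predictors plus noise
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

# Design matrix with an intercept column
X = np.column_stack([np.ones(n), x1, x2])

# Linear independence of predictors: full column rank required
assert np.linalg.matrix_rank(X) == X.shape[1]

# Ordinary least squares fit
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Zero-mean errors: residuals of an OLS fit with an intercept
# average to (numerically) zero
residuals = y - X @ beta
print(beta)              # coefficient estimates
print(residuals.mean())  # near zero
```

If the rank check fails, one predictor is a linear combination of the others and the coefficients are not uniquely identified.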
# Partial vs. Multiple Correlation
1. In multiple correlation, three or more variables are studied simultaneously. In partial correlation we also recognize three or more variables, but correlate only two of them while holding the others constant.
2. In partial correlation the statistic is denoted by r, whereas in multiple correlation the statistic is denoted by R.
3. The coefficients of partial correlation (first order) are r12.3, r13.2 and r23.1, while the coefficients of multiple correlation are R1.23, R2.13 and R3.12.
4. A partial correlation is a type of Pearson correlation coefficient and can range in value from -1 to +1; the multiple correlation coefficient, on the other hand, ranges from 0 to 1.
5. If the partial correlation approaches 0, the inference is that there is no direct causal link between the two original variables; when R approaches 0, this indicates that the regression plane poorly predicts the dependent variable.
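Both coefficients can be computed from the three pairwise correlations. As a minimal sketch (the pairwise values r12, r13, r23 below are hypothetical), the standard formulas for the first-order partial correlation r12.3 and the multiple correlation R1.23 are:

```python
from math import sqrt

def partial_corr(r12, r13, r23):
    """First-order partial correlation r12.3: correlation of X1 and X2
    with the effect of X3 held constant."""
    return (r12 - r13 * r23) / sqrt((1 - r13**2) * (1 - r23**2))

def multiple_corr(r12, r13, r23):
    """Multiple correlation R1.23: correlation of X1 with the best
    linear combination of X2 and X3; always between 0 and 1."""
    r_squared = (r12**2 + r13**2 - 2 * r12 * r13 * r23) / (1 - r23**2)
    return sqrt(r_squared)

# Hypothetical pairwise correlations among X1, X2, X3
r12, r13, r23 = 0.8, 0.6, 0.5
print(partial_corr(r12, r13, r23))   # r12.3, can be negative
print(multiple_corr(r12, r13, r23))  # R1.23, in [0, 1]
```

Note that R1.23 is never smaller than the absolute value of either of its constituent simple correlations r12 and r13, which is why it is bounded below by 0 rather than -1.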