Step 1
The first predictor variable is selected in the same way as in forward selection. If
the probability associated with the test of significance is less than or equal to the
default .05, the predictor variable with the largest correlation with the criterion
variable enters the equation first.
Step 2
The second variable is selected based on the highest partial correlation. If it can
pass the entry requirement (PIN=.05), it also enters the equation.
Step 3
From this point, stepwise selection differs from forward selection: the variables
already in the equation are examined for removal according to the removal
criterion (POUT=.10) as in backward elimination.
Step 4
Variables not in the equation are examined for entry. Variable selection ends
when no remaining variable meets the entry criterion and no entered variable
meets the removal criterion.
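The four steps above can be sketched in Python. This is a minimal illustration, not SPSS's exact algorithm (which also applies tolerance checks); `ols_pvalues` and `stepwise` are names I've made up, and numpy/scipy are assumed to be available:

```python
import numpy as np
from scipy import stats

def ols_pvalues(X, y):
    """Fit OLS with an intercept; return two-sided p-values for the slopes."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    df = n - Xd.shape[1]
    mse = resid @ resid / df
    se = np.sqrt(mse * np.diag(np.linalg.inv(Xd.T @ Xd)))
    t = beta / se
    return (2 * stats.t.sf(np.abs(t), df))[1:]   # drop the intercept's p-value

def stepwise(X, y, pin=0.05, pout=0.10):
    """Stepwise selection with entry criterion PIN and removal criterion POUT."""
    selected, remaining = [], list(range(X.shape[1]))
    while True:
        changed = False
        # Entry: the candidate with the smallest p-value enters if p <= PIN
        entry_p = {j: ols_pvalues(X[:, selected + [j]], y)[-1] for j in remaining}
        if entry_p:
            best = min(entry_p, key=entry_p.get)
            if entry_p[best] <= pin:
                selected.append(best); remaining.remove(best); changed = True
        # Removal: an entered variable whose p-value now exceeds POUT is dropped
        if selected:
            p = ols_pvalues(X[:, selected], y)
            worst = int(np.argmax(p))
            if p[worst] > pout:
                remaining.append(selected.pop(worst)); changed = True
        if not changed:        # no entry and no removal: selection ends
            return selected
```

Because PIN < POUT, a variable cannot be removed in the same pass in which it entered, which is what keeps the procedure from cycling.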
In a hierarchical multiple regression, the researcher decides not only how many
predictors to enter but also the order in which they enter. Usually, the order of entry is
based on logical or theoretical considerations.
In a stepwise multiple regression analysis, the number of predictors to be selected and the
order of entry are both decided by statistical criteria (e.g., entry or removal criterion).
First, select Y as the Dependent variable. Second, move the three predictor variables to
the Independent(s) list. Third, click on the down arrow and select the Stepwise Method.
Last, click on Statistics. Choose R squared change, Descriptives, Part and partial
correlations. Click Continue. Click OK.
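For readers without SPSS, roughly the same numbers can be produced in Python with statsmodels (assumed installed). The data frame here is synthetic, not the book's data set:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative data only (not the data behind the printout below)
rng = np.random.default_rng(1)
df = pd.DataFrame(rng.standard_normal((30, 3)), columns=["X1", "X2", "X3"])
df["Y"] = 2 * df.X1 + df.X2 + rng.standard_normal(30)

model1 = smf.ols("Y ~ X1", data=df).fit()        # step-1 model
model2 = smf.ols("Y ~ X1 + X2", data=df).fit()   # step-2 model
print(round(model1.rsquared, 3))                 # R square with one predictor
print(round(model2.rsquared - model1.rsquared, 3))  # R square change
```

The `rsquared` difference printed last is the "R squared change" statistic requested in the dialog above.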
SPSS Printout
Which predictor variable has the largest correlation with the criterion variable Y?
Answer:
X1 has the largest correlation with the criterion variable Y, r = .858, p = .001.
Large correlations between the predictor variables can substantially affect the
results of multiple regression analysis. Note that the correlation between X2 and
X3 equals .804.
X1 has the largest correlation with the criterion variable, p < .05. The predictor variable
X1 is the first predictor to be entered into the regression equation.
• Model Summary
What is the proportion of the variation in the criterion variable Y explained by the
regression model with one predictor X1?
Answer:
About 74% of the variation in the criterion variable Y can be explained by the regression
model with one predictor X1.
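With a single predictor, R square is simply the squared correlation between that predictor and Y, so the 74% figure follows directly from r = .858:

```python
r = 0.858                   # correlation between X1 and Y, from the printout
r_squared = r ** 2          # with one predictor, R square = r squared
print(round(r_squared, 3))  # -> 0.736, i.e. about 74%
```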
• Coefficients
Answer:
The t value is 4.729. The observed significance level associated with X1 is .001.
The regression coefficient associated with X1 is significantly different from zero.
Examine the absolute values of the partial correlations for variables not in the equation.
Model 1: This is the first step of the stepwise regression in which only one predictor, X1,
is used to predict Y. Recall that X1 has the largest correlation with the criterion variable.
The two predictor variables, X2 and X3, are excluded from model 1.
Beta In: These are the standardized regression coefficients each excluded predictor
would receive if it were added to the regression equation. The beta value associated
with X2 is larger (.535), indicating that X2 would make the greater contribution of the
two excluded predictors.
Partial Correlation:
The partial correlation between X2 and Y is .849 after the effect of X1 was removed from
both X2 and Y. The observed significance level associated with X2 is .004, which passes
the entry requirement (p < .05).
The partial correlation between X3 and Y is .452 after the effect of X1 was removed from
both X3 and Y. The observed significance level associated with X3 is .222, which does
not pass the entry requirement (p > .05).
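What "partial correlation" means here can be sketched in a few lines of numpy: regress each variable on X1, keep the residuals, and correlate the residuals. The helper name `partial_corr` is mine:

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z from both."""
    def residuals(v, z):
        zd = np.column_stack([np.ones(len(z)), z])
        beta, *_ = np.linalg.lstsq(zd, v, rcond=None)
        return v - zd @ beta
    return float(np.corrcoef(residuals(x, z), residuals(y, z))[0, 1])
```

Applied to the book's data, `partial_corr(X2, Y, X1)` would reproduce the .849 reported above.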
Decision
The predictor variable X2 has the largest partial correlation. The observed significance
level associated with X2 is .004, which passes the entry requirement (p < .05). The
second predictor variable to be entered into the equation will be X2.
Model 2: Regression Equation with Two Predictor Variables
What is the proportion of the variation in the criterion variable Y explained by the
regression model with two predictors X1 and X2?
About 93% of the variation in the criterion variable Y can be explained by the regression
model with two predictors, X1 and X2.
The adjusted R square, corrected for the number of predictors, equals .905. The
difference between the obtained and adjusted R square is small in our case.
R square may be overestimated when the data set has few cases (n) relative
to the number of predictors (k). Adjusted R square can be computed as

Adjusted R square = 1 - (1 - R square)(n - 1)/(n - k - 1)
Data sets with few cases relative to number of predictors will have a
greater difference between the obtained and adjusted R square.
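Plugging in the values implied by the printout (F(2, 7) = 44.073 implies n = 10 and k = 2, and R square is about .926) reproduces the adjusted figure:

```python
r2, n, k = 0.926, 10, 2       # values implied by the printout: F(2, 7) => n = 10
adjusted = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(adjusted, 3))     # -> 0.905, matching the Model Summary
```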
• Test of Significance
Is the regression model with two predictors (X1 and X2) significantly related to the
criterion variable Y?
The regression model with two predictors (X1 and X2) is significantly related to the
criterion variable Y, F(2, 7) = 44.073, p < .01.
X1 and X2 account for about 93% of the variance in the criterion variable Y, and this
finding is statistically significant.
(1) About 74% of the variation in the criterion variable Y can be explained by the
regression model with one predictor X1.
(2) About 93% of the variation in the criterion variable Y can be explained by the
regression model with two predictors, X1 and X2.
(3) An additional 19% of the variance in the criterion variable Y is contributed by X2.
Coefficients
Note that the value of B weight associated with each predictor is influenced by all other
predictors in the regression equation.
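A small numpy illustration of that point, using synthetic data rather than the book's: when predictors are correlated, the B weight of X1 shifts noticeably once X2 joins the equation.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.standard_normal(200)
x2 = x1 + 0.5 * rng.standard_normal(200)   # x2 is correlated with x1
y = x1 + x2 + 0.1 * rng.standard_normal(200)

def slope_of_x1(predictors, y):
    """B weight of the first predictor in an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y))] + list(predictors))
    return np.linalg.lstsq(Xd, y, rcond=None)[0][1]

simple = slope_of_x1([x1], y)        # x1 alone: absorbs x2's effect (near 2)
multiple = slope_of_x1([x1, x2], y)  # with x2 present: near its true value, 1
print(round(simple, 2), round(multiple, 2))
```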
• Part Correlation
The predictor X1 entered the regression equation first and the predictor X2 entered the
regression equation next. Recall that an additional 19% of the variance in the criterion
variable Y is contributed by X2. The signed square root of the R square change is called
the semi-partial correlation or the part correlation. The semi-partial or part correlation
between X2 and Y after removing the effect of X1 from X2 is .436.
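That .436 is just the signed square root of the .19 R square change reported when X2 entered:

```python
r2_change = 0.19            # R square change when X2 entered, from the printout
part_corr = r2_change ** 0.5
print(round(part_corr, 3))  # -> 0.436, the part (semi-partial) correlation for X2
```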
• Test the regression coefficients of the entered variables. Both are significantly
different from zero.
This is the second step of the stepwise regression in which two predictors, X1 and X2,
are used to predict Y, and the predictor variable X3 is excluded from model 2. Note that
the observed significance level associated with X3 is .158, which is too large for
entry (p > .05).
Decision: X3 will not be included. The best regression equation will be the equation that
contains two predictor variables, X1 and X2.
Reason: Since predictor variables X2 and X3 are highly correlated (r = .804), X3 adds
relatively little in prediction when X2 is in the regression equation.
Reading
Discussion