
Exam

Name___________________________________

MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. Be
sure to write your answers in the table provided above.
1) In the simple linear regression model, the regression slope
A) represents the elasticity of Y on X.
B) indicates by how many units Y increases, given a one unit increase in X.
C) indicates by how many percent Y increases, given a one percent increase in X.
D) when multiplied with the explanatory variable will give you the predicted Y.
2) Under the least squares assumptions (zero conditional mean for the error term, Xi and Yi being i.i.d., and
Xi and ui having finite fourth moments), the OLS estimator for the slope and intercept
A) is BLUE.
B) has an exact normal distribution for n > 15.
C) has a normal distribution even in small samples.
D) is unbiased.
3) To decide whether or not the slope coefficient is large or small,
A) the slope coefficient must be statistically significant.
B) the slope coefficient must be larger than one.
C) you should change the scale of the X variable if the coefficient appears to be too small.
D) you should analyze the economic importance of a given increase in X.
4) E(ui | Xi) = 0 says that
A) the sample mean of the Xs is much larger than the sample mean of the errors.
B) dividing the error by the explanatory variable results in a zero (on average).
C) the sample regression function residuals are unrelated to the explanatory variable.
D) the conditional distribution of the error given the explanatory variable has a zero mean.
5) The confidence interval for the sample regression function slope
A) allows you to make statements about the economic importance of your estimate.
B) can be used to compare the value of the slope relative to that of the intercept.
C) adds and subtracts 1.96 from the slope.
D) can be used to conduct a test about a hypothesized population regression function slope.
6) The t-statistic is calculated by dividing
A) the OLS estimator by its standard error.
B) the slope by 1.96.
C) the estimator minus its hypothesized value by the standard error of the estimator.
D) the slope by the standard deviation of the explanatory variable.
7) Multiplying the dependent variable by 100 and the explanatory variable by 100,000 leaves the
A) heteroskedasticity-robust standard errors of the OLS estimators the same.
B) regression R2 the same.
C) OLS estimate of the intercept the same.
D) OLS estimate of the slope the same.

8) If you had a two regressor regression model, then omitting one variable which is relevant
A) makes the sum of the product between the included variable and the residuals different from 0.
B) will have no effect on the coefficient of the included variable if the correlation between the excluded
and the included variable is negative.
C) can result in a negative value for the coefficient of the included variable, even though the coefficient
would have a significant positive effect on Y if the omitted variable were included.
D) will always bias the coefficient of the included variable upwards.
9) Under imperfect multicollinearity
A) the OLS estimator is biased even in samples of n > 100.
B) the OLS estimator cannot be computed.
C) two or more of the regressors are highly correlated.
D) the error terms are highly, but not perfectly, correlated.
10) The main advantage of using multiple regression analysis over differences in means testing is that the
regression technique
A) gives you quantitative estimates of a unit change in X.
B) assumes that the error terms are generated from a normal distribution.
C) allows you to calculate p-values for the significance of your results.
D) provides you with a measure of your goodness of fit.
11) The sample regression line estimated by OLS
A) has an intercept that is equal to zero.
B) cannot have negative and positive slopes.
C) is the line that minimizes the sum of squared prediction mistakes.
D) is the same as the population regression line.
12) The OLS estimator is derived by
A) minimizing the sum of squared residuals.
B) minimizing the sum of absolute residuals.
C) making sure that the standard error of the regression equals the standard error of the slope estimator.
D) connecting the Yi corresponding to the lowest Xi observation with the Yi corresponding to the highest
Xi observation.
13) With heteroskedastic errors, the weighted least squares estimator is BLUE. You should use OLS with
heteroskedasticity-robust standard errors because
A) the exact form of the conditional variance is rarely known.
B) the Gauss-Markov theorem holds.
C) this method is simpler.
D) our spreadsheet program does not have a command for weighted least squares.
14) If you reject a joint null hypothesis using the F-test in a multiple hypothesis setting, then
A) the regression is always significant.
B) the F-statistic must be negative.
C) all of the hypotheses are always simultaneously rejected.
D) a series of t-tests may or may not give you the same conclusion.

15) Binary variables
A) exclude certain individuals from your sample.
B) are generally used to control for outliers in your sample.
C) can take on only two values.
D) can take on more than two values.

Answer Key
Testname: PS3.TST

1) B
2) D
3) D
4) D
5) D
6) C
7) B
8) C
9) C
10) A
11) C
12) A
13) A
14) D
15) C
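As a study aid (not part of the original exam), the closed-form least-squares formulas behind question 12 and the scale invariance of R2 behind question 7 can be checked numerically. The data below are simulated, not taken from the exam; the function names are illustrative only.

```python
import numpy as np

# Simulated data for illustration only: true intercept 2, true slope 3.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 + 3.0 * x + rng.normal(size=200)

def ols(x, y):
    """OLS slope and intercept: the values that minimize the sum of
    squared residuals (question 12), via the closed-form formulas."""
    slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

def r_squared(x, y):
    """Regression R2 = 1 - SSR/TSS for the fitted OLS line."""
    slope, intercept = ols(x, y)
    resid = y - (intercept + slope * x)
    tss = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (resid @ resid) / tss

b1, b0 = ols(x, y)

# Question 7: multiplying Y by 100 and X by 100,000 rescales the slope
# and intercept, but leaves R2 unchanged.
r2_original = r_squared(x, y)
r2_rescaled = r_squared(100_000 * x, 100 * y)
```

Running this, `b1` lands near the true slope of 3, and `r2_original` and `r2_rescaled` agree up to floating-point rounding, confirming answer B for question 7.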
