REGRESSION
Introduction
Simple linear regression is a statistical method that allows us to summarize and study
relationships between two continuous (quantitative) variables. This lesson introduces the concept
and basic procedures of simple linear regression. We will also learn two measures that describe
the strength of the linear association that we find in data.
Simple linear regression is a statistical method that allows us to summarize and study
relationships between two continuous (quantitative) variables:
One variable, denoted x, is regarded as the predictor, explanatory, or independent variable.
The other variable, denoted y, is regarded as the response, outcome, or dependent variable.
Because the other terms are used less frequently today, we'll use the "predictor" and "response"
terms to refer to the variables encountered in this course. The other terms are mentioned only to
make you aware of them should you encounter them in other arenas. Simple linear regression
gets the adjective "simple" because it concerns the study of only one predictor variable. In
contrast, multiple linear regression, which we study later in this course, gets the adjective
"multiple" because it concerns the study of two or more predictor variables.
Types of relationships
Before proceeding, we must clarify what types of relationships we won't study in this
course, namely, deterministic (or functional) relationships. Here is an example of a
deterministic relationship.
Note that the observed (x, y) data points fall directly on a line. As you may remember, the
relationship between degrees Fahrenheit and degrees Celsius is known to be:
Fahr = (9/5) Cels + 32
That is, if you know the temperature in degrees Celsius, you can use this equation to
determine the temperature in degrees Fahrenheit exactly.
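The conversion is easy to verify in code. Here is a minimal sketch (the function name is ours, chosen for illustration):

```python
# Deterministic relationship: Fahrenheit is an exact function of Celsius.
def celsius_to_fahrenheit(cels):
    return 9 / 5 * cels + 32

print(celsius_to_fahrenheit(0))    # → 32.0 (freezing point of water)
print(celsius_to_fahrenheit(100))  # → 212.0 (boiling point of water)
```

There is no scatter here: every input maps to exactly one output, which is what makes the relationship deterministic.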
Here are some examples of other deterministic relationships that students from previous
semesters have shared:
Circumference = π × diameter
Hooke's Law: Y = α + βX, where Y = amount of stretch in a spring, and X = applied
weight.
Ohm's Law: I = V/r, where V = voltage applied, r = resistance, and I = current.
Boyle's Law: For a constant temperature, P = α/V, where P = pressure, α = constant for
each gas, and V = volume of gas.
For each of these deterministic relationships, the equation exactly describes the
relationship between the two variables. This course does not examine deterministic relationships.
Instead, we are interested in statistical relationships, in which the relationship between the
variables is not perfect.
You might anticipate that the farther north you live in the U.S. (that is, the higher the
latitude), the less exposed you are to the harmful rays of the sun, and therefore the lower your
risk of death due to skin cancer. The scatter plot supports such a hypothesis. There appears to be a negative
linear relationship between latitude and mortality due to skin cancer, but the relationship is not
perfect. Indeed, the plot exhibits some "trend," but it also exhibits some "scatter." Therefore, it
is a statistical relationship, not a deterministic one.
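A statistical relationship like this can be summarized by fitting a least-squares line through the scattered points. Here is a minimal sketch using synthetic data that mimics a negative trend plus scatter (these are made-up numbers, not the actual skin-cancer dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: a negative trend plus random scatter,
# standing in for the latitude vs. mortality example.
latitude = rng.uniform(25, 50, size=50)
mortality = 390 - 6 * latitude + rng.normal(0, 20, size=50)

# Fit a least-squares line: mortality ≈ b0 + b1 * latitude.
b1, b0 = np.polyfit(latitude, mortality, deg=1)
print(f"intercept = {b0:.1f}, slope = {b1:.2f}")  # slope should be near -6
```

Unlike the deterministic examples above, the fitted line only describes the trend; individual points still deviate from it.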
Height and weight — as height increases, you'd expect weight to increase, but not
perfectly.
Alcohol consumed and blood alcohol content — as alcohol consumption increases, you'd
expect one's blood alcohol content to increase, but not perfectly.
Vital lung capacity and pack-years of smoking — as amount of smoking increases (as
quantified by the number of pack-years of smoking), you'd expect lung function (as
quantified by vital lung capacity) to decrease, but not perfectly.
Driving speed and gas mileage — as driving speed increases, you'd expect gas mileage to
decrease, but not perfectly.
Okay, so let's study statistical relationships between one response variable y and one
predictor variable x!
Introduction
In this lesson, we make our first (and last?!) major jump in the course. We move from the
simple linear regression model with one predictor to the multiple linear regression model with
two or more predictors. That is, we use the adjective "simple" to denote that our model has only
one predictor, and we use the adjective "multiple" to indicate that our model has at least two
predictors.
In the multiple regression setting, because of the potentially large number of predictors, it
is more efficient to use matrices to define the regression model and the subsequent analyses. This
lesson considers some of the more important multiple regression formulas in matrix form. If
you're unsure about any of this, it may be a good time to take a look at this Matrix Algebra
Review.
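As a small illustration of the matrix formulation, the least-squares coefficient vector solves the normal equations, b = (X'X)⁻¹X'y, where X is the design matrix with a leading column of ones for the intercept. A sketch with simulated data (all variable names are ours):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data with two predictors; the column of ones gives the intercept.
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x1, x2])  # design matrix

# Normal equations: b = (X'X)^{-1} X'y. Solving the linear system is
# preferred numerically over forming the inverse explicitly.
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # estimates should be near [1, 2, -3]
```

The same matrix expression covers any number of predictors, which is exactly why the matrix form pays off in the multiple regression setting.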
The good news is that everything you learned about the simple linear regression model
extends — with at most minor modification — to the multiple linear regression model. Think
about it — you don't have to forget all of that good stuff you learned! In particular:
The models have similar "LINE" assumptions. The only real difference is that whereas in
simple linear regression we think of the distribution of errors at a fixed value of the single
predictor, with multiple linear regression we have to think of the distribution of errors at a
fixed set of values for all the predictors. All of the model checking procedures we learned
earlier are useful in the multiple linear regression framework, although the process
becomes more involved since we now have multiple predictors. We'll explore this issue
further in Lesson 7.
The use and interpretation of r2 (which we'll denote R2 in the context of multiple linear
regression) remains the same. However, with multiple linear regression we can also make
use of an "adjusted" R2 value, which is useful for model building purposes. We'll explore
this measure further in Lesson 10.
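For concreteness, the adjusted value applies a degrees-of-freedom penalty: R²_adj = 1 − (1 − R²)(n − 1)/(n − p − 1), where n is the sample size and p the number of predictors. A small illustration with made-up sums of squares:

```python
# Illustrative calculation of R-squared and adjusted R-squared from sums of
# squares; the numbers here are hypothetical, chosen for demonstration.
sse = 12.5   # error sum of squares
ssto = 50.0  # total sum of squares
n = 30       # sample size
p = 3        # number of predictors

r2 = 1 - sse / ssto
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(f"R^2 = {r2:.3f}, adjusted R^2 = {adj_r2:.3f}")
# → R^2 = 0.750, adjusted R^2 = 0.721
```

Note that adjusted R² is always at most R², and the gap grows as more predictors are added without a corresponding drop in SSE.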
With a minor generalization of the degrees of freedom, we use t-tests and t-intervals for
the regression slope coefficients to assess whether a predictor is significantly linearly
related to the response, after controlling for the effects of all the other predictors in the
model.
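Each t-statistic is the slope estimate divided by its standard error, with n − p − 1 degrees of freedom. A sketch with simulated data, where one predictor is truly related to the response and the other is not (all names ours):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: y depends on x1 but not on x2.
n = 80
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 4.0 + 1.5 * x1 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.solve(X.T @ X, X.T @ y)

# Residual variance uses n - p - 1 degrees of freedom (p = 2 predictors here).
resid = y - X @ b
df = n - 2 - 1
s2 = resid @ resid / df

# Standard errors come from the diagonal of s^2 * (X'X)^{-1}.
se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
t = b / se
print(t)  # |t| well above 2 flags a significant slope (here, x1's)
```

In practice a regression routine reports these t-statistics (and their p-values) directly; the point of the sketch is to show where they come from.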
With a minor generalization of the degrees of freedom, we use prediction intervals for
predicting an individual response and confidence intervals for estimating the mean
response. We'll explore these further in Lesson 7.
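Both intervals center on the fitted value x'b; the prediction interval is wider because it also accounts for the error in a single new observation, not just the uncertainty in the estimated mean. A sketch with simulated data (all names ours):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical two-predictor data.
n = 60
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([2.0, 1.0, -1.0]) + rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ b
df = n - 3
s2 = resid @ resid / df
XtX_inv = np.linalg.inv(X.T @ X)

# New point: intercept term plus values for the two predictors.
x_new = np.array([1.0, 0.5, -0.5])
y_hat = x_new @ b
h = x_new @ XtX_inv @ x_new          # leverage-style term x'(X'X)^{-1}x
t_crit = stats.t.ppf(0.975, df)      # two-sided 95% critical value

ci = t_crit * np.sqrt(s2 * h)        # half-width: CI for the mean response
pi = t_crit * np.sqrt(s2 * (1 + h))  # half-width: PI for a new observation
print(f"mean response: {y_hat:.2f} +/- {ci:.2f}")
print(f"new observation: {y_hat:.2f} +/- {pi:.2f}")
```

The extra "1 +" inside the prediction-interval formula is what makes it strictly wider than the confidence interval at the same x.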
Learning objectives
Know how to calculate a confidence interval for a single slope parameter in the multiple
regression setting.
Be able to interpret the coefficients of a multiple regression model.
Understand what the scope of the model is in the multiple regression model.
Understand the calculation and interpretation of R2 in a multiple regression setting.
Understand the calculation and use of adjusted R2 in a multiple regression setting.