
Correlation and Regression Analysis

Many engineering design and analysis problems involve factors that are
interrelated and dependent, e.g., (1) runoff volume and rainfall; (2) evaporation,
temperature, and wind speed; (3) peak discharge, drainage area, and rainfall intensity;
(4) crop yield, irrigation water, and fertilizer.
Because of the inherent complexity of system behavior and the lack of a full
understanding of the processes involved, the relationships among the various
relevant factors or variables are established empirically or semi-empirically.
Regression analysis is a useful and widely used statistical tool for
investigating the relationship between two or more variables that are related in a
non-deterministic fashion.
If a variable Y is related to several variables $X_1, X_2, \ldots, X_K$, their
relationship can be expressed, in general, as

$$Y = g(X_1, X_2, \ldots, X_K)$$

where $g(\cdot)$ = general expression for a function;
Y = dependent (or response) variable;
$X_1, X_2, \ldots, X_K$ = independent (or explanatory) variables.
Correlation
When a problem involves two dependent random variables, the degree of
linear dependence between the two can be measured by the correlation
coefficient $\rho(X,Y)$, which is defined as

$$\rho(X,Y) = \frac{\mathrm{Cov}(X,Y)}{\sigma_X\,\sigma_Y}$$

where Cov(X,Y) is the covariance between the random variables X and Y, defined
as

$$\mathrm{Cov}(X,Y) = E\big[(X-\mu_X)(Y-\mu_Y)\big] = E(XY) - \mu_X\,\mu_Y$$

where $-\sigma_X\sigma_Y \le \mathrm{Cov}(X,Y) \le \sigma_X\sigma_Y$ and $-1 \le \rho(X,Y) \le 1$.

Various correlation coefficients have been developed in statistics for measuring the
degree of association between random variables. The one defined above is
called the Pearson product-moment correlation coefficient, or simply the correlation
coefficient.

If the two random variables X and Y are independent, then $\rho(X,Y) =
\mathrm{Cov}(X,Y) = 0$. However, the reverse statement is not necessarily true.

Cases of Correlation
[Figure: four scatter-plot panels illustrating typical cases]
(1) Perfectly linearly correlated in the opposite direction;
(2) strongly and positively correlated in a linear fashion;
(3) perfectly correlated in a nonlinear fashion, but uncorrelated linearly;
(4) uncorrelated in a linear fashion.
Calculation of Correlation Coefficient
Given a set of n paired sample observations of two random variables
$(x_i, y_i)$, the sample correlation coefficient (r) can be calculated as

$$r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}
{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}}$$
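As a concrete illustration (not part of the original notes), a minimal Python sketch of this formula is given below; the paired data values are made up for the example.

```python
import math

def sample_correlation(x, y):
    """Pearson product-moment sample correlation coefficient r."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    syy = sum((yi - y_bar) ** 2 for yi in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired observations (x_i, y_i)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
print(sample_correlation(x, y))  # close to +1 for a nearly linear, increasing relation
```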







Auto-correlation
Consider the following daily stream flows (in 1000 m³) observed in June 2001 at Chung
Mei Upper Station (610 ha), located upstream of a river feeding into Plover Cove
Reservoir. Determine the 1-day auto-correlation coefficient, i.e., $\rho(Q_t, Q_{t+1})$.







Lagging the series by one day gives 29 pairs:
$\{(Q_t, Q_{t+1})\} = \{(Q_1, Q_2), (Q_2, Q_3), \ldots, (Q_{29}, Q_{30})\}$.
The flow data are listed in the table below, followed by the relevant sample
statistics (n = 29) and the resulting 1-day auto-correlation.
Day (t)   Flow Q(t)    Day (t)   Flow Q(t)    Day (t)   Flow Q(t)
   1          8.35        11       313.89        21        20.06
   2          6.78        12       480.88        22        17.52
   3          6.32        13       151.28        23       116.13
   4         17.36        14        83.92        24        68.25
   5        191.62        15        44.58        25       280.22
   6         82.33        16        36.58        26       347.53
   7        524.45        17        33.65        27       771.30
   8        196.77        18        26.39        28       124.20
   9        785.09        19        22.98        29        58.00
  10        562.05        20        21.92        30        44.08
Relevant sample statistics (n = 29):

$$\bar{Q}_t = 186.22, \quad S_{Q_t} = 230.06, \quad \bar{Q}_{t+1} = 187.45, \quad S_{Q_{t+1}} = 229.17$$

The resulting 1-day auto-correlation is $r(Q_t, Q_{t+1}) = 0.439$.
[Figure: Chung Mei Upper Daily Flow, June 2001; flow (1000 cubic meters) vs. day]
[Figure: Autocorrelation for June 2001 Daily Flows at Chung Mei Upper, HK; autocorrelation vs. time lag (days)]
[Figure: scatter plot of Q(t+1) vs. Q(t), both in 1000 m^3]
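As a check on the example (not part of the original notes), the following Python sketch keys in the 30 daily flows from the table and correlates the 29 lagged pairs; the value printed should be close to the 0.439 quoted above.

```python
import numpy as np

# Daily flows for June 2001 at Chung Mei Upper, in 1000 m^3 (from the table above)
q = np.array([8.35, 6.78, 6.32, 17.36, 191.62, 82.33, 524.45, 196.77, 785.09,
              562.05, 313.89, 480.88, 151.28, 83.92, 44.58, 36.58, 33.65, 26.39,
              22.98, 21.92, 20.06, 17.52, 116.13, 68.25, 280.22, 347.53, 771.30,
              124.20, 58.00, 44.08])

q_t, q_t1 = q[:-1], q[1:]            # the 29 pairs (Q_t, Q_{t+1})
r = np.corrcoef(q_t, q_t1)[0, 1]     # sample correlation of the lagged pairs
print(round(r, 3))                   # expected to be close to 0.439
```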
Regression Models
Due to the presence of uncertainties, a deterministic functional
relationship is generally not very appropriate or realistic.
The deterministic model form can be modified to account for
uncertainties in the model as

$$Y = g(X_1, X_2, \ldots, X_K) + \varepsilon$$

where $\varepsilon$ = model error term, with $E(\varepsilon) = 0$ and $\mathrm{Var}(\varepsilon) = \sigma^2$.

In engineering applications, functional forms commonly used for
establishing empirical relationships are

Additive: $Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_K X_K + \varepsilon$

Multiplicative: $Y = \beta_0\, X_1^{\beta_1} X_2^{\beta_2} \cdots X_K^{\beta_K} + \varepsilon$
Least Square Method
Suppose that there are n pairs of data, $\{(x_i, y_i)\}$, $i = 1, 2, \ldots, n$, and a plot of
these data appears as shown below.

[Figure: scatter plot of y versus x]

What is a plausible mathematical model describing the relation between x and y?
Least Square Method
Consider an arbitrary straight line, $\hat{y} = \beta_0 + \beta_1 x$, to be fitted through these
data points. The question is: which line is the most representative?

[Figure: data points $(x_i, y_i)$ and a fitted line $\hat{y} = \beta_0 + \beta_1 x$, with intercept $\beta_0$ and slope $\beta_1$]

The vertical deviation of each point from the line is the residual:

$$e_i = y_i - \hat{y}_i = \text{error (residual)}$$
Least Square Criterion
What are the values of $\beta_0$ and $\beta_1$ such that the resulting line best fits
the data points?

But wait! What goodness-of-fit criterion should be used to choose among
all possible combinations of $\beta_0$ and $\beta_1$?

The least-squares (LS) criterion states that the sum of the squares of the
errors (or residuals, deviations) should be a minimum. Mathematically, the LS
criterion can be written as:

$$\text{Minimize } D = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2 = \sum_{i=1}^{n}\left(y_i - \beta_0 - \beta_1 x_i\right)^2$$

Are there any other criteria that could be used?
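To make the criterion concrete, the short sketch below (made-up data, two arbitrary candidate lines; not from the original notes) computes D for each line; under the LS criterion, the line with the smaller D is the better fit.

```python
def sum_squared_errors(x, y, b0, b1):
    """D = sum of squared residuals for the candidate line y_hat = b0 + b1*x."""
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

# Hypothetical data and two candidate lines
x = [1, 2, 3, 4, 5]
y = [2.8, 5.1, 7.2, 8.9, 11.1]
print(sum_squared_errors(x, y, 1.0, 2.0))   # candidate line A: small D (good fit)
print(sum_squared_errors(x, y, 0.0, 2.5))   # candidate line B: larger D (worse fit)
```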
Normal Equations for LS Criterion
The necessary conditions for the minimum value of D are:

$$\frac{\partial D}{\partial \beta_0} = 0 \quad\text{and}\quad \frac{\partial D}{\partial \beta_1} = 0$$

Expanding the above equations:

$$\frac{\partial D}{\partial \beta_0} = -2\sum_{i=1}^{n}\left(y_i - \beta_0 - \beta_1 x_i\right) = 0$$

$$\frac{\partial D}{\partial \beta_1} = -2\sum_{i=1}^{n}\left(y_i - \beta_0 - \beta_1 x_i\right)x_i = 0$$

Normal equations:

$$n\beta_0 + \beta_1\sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i$$

$$\beta_0\sum_{i=1}^{n} x_i + \beta_1\sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i$$
LS Solution (2 Unknowns)

$$\beta_1 = \frac{\sum_{i=1}^{n} x_i y_i - \frac{1}{n}\left(\sum_{i=1}^{n} x_i\right)\left(\sum_{i=1}^{n} y_i\right)}
{\sum_{i=1}^{n} x_i^2 - \frac{1}{n}\left(\sum_{i=1}^{n} x_i\right)^2}
= \frac{\sum_{i=1}^{n} x_i y_i - n\,\bar{x}\,\bar{y}}{\sum_{i=1}^{n} x_i^2 - n\,\bar{x}^2}$$

$$\beta_0 = \frac{1}{n}\sum_{i=1}^{n} y_i - \beta_1\left(\frac{1}{n}\sum_{i=1}^{n} x_i\right) = \bar{y} - \beta_1\,\bar{x}$$
Fitting a Polynomial Equation by the LS Method

Model:

$$y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \cdots + \beta_k x_i^k + \varepsilon_i, \quad i = 1, 2, \ldots, n$$

LS criterion: minimize

$$D = \sum_{i=1}^{n}\left(y_i - \beta_0 - \beta_1 x_i - \beta_2 x_i^2 - \cdots - \beta_k x_i^k\right)^2$$

with respect to $\beta_0, \beta_1, \ldots, \beta_k$.

Set $\dfrac{\partial D}{\partial \beta_j} = 0$ for $j = 0, 1, 2, \ldots, k$.

Normal equations (all sums are over $i = 1, \ldots, n$):

$$n\beta_0 + \beta_1\sum x_i + \beta_2\sum x_i^2 + \cdots + \beta_k\sum x_i^k = \sum y_i$$

$$\beta_0\sum x_i + \beta_1\sum x_i^2 + \cdots + \beta_k\sum x_i^{k+1} = \sum x_i y_i$$

$$\vdots$$

$$\beta_0\sum x_i^k + \beta_1\sum x_i^{k+1} + \cdots + \beta_k\sum x_i^{2k} = \sum x_i^k y_i$$
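As an illustration (not part of the original notes), the sketch below builds the polynomial normal equations $(X'X)\beta = X'y$ with a Vandermonde matrix and solves them directly; numpy.polyfit fits the same model and should give an equivalent polynomial.

```python
import numpy as np

def fit_polynomial(x, y, k):
    """Least-squares coefficients [b0, b1, ..., bk] of a degree-k polynomial."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    X = np.vander(x, k + 1, increasing=True)     # columns: 1, x, x^2, ..., x^k
    return np.linalg.solve(X.T @ X, X.T @ y)     # solve the normal equations

# Example: noisy quadratic data
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + 2.0 * x + 0.5 * x ** 2 + np.array([0.10, -0.05, 0.02, -0.10, 0.06])
print(fit_polynomial(x, y, 2))                   # approximately [1.0, 2.0, 0.5]
```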


Fitting a Linear Function of Several Variables

Model:

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k + \varepsilon$$

LS criterion: minimize

$$D = \sum_{i=1}^{n}\left(y_i - \beta_0 - \beta_1 x_{i1} - \beta_2 x_{i2} - \cdots - \beta_k x_{ik}\right)^2$$

with respect to $\beta_0, \beta_1, \ldots, \beta_k$.

Set $\dfrac{\partial D}{\partial \beta_j} = 0$ for $j = 0, 1, 2, \ldots, k$.

Normal equations (all sums are over $i = 1, \ldots, n$):

$$n\beta_0 + \beta_1\sum x_{i1} + \cdots + \beta_k\sum x_{ik} = \sum y_i$$

$$\beta_0\sum x_{i1} + \beta_1\sum x_{i1}^2 + \cdots + \beta_k\sum x_{i1} x_{ik} = \sum x_{i1} y_i$$

$$\vdots$$

$$\beta_0\sum x_{ik} + \beta_1\sum x_{ik} x_{i1} + \cdots + \beta_k\sum x_{ik}^2 = \sum x_{ik} y_i$$
Matrix Form of Multiple Regression by LS

In matrix form, the model is

$$\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} =
\begin{bmatrix} 1 & x_{11} & x_{12} & \cdots & x_{1k} \\
1 & x_{21} & x_{22} & \cdots & x_{2k} \\
\vdots & \vdots & \vdots & & \vdots \\
1 & x_{n1} & x_{n2} & \cdots & x_{nk} \end{bmatrix}
\begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{bmatrix} +
\begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{bmatrix}$$

(Note: $x_{ij}$ = i-th observation of the j-th independent variable)

or, in short, $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$.

The LS criterion is:

$$\min D = \sum_{i=1}^{n} \varepsilon_i^2 = \boldsymbol{\varepsilon}'\boldsymbol{\varepsilon} = (\mathbf{y} - \mathbf{X}\boldsymbol{\beta})'(\mathbf{y} - \mathbf{X}\boldsymbol{\beta})$$

Setting $\dfrac{\partial D}{\partial \boldsymbol{\beta}} = \mathbf{0}$ results in $\mathbf{X}'(\mathbf{y} - \mathbf{X}\hat{\boldsymbol{\beta}}) = \mathbf{0}$.

The LS solution is:

$$\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y}$$
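A minimal numpy sketch of this matrix solution follows (the data are hypothetical; solving the normal equations is used instead of forming the explicit inverse, which is numerically preferable but gives the same $\hat{\boldsymbol{\beta}}$).

```python
import numpy as np

# Hypothetical data: each row of X_raw holds (x_1, x_2) for one observation
X_raw = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0], [5.0, 5.0]])
y = np.array([6.1, 4.9, 12.2, 10.9, 16.1])

X = np.column_stack([np.ones(len(y)), X_raw])   # prepend a column of 1s for beta_0
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)    # solves the normal equations X'X beta = X'y
residuals = y - X @ beta_hat
print(beta_hat)                                 # roughly [1, 1, 2] for this made-up data
print(residuals)
```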

Measure of Goodness-of-Fit
$R^2$ = coefficient of determination:

$$R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}$$

= 1 - (% of variation in the dependent variable, y, unexplained by the regression equation)
= % of variation in the dependent variable, y, explained by the regression equation.
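A short sketch (not from the original notes) computing $R^2$ from observed values y and fitted values $\hat{y}$, following the definition above:

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination R^2."""
    y, y_hat = np.asarray(y, dtype=float), np.asarray(y_hat, dtype=float)
    ss_res = np.sum((y - y_hat) ** 2)        # variation unexplained by the regression
    ss_tot = np.sum((y - y.mean()) ** 2)     # total variation of y about its mean
    return 1.0 - ss_res / ss_tot

# Example with made-up observed and fitted values
print(r_squared([2.0, 4.0, 6.0, 8.0], [2.2, 3.9, 6.1, 7.8]))   # close to 1 (good fit)
```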
Example 1 (LS Method)
LS Example
LS Example (Matrix Approach)
LS Example (by Minitab with $\beta_0$)
LS Example (by Minitab without $\beta_0$)
LS Example (Output Plots)
