
# 6.2 ACF and PACF of ARMA(p,q)

## 6.2.2 PACF of ARMA(p,q)

We have seen earlier that the autocorrelation function of MA(q) models is zero
for all lags greater than q, as these are q-correlated processes. Hence, the ACF is a
good indication of the order of the process. However, AR(p) and ARMA(p,q) processes
are correlated at all lags: their ACF tails off and never becomes exactly zero, though
it may be very close to zero. In such cases it is difficult to identify the process on
the basis of the ACF only.

In this section we will consider another correlation function which, together with
the ACF, will help to identify the models. The function is called the Partial Autocorrelation
Function (PACF). Before introducing a formal definition of the PACF we
motivate the idea for AR(1). Let
$$X_t = \phi X_{t-1} + Z_t$$
be a causal AR(1) process. Then
$$\begin{aligned}
\gamma(2) &= \operatorname{cov}(X_t, X_{t-2}) \\
&= \operatorname{cov}(\phi X_{t-1} + Z_t, X_{t-2}) \\
&= \operatorname{cov}(\phi^2 X_{t-2} + \phi Z_{t-1} + Z_t, X_{t-2}) \\
&= E\left[(\phi^2 X_{t-2} + \phi Z_{t-1} + Z_t) X_{t-2}\right] \\
&= \phi^2 \gamma(0).
\end{aligned}$$
The autocorrelation is not zero because $X_t$ depends on $X_{t-2}$ through $X_{t-1}$. Due
to the recursive nature of AR models there is a chain of dependence. We can break
this dependence by removing the influence of $X_{t-1}$ from both $X_t$ and $X_{t-2}$ to obtain
$$X_t - \phi X_{t-1} \quad \text{and} \quad X_{t-2} - \phi X_{t-1},$$
for which the covariance is zero, i.e.,
$$\operatorname{cov}(X_t - \phi X_{t-1},\; X_{t-2} - \phi X_{t-1}) = \operatorname{cov}(Z_t,\; X_{t-2} - \phi X_{t-1}) = 0.$$
Similarly, we obtain zero covariance for $X_t$ and $X_{t-3}$ after breaking the chain
of dependence, i.e. removing the dependence of the two variables on $X_{t-1}$ and
$X_{t-2}$; that is, for $X_t - f(X_{t-1}, X_{t-2})$ and $X_{t-3} - f(X_{t-1}, X_{t-2})$ for some function
$f$. Continuing this, we would obtain zero covariances for the variables
$X_t - f(X_{t-1}, X_{t-2}, \ldots, X_{t-h+1})$ and $X_{t-h} - f(X_{t-1}, X_{t-2}, \ldots, X_{t-h+1})$. The
only nonzero covariance is then for $X_t$ and $X_{t-1}$ (there is nothing in between to break the chain
of dependence). These covariances, with an appropriate function $f$, divided by the
variance of the process are the partial autocorrelations. Hence, for a causal AR(1)
process we would have the PACF at lag 1 equal to $\rho(1)$ and at lags $h > 1$ equal to 0.
This, together with the tailing-off shape of the ACF, identifies the process.
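The chain-breaking argument can be checked numerically. The sketch below is an illustration, not part of the text: the value $\phi = 0.6$ is arbitrary, and it uses the standard Durbin-Levinson identity $\phi_{22} = (\rho(2) - \rho(1)^2)/(1 - \rho(1)^2)$ (not derived in this section) to turn sample autocorrelations into the first two sample partial autocorrelations:

```python
import random

# Illustrative sketch (not from the text): sample PACF of a causal AR(1)
# process X_t = phi * X_{t-1} + Z_t with an arbitrary phi = 0.6.
random.seed(1)
phi, n = 0.6, 100_000
x = [0.0]
for _ in range(n + 1000):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))
x = x[1000:]  # discard burn-in so the series is approximately stationary

def sample_rho(series, lag):
    """Sample autocorrelation at the given lag."""
    m = sum(series) / len(series)
    c0 = sum((v - m) ** 2 for v in series)
    ch = sum((series[t] - m) * (series[t + lag] - m)
             for t in range(len(series) - lag))
    return ch / c0

rho1, rho2 = sample_rho(x, 1), sample_rho(x, 2)
phi11 = rho1                              # PACF at lag 1 is rho(1)
phi22 = (rho2 - rho1**2) / (1 - rho1**2)  # Durbin-Levinson step for lag 2
print(phi11, phi22)  # phi11 close to phi = 0.6, phi22 close to 0
```

The lag-1 partial autocorrelation recovers $\phi$, while the lag-2 value is statistically indistinguishable from zero, exactly as the argument above predicts.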
116 CHAPTER 6. ARMA MODELS

**Definition 6.2.** The Partial Autocorrelation Function (PACF) of a zero-mean stationary
TS $\{X_t\}_{t=0,1,\ldots}$ is defined as
$$\phi_{11} = \operatorname{corr}(X_1, X_0) = \rho(1),$$
$$\phi_{hh} = \operatorname{corr}(X_h - f_{h-1},\; X_0 - f_{h-1}), \quad h \geq 2, \qquad (6.17)$$
where
$$f_{h-1} = f(X_{h-1}, \ldots, X_1)$$
minimizes the mean square linear prediction error
$$E(X_h - f_{h-1})^2.$$

**Remark 6.4.** The subscript of the function $f$ denotes the number of variables the
function depends on.

**Remark 6.5.** By stationarity, $\phi_{hh}$ is the correlation between the variables $X_t$ and $X_{t-h}$
with the linear effect of the intermediate variables removed, the linear predictor being
$$f(X_{t-1}, \ldots, X_{t-h+1}) = a_1 X_{t-1} + \ldots + a_{h-1} X_{t-h+1}.$$

**Example 6.5.** The PACF of AR(1)

Consider a process
$$X_t = \phi X_{t-1} + Z_t, \quad Z_t \sim WN(0, \sigma^2),$$
for which
$$\phi_{11} = \rho(1) = \phi.$$
To calculate $\phi_{22}$ we need to find the function $f_1$, which is of the form
$$f_1 = a X_1.$$
We choose $a$ to minimize
$$E(X_2 - a X_1)^2 = E(X_2^2 - 2 a X_1 X_2 + a^2 X_1^2) = \gamma(0) - 2 a \gamma(1) + a^2 \gamma(0),$$
which is a quadratic polynomial in $a$. Taking the derivative with respect to $a$ and setting it
equal to zero, we obtain
$$-2\gamma(1) + 2 a \gamma(0) = 0.$$

Hence
$$a = \frac{\gamma(1)}{\gamma(0)} = \rho(1) = \phi$$
and
$$f_1 = \phi X_1.$$
Then
$$\phi_{22} = \operatorname{corr}(X_2 - \phi X_1,\; X_0 - \phi X_1) = \operatorname{corr}(Z_2,\; X_0 - \phi X_1) = 0,$$
as by causality $X_0, X_1$ do not depend on $Z_2$. Similarly we would obtain $\phi_{33} = 0$.
In fact
$$\phi_{hh} = 0 \quad \text{for } h > 1.$$
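The conclusion $\phi_{22} = 0$ can also be seen by direct simulation. The sketch below (assumed values $\phi = 0.5$ and unit noise variance, chosen only for illustration) draws many independent stationary triples $(X_0, X_1, X_2)$, forms the residuals $X_2 - \phi X_1$ and $X_0 - \phi X_1$, and checks that their sample correlation is near zero:

```python
import random, math

# Sketch (assumed values, not from the text): phi = 0.5, unit noise variance.
random.seed(7)
phi, reps = 0.5, 200_000
sd0 = math.sqrt(1.0 / (1.0 - phi**2))  # stationary sd of the causal AR(1)

u, v = [], []  # u = X2 - phi*X1, v = X0 - phi*X1
for _ in range(reps):
    x0 = random.gauss(0.0, sd0)            # draw X0 from the stationary law
    x1 = phi * x0 + random.gauss(0.0, 1.0)
    x2 = phi * x1 + random.gauss(0.0, 1.0)
    u.append(x2 - phi * x1)                # equals Z2 by construction
    v.append(x0 - phi * x1)

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / math.sqrt(va * vb)

phi22 = corr(u, v)
print(phi22)  # close to 0: X0 - phi*X1 does not involve Z2
```

The first residual is exactly $Z_2$, and by causality the second residual is built only from noise up to time 1, so the two are uncorrelated.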
**The PACF of AR(p)**

Let
$$X_t - \phi_1 X_{t-1} - \ldots - \phi_p X_{t-p} = Z_t, \quad Z_t \sim WN(0, \sigma^2),$$
be a causal AR(p) process, i.e., we assume that the roots of $\phi(z)$ are outside the
unit circle. When $h > p$, the linear combination minimizing the mean square linear
prediction error is
$$f_p = \sum_{j=1}^{p} \phi_j X_{h-j}.$$
We will discuss this result later. Now we will use it to obtain the PACF for $h > p$,
namely
$$\phi_{hh} = \operatorname{corr}(X_h - f_p,\; X_0 - f_p) = \operatorname{corr}(Z_h,\; X_0 - f_p) = 0,$$
as by causality the $X_{h-j}$, $j = 1, \ldots, p$, do not depend on the future noise value $Z_h$.
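This cut-off after lag $p$ is easy to see in simulation. The sketch below is an illustration only: it uses the AR(2) model $x_t = 0.7x_{t-1} - 0.1x_{t-2} + z_t$ from Figure 6.8, and the Durbin-Levinson recursion (not derived in this section) to convert sample autocorrelations into sample partial autocorrelations:

```python
import random

# Sketch (illustration only): the sample PACF of the causal AR(2) process
# x_t = 0.7 x_{t-1} - 0.1 x_{t-2} + z_t should cut off after lag p = 2.
random.seed(3)
n = 100_000
x = [0.0, 0.0]
for _ in range(n + 1000):
    x.append(0.7 * x[-1] - 0.1 * x[-2] + random.gauss(0.0, 1.0))
x = x[1000:]  # discard burn-in

def sample_rho(series, maxlag):
    """Sample autocorrelations rho(0), ..., rho(maxlag)."""
    m = sum(series) / len(series)
    c0 = sum((v - m) ** 2 for v in series)
    out = [1.0]
    for h in range(1, maxlag + 1):
        ch = sum((series[t] - m) * (series[t + h] - m)
                 for t in range(len(series) - h))
        out.append(ch / c0)
    return out

def pacf(rho):
    """Partial autocorrelations from rho(0..H) via the Durbin-Levinson recursion."""
    prev = [rho[1]]
    out = [rho[1]]
    for k in range(2, len(rho)):
        num = rho[k] - sum(prev[j] * rho[k - 1 - j] for j in range(k - 1))
        den = 1.0 - sum(prev[j] * rho[j + 1] for j in range(k - 1))
        a = num / den
        prev = [prev[j] - a * prev[k - 2 - j] for j in range(k - 1)] + [a]
        out.append(a)
    return out

p = pacf(sample_rho(x, 5))
print([round(v, 2) for v in p])  # lags 1..5; lags 3..5 are near zero
```

The lag-2 value estimates $\phi_2 = -0.1$ (the PACF at lag $p$ equals the last AR coefficient), while everything beyond lag 2 is noise around zero.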

**Remark 6.6.** The PACF of MA(q)

Let
$$X_t = Z_t + \theta_1 Z_{t-1} + \ldots + \theta_q Z_{t-q}, \quad Z_t \sim WN(0, \sigma^2),$$
be an invertible MA(q) process, i.e., the roots of $\theta(z)$ lie outside the unit circle. Then
its linear representation is
$$X_t = \sum_{j=1}^{\infty} \pi_j X_{t-j} + Z_t.$$
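For MA(1) the coefficients of this representation can be written down explicitly: inverting $X_t = Z_t + \theta Z_{t-1}$ term by term gives $\pi_j = -(-\theta)^j$. The sketch below (with an assumed value $\theta = 0.9$, used only for illustration) verifies that the truncated representation recovers the noise:

```python
import random

# Sketch (assumed theta = 0.9): for an invertible MA(1), the AR-infinity
# coefficients are pi_j = -(-theta)**j, so X_t - sum_j pi_j X_{t-j}
# should approximately recover the noise Z_t.
random.seed(5)
theta, n, trunc = 0.9, 5_000, 60
z = [random.gauss(0.0, 1.0) for _ in range(n + 1)]
x = [z[t] + theta * z[t - 1] for t in range(1, n + 1)]  # MA(1) sample; x[t] uses z[t+1]

pi = [-(-theta) ** j for j in range(1, trunc + 1)]  # pi_1, ..., pi_trunc

# Reconstruct the noise at times where the truncated sum is available.
err, count = 0.0, 0
for t in range(trunc, n):
    z_hat = x[t] - sum(pi[j - 1] * x[t - j] for j in range(1, trunc + 1))
    err += (z_hat - z[t + 1]) ** 2
    count += 1
print(err / count)  # mean squared reconstruction error, near 0
```

The reconstruction error shrinks geometrically in the truncation length because $|\pi_j| = \theta^j \to 0$; this is exactly why invertibility is needed.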
[Figure 6.3 appears here: simulated sample paths of four AR(1) processes, panels AR1.phi.0.9, AR1.phi.minus0.9, AR1.phi.0.5, AR1.phi.minus0.5.]

Figure 6.3: AR(1) for various values of the parameter $\phi = 0.9, -0.9, 0.5, -0.5$.

[Figures appear here: paired ACF (lags 0-100) and PACF (lags 0-20) panels for the simulated AR(1) series with $\phi = 0.9, -0.9, 0.5, -0.5$.]

Figure 6.7: ACF and PACF of the AR(1) process $x_t = 0.5 x_{t-1} + z_t$.


[Figure appears here: sample PACF (lags 0-20) of a simulated AR(2) series.]

Figure 6.8: The PACF for the AR(2) process $x_t - 0.7 x_{t-1} + 0.1 x_{t-2} = z_t$.

[Figure appears here: a simulated MA(1) sample path and its ACF and PACF (lags 0-15).]

Figure 6.9: ACF and PACF of the MA(1) process $x_t = z_t + 0.9 z_{t-1}$.


This is an AR($\infty$) representation ($p = \infty$), so the PACF will never cut off the way
it does for an AR(p) with finite $p$.

The PACF of MA models behaves like the ACF of AR models, and the PACF of AR
models behaves like the ACF of MA models.

It can be shown that the PACF of MA(1) is
$$\phi_{hh} = -\frac{(-\theta)^h (1 - \theta^2)}{1 - \theta^{2(h+1)}}, \quad h \geq 1.$$
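A quick sanity check of this MA(1) closed form $\phi_{hh} = -(-\theta)^h(1-\theta^2)/(1-\theta^{2(h+1)})$, written as a sketch with an assumed $\theta = 0.9$: at $h = 1$ it must reduce to $\rho(1) = \theta/(1+\theta^2)$, and for larger $h$ its magnitude decays geometrically without ever reaching zero, i.e. the PACF tails off:

```python
# Sketch (assumed theta = 0.9): evaluate the MA(1) PACF closed form
# phi_hh = -(-theta)**h * (1 - theta**2) / (1 - theta**(2*(h+1))).
theta = 0.9

def pacf_ma1(h, theta):
    return -((-theta) ** h) * (1 - theta ** 2) / (1 - theta ** (2 * (h + 1)))

vals = [pacf_ma1(h, theta) for h in range(1, 7)]
print([round(v, 3) for v in vals])
# Lag 1 agrees with rho(1) = theta / (1 + theta**2); the magnitudes
# shrink with h but never hit zero: the PACF of MA(1) tails off.
```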

**Remark 6.7.** The PACF of ARMA(p,q)

An invertible ARMA model has an infinite AR representation, hence the PACF
will not cut off.

The following table summarizes the behaviour of the ACF and PACF of the causal and
invertible ARMA models (see Shumway and Stoffer (2000)).

|      | AR(p) | MA(q) | ARMA(p,q) |
|------|-------|-------|-----------|
| ACF  | Tails off | Cuts off after lag $q$ | Tails off |
| PACF | Cuts off after lag $p$ | Tails off | Tails off |
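The MA(q) column of the table can be illustrated by simulation. The sketch below (an illustration with an assumed MA(2) model, $\theta_1 = 0.8$, $\theta_2 = 0.6$; the Durbin-Levinson recursion used for the PACF is not derived in this section) shows the sample ACF essentially vanishing beyond lag 2 while the sample PACF at lag 3 remains clearly nonzero:

```python
import random

# Sketch (assumed MA(2), theta1 = 0.8, theta2 = 0.6): the sample ACF cuts
# off after lag q = 2 while the sample PACF keeps tailing off.
random.seed(11)
n = 200_000
z = [random.gauss(0.0, 1.0) for _ in range(n + 2)]
x = [z[t] + 0.8 * z[t - 1] + 0.6 * z[t - 2] for t in range(2, n + 2)]

def sample_rho(series, maxlag):
    """Sample autocorrelations rho(0), ..., rho(maxlag)."""
    m = sum(series) / len(series)
    c0 = sum((v - m) ** 2 for v in series)
    out = [1.0]
    for h in range(1, maxlag + 1):
        ch = sum((series[t] - m) * (series[t + h] - m)
                 for t in range(len(series) - h))
        out.append(ch / c0)
    return out

def pacf(rho):
    """Partial autocorrelations from rho(0..H) via the Durbin-Levinson recursion."""
    prev = [rho[1]]
    out = [rho[1]]
    for k in range(2, len(rho)):
        num = rho[k] - sum(prev[j] * rho[k - 1 - j] for j in range(k - 1))
        den = 1.0 - sum(prev[j] * rho[j + 1] for j in range(k - 1))
        a = num / den
        prev = [prev[j] - a * prev[k - 2 - j] for j in range(k - 1)] + [a]
        out.append(a)
    return out

rho = sample_rho(x, 4)
p = pacf(rho)
print([round(v, 2) for v in rho[1:]])  # rho(3), rho(4) near zero: ACF cuts off
print([round(v, 2) for v in p])        # PACF at lag 3 still clearly nonzero
```

This is exactly the pattern one uses in practice: a sharp ACF cut-off with a tailing-off PACF points to an MA model, and the reverse pattern points to an AR model.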