
FORECASTING TIME SERIES WITH ARMA AND ARIMA MODELS

THE BOX-JENKINS METHODOLOGY

Univariate time series models


ARMA models of stationary time series
An AR process, and its properties
An MA process, and its properties
Models in which we attempt to predict a single time series using its past values
Used for short-term forecasting
Atheoretical, unlike structural models

A white noise process


Throughout these notes, the disturbance (or error) term ε_t is a white noise process (for all t).
By definition, if ε_t is white noise, then for all t:

E(ε_t) = 0
Var(ε_t) = σ²
Cov(ε_t, ε_{t-s}) = 0 for s ≠ 0

Consider first only stationary time series
See previous notes for definition of a stationary
time series
Two basic building blocks for ARMA modelling
and forecasting:
the sample autocorrelation function (ACF)
the sample partial autocorrelation function
(PACF)

SAMPLE AUTOCORRELATION FUNCTION (ACF, also known as correlogram)
r_k = Corr(Y_t, Y_{t-k}) = Cov(Y_t, Y_{t-k}) / √[Var(Y_t) · Var(Y_{t-k})],   k = 0, 1, 2, ...

We want to obtain r_k for k = 1, 2, 3, ...
This is easily done in Microfit using the COR command, or in
GiveWin using "Tools" then "Graphics" then via the "Time
Series" menu after selecting the variable.
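For readers working outside Microfit or GiveWin, here is a minimal sketch of the same calculation in Python (an assumption on my part: the statsmodels package, which is not the course software):

```python
# Sketch: sample ACF of a series, analogous to Microfit's COR command
# or GiveWin's Time Series graphics. Assumes statsmodels is installed.
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(0)      # illustrative seed
u = rng.standard_normal(400)        # 400 draws from N(0, 1): white noise
r = acf(u, nlags=10)                # r[0] = 1; r[1..10] are the sample ACs
print(np.round(r[1:], 3))
```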

Sample ACF of a white noise process (series consists of 400 independent drawings from N(0,1) distribution)

[Figure: bar plot "ACF-u", lags 1 to 10.]

SAMPLE PARTIAL AUTOCORRELATION FUNCTION

The kth order estimated (or sample) PAC coefficient, denoted here φ̂_kk, is obtained as the estimate of the parameter φ_k in the kth order autoregression

Y_t = φ_0 + φ_1 Y_{t-1} + φ_2 Y_{t-2} + ... + φ_k Y_{t-k} + ε_t
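A minimal sketch of this definition in code (assuming numpy; the function name is illustrative):

```python
# Sketch: phi_hat_kk as the last OLS coefficient in a k-th order
# autoregression of y on a constant and its first k lags.
import numpy as np

def sample_pac(y, k):
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Columns: constant, y_{t-1}, ..., y_{t-k}, for t = k, ..., T-1
    X = np.column_stack([np.ones(T - k)] +
                        [y[k - j:T - j] for j in range( 1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, y[k:], rcond=None)
    return beta[-1]   # the estimate of phi_k, i.e. phi_hat_kk
```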

Sample PACF of a white noise series (GiveWin output)

[Figure: bar plot "PACF-u", lags 1 to 10.]

AUTOREGRESSIVE PROCESSES

If Y is an autoregressive process of order 1 [AR(1)], it can be written as

Y_t = φ_0 + φ_1 Y_{t-1} + ε_t,   t = 1, ..., T
ε_t ~ NID(0, σ²)

AUTOREGRESSIVE PROCESSES

If Y is an autoregressive process of order p [AR(p)], it can be written as

Y_t = φ_0 + φ_1 Y_{t-1} + ... + φ_p Y_{t-p} + ε_t,   t = 1, ..., T
ε_t ~ NID(0, σ²)

Y2, an AR(1) series:
Y2_t = 2 + 0.5 Y2_{t-1} + ε_t

[Figure: time plot of Y2, observations 1 to 400.]
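A minimal sketch of how such a series could be generated (numpy assumed; the seed and burn-in length are illustrative choices):

```python
# Sketch: simulate Y2_t = 2 + 0.5*Y2_{t-1} + eps_t, with eps_t ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(0)        # illustrative seed
T, burn = 400, 100                    # keep 400 observations after burn-in
eps = rng.standard_normal(T + burn)
y = np.empty(T + burn)
y[0] = 2 / (1 - 0.5)                  # start at the unconditional mean, 4
for t in range(1, T + burn):
    y[t] = 2 + 0.5 * y[t - 1] + eps[t]
y2 = y[burn:]                         # discard the burn-in draws
```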

MA(1) PROCESSES

If Y is a moving average process of order 1, it can be written as

Y_t = μ + ε_t + θ_1 ε_{t-1}

MA(q) PROCESSES

If Y is a moving average process of order q, it can be written as

Y_t = μ + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}

An MA(2) process

[Figure: time plot of the MA(2) series Y1, observations 1 to 400.]
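A sketch of generating an MA(2) series (numpy assumed; the coefficients shown match the Y1 process used later in these notes):

```python
# Sketch: simulate Y1_t = 2 + eps_t + 0.5*eps_{t-1} - 0.25*eps_{t-2}.
import numpy as np

rng = np.random.default_rng(1)     # illustrative seed
T = 400
eps = rng.standard_normal(T + 2)   # two extra draws for the lagged errors
y1 = 2 + eps[2:] + 0.5 * eps[1:-1] - 0.25 * eps[:-2]
```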

An ARMA process

Y as an ARMA(p,q) model:

Y_t = φ_1 Y_{t-1} + ... + φ_p Y_{t-p} + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}

An ARMA(p,q) model can, by suitable choice of p and q, mimic any stationary time series of interest.
This suggests that given a particular stationary time series, Y_t, if we can:
find the appropriate values of p and q (i.e. identify the model), and
estimate the parameters of the ARMA model from the past history of Y_t (i.e. estimate the model),
then the ARMA model can be used to predict future values of Y_t.

AUTOREGRESSIVE processes and the ACF and PACF

For a pure AR(p) process:
The sample autocorrelation function (ACF) tends to taper off gradually as k increases.
The sample partial autocorrelation function (PACF) will tend to show a cut-off point, corresponding to the fact that the sample partial autocorrelations are non-zero for k ≤ p, but approximately zero for k > p.

[Figure: sample ACF ("ACF-Y2") and PACF ("PACF-Y2") of Y2, the AR(1) series, lags 0 to 10: the ACF tapers off gradually; the PACF cuts off after lag 1.]

MA processes and the ACF and PACF

In terms of the estimated ACF and PACF functions, the distinguishing features of a pure MA(q) process are:
the ACF has a cut-off point, such that the estimated autocorrelations, r_k, are close to zero for all k > q;
the estimated PACF tends to taper off gradually as k increases.

[Figure: sample ACF ("ACF-Y1") and PACF ("PACF-Y1") of Y1, the MA(2) series, lags 0 to 10: the ACF cuts off after lag 2; the PACF tapers off gradually.]

Characteristics of ARMA models

Model         ACF                              PACF
White noise   All zero                         All zero
MA(1)         Zero after 1 lag                 Declines from 1st lag
MA(2)         Zero after 2 lags                Declines from 2nd lag
MA(q)         Zero after q lags                Declines from qth lag
AR(1)         Geometric decline from 1st lag   Zero after 1 lag
AR(2)         Geometric decline from 2nd lag   Zero after 2 lags
AR(p)         Geometric decline from pth lag   Zero after p lags
ARMA(1,1)     Geometric decline from 1st lag   Declines from 1st lag
ARMA(p,q)     Geometric decline from pth lag   Declines from qth lag
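A minimal sketch of inspecting these patterns for a given series (assuming statsmodels and matplotlib are available, rather than the course software):

```python
# Sketch: plot the sample ACF and PACF of a series to compare against
# the patterns tabulated above.
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

# Illustrative series: the AR(1) process Y2 from the earlier slides
rng = np.random.default_rng(0)
y = np.empty(400)
y[0] = 4.0
for t in range(1, 400):
    y[t] = 2 + 0.5 * y[t - 1] + rng.standard_normal()

fig, axes = plt.subplots(2, 1)
plot_acf(y, lags=10, ax=axes[0])    # expect gradual geometric decline
plot_pacf(y, lags=10, ax=axes[1])   # expect a cut-off after lag 1
plt.show()
```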

BOX 3: IDENTIFICATION OF ARMA MODELS: A SUMMARY OF PROPERTIES OF THE AC AND PAC FUNCTIONS
(Adapted from Enders, page 85)

White noise
  ACF: all ρ_k = 0
  PACF: all φ_kk = 0

AR(1), φ_1 > 0
  ACF: direct exponential decay: ρ_k = φ_1^k
  PACF: φ_11 = ρ_1; φ_kk = 0 for k ≥ 2

AR(1), φ_1 < 0
  ACF: oscillating decay: ρ_k = φ_1^k
  PACF: φ_11 = ρ_1; φ_kk = 0 for k ≥ 2

AR(p)
  ACF: decays towards zero; coefficients may oscillate
  PACF: spikes through until lag p, followed by a cut-off in the PAC function beyond lag p: φ_kk ≠ 0 for k ≤ p, φ_kk = 0 for k > p

MA(1), θ_1 < 0
  ACF: negative spike at lag 1; ρ_k = 0 for k ≥ 2
  PACF: exponential decay; φ_11 < 0

MA(1), θ_1 > 0
  ACF: positive spike at lag 1; ρ_k = 0 for k ≥ 2
  PACF: exponential decay; φ_11 > 0

MA(q)
  ACF: ρ_k ≠ 0 for k ≤ q, ρ_k = 0 for k > q, i.e. a cut-off in the ACF
  PACF: the φ_kk taper off

ARMA(1,1), φ_1 > 0
  ACF: exponential decay beginning at lag 1
  PACF: oscillating decay beginning at lag 1; φ_11 = ρ_1

ARMA(1,1), φ_1 < 0
  ACF: oscillating decay beginning at lag 1
  PACF: exponential decay beginning at lag 1; φ_11 = ρ_1

ARMA(p,q)
  ACF: decay (either direct or oscillatory) beginning at lag q
  PACF: decay (either direct or oscillatory) beginning at lag p, but no distinct cut-off point

I(1) or I(2) series
  ACF: ρ_k tapers off very slowly or not at all

TESTS ON PARTIAL AUTOCORRELATIONS

Tests of whether the PAC coefficient at some lag length k is statistically different from zero may be carried out in the following way. Let φ_kk denote the kth order PAC and φ̂_kk be its estimated counterpart. We wish to test the hypothesis

H0: φ_kk = 0   vs   Ha: φ_kk ≠ 0

We use the estimated (or sample) partial autocorrelation coefficients as proxies for the true partial autocorrelation coefficients, and exploit the (approximate) distributional result that, if the null hypothesis is true,

φ̂_kk ~ N(0, 1/T)

Thus an approximate 95% confidence interval for testing the significance of a partial autocorrelation coefficient is given by

φ̂_kk ± 2/√T
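A minimal sketch of this test in code (statsmodels assumed; the function name is illustrative):

```python
# Sketch: flag sample PACs lying outside the approximate 95% band
# +/- 2/sqrt(T), i.e. lags at which H0: phi_kk = 0 is rejected.
import numpy as np
from statsmodels.tsa.stattools import pacf

def significant_pac_lags(y, max_lag=10):
    T = len(y)
    band = 2.0 / np.sqrt(T)           # approximate 95% bound under H0
    phi = pacf(y, nlags=max_lag)      # phi[0] = 1 by convention
    return [k for k in range(1, max_lag + 1) if abs(phi[k]) > band]
```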

THE BOX JENKINS METHOD: NEXT STEPS

Two further steps are required:
1. Estimate the parameters of the ARMA model for the variable Y.
2. Obtain dynamic forecasts of Y from the estimated ARMA model.
Dynamic forecasts can be found recursively. For example, for an ARMA(2,2) model,

Y_{T+1} = φ_1 Y_T + φ_2 Y_{T-1} + ε_{T+1} + θ_1 ε_T + θ_2 ε_{T-1}

Our one period ahead forecast is the expectation of Y_{T+1} conditional upon the past history of Y, and is obtained as

Ŷ_{T+1} = φ_1 Y_T + φ_2 Y_{T-1} + θ_1 ε̂_T + θ_2 ε̂_{T-1}

since the conditional expectation of ε_{T+1} is zero.
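A minimal sketch of both steps in Python (an assumption: statsmodels rather than the course's PcGive):

```python
# Sketch: estimate an ARMA model for an observed series, then obtain
# dynamic forecasts; statsmodels performs the recursion internally.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Illustrative data: the AR(1) series Y2 from the earlier slides
rng = np.random.default_rng(0)
y = np.empty(400)
y[0] = 4.0
for t in range(1, 400):
    y[t] = 2 + 0.5 * y[t - 1] + rng.standard_normal()

# Step 1: estimate the parameters (order = (p, d=0, q))
res = ARIMA(y, order=(1, 0, 0)).fit()
print(res.summary())

# Step 2: dynamic forecasts for the next 10 periods
print(res.forecast(steps=10))
```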

AR(1) Model. True: Y2_t = 2 + 0.5 Y2_{t-1} + ε_t

Use the ARFIMA package within PcGive.

AR(1) Model. True: Y2_t = 2 + 0.5 Y2_{t-1} + ε_t

The PcGive ARFIMA package produces forecasts automatically; use the cursor to read off forecast values (go into Test after estimation).

PcGive estimation of the artificially generated MA(2) series, Y1:

Y1_t = 2 + 0.5 ε_{t-1} - 0.25 ε_{t-2} + ε_t

(Here I retained 10 observations from the full sample for forecasting.)

Forecasts

[Figure: Y1 with dynamic forecasts over the retained hold-out period, observations 380 to 400 shown.]
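A sketch of the same hold-out exercise outside PcGive (statsmodels assumed; Y1 simulated as in the earlier sketch):

```python
# Sketch: hold out the last 10 observations of an MA(2) series,
# estimate on the rest, and forecast the hold-out period.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
eps = rng.standard_normal(402)
y1 = 2 + eps[2:] + 0.5 * eps[1:-1] - 0.25 * eps[:-2]   # 400 observations

res = ARIMA(y1[:390], order=(0, 0, 2)).fit()   # estimate on the first 390
print(res.forecast(steps=10))                  # forecast observations 391-400
```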

Our objective is to obtain forecasts of Y for up to k periods ahead. Let T denote the current period, let the one period ahead forecast value be Ŷ_{T+1}, and the i period ahead forecast (for i = 1, ..., k) be Ŷ_{T+i}.

The forecast of Y_{T+2} is obtained as

Ŷ_{T+2} = φ_1 Ŷ_{T+1} + φ_2 Y_T + θ_2 ε̂_T

in which we use the previous forecast of Y_{T+1} recursively as an explanatory variable.
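A sketch of this recursion for the ARMA(2,2) example (plain numpy; the parameter values are assumed to have been estimated already, and the function name is illustrative):

```python
# Sketch: k-step dynamic forecasts for an ARMA(2,2), recursing on
# previous forecasts; future errors are set to their expectation, zero.
import numpy as np

def arma22_forecast(y, e, phi1, phi2, theta1, theta2, k):
    """y: observed series; e: in-sample residuals; returns k forecasts."""
    f = list(y[-2:])     # last two observed values, Y_{T-1} and Y_T
    eh = list(e[-2:])    # last two residual estimates
    out = []
    for i in range(k):
        yhat = phi1 * f[-1] + phi2 * f[-2] + theta1 * eh[-1] + theta2 * eh[-2]
        out.append(yhat)
        f.append(yhat)   # the forecast feeds back in recursively
        eh.append(0.0)   # E[eps_{T+i}] = 0 for future periods
    return np.array(out)
```

At i = 2 the residual terms ε̂_{T+1} and the error ε_{T+2} drop out in expectation, leaving exactly the Ŷ_{T+2} expression above.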

NON-STATIONARY TIME SERIES: ARIMA MODELS
What do we do if Y is non-stationary, but can be
made stationary by differencing one or more times?
Answer: Difference the series until it is stationary.
(This will usually just require first differencing.)
Call the differenced variable Z.
Then do ARMA modelling/forecasting on Z.
Finally, integrate (accumulate) the Z forecasts to
obtain forecasts for Y itself.
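A sketch of the difference, forecast, and integrate steps (statsmodels assumed; the random walk with drift is purely illustrative):

```python
# Sketch: difference a non-stationary Y to get a stationary Z, forecast Z
# with an ARMA model, then accumulate to recover forecasts of Y in levels.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
y = np.cumsum(0.1 + rng.standard_normal(400))   # illustrative I(1) series

z = np.diff(y)                                  # first difference: stationary
res = ARIMA(z, order=(1, 0, 0)).fit()           # ARMA modelling on Z
z_fc = res.forecast(steps=10)

y_fc = y[-1] + np.cumsum(z_fc)                  # integrate Z forecasts into Y
```

Broadly the same result can be had by letting statsmodels difference internally, i.e. fitting ARIMA(y, order=(1, 1, 0)) to Y directly, which returns forecasts in levels.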
