
ECON30402 Time Series Econometrics

Outline Solutions for 2013 Exam


1.
y_t = μ + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2}                                  (1)

(a)

i. Mean:

   E[y_t] = μ

   Variance:

   var[y_t] = E[y_t - E(y_t)]^2
            = E[ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2}]^2
            = (1 + θ_1^2 + θ_2^2) E[ε_{t-s}^2]    as ε_t is white noise
            = (1 + θ_1^2 + θ_2^2) σ^2

   Autocovariance:

   cov[y_t, y_{t-j}] = E[y_t - μ][y_{t-j} - μ]
                     = E[ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2}][ε_{t-j} + θ_1 ε_{t-j-1} + θ_2 ε_{t-j-2}]

                     = (θ_1 + θ_1 θ_2) σ^2    j = 1
                     = θ_2 σ^2                j = 2
                     = 0                      j > 2

   again using the white noise property of ε_t.


ii. Autocorrelations:

    cor[y_t, y_{t-j}] = cov[y_t, y_{t-j}] / var[y_t]

                      = (θ_1 + θ_1 θ_2) / (1 + θ_1^2 + θ_2^2)    j = 1
                      = θ_2 / (1 + θ_1^2 + θ_2^2)                j = 2
                      = 0                                        j > 2
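
    As a quick numerical cross-check of these formulas (not part of the required
    answer), the following Python sketch compares the theoretical autocorrelations
    with the sample ACF of a long simulated MA(2) series; the values of θ_1 and θ_2
    are illustrative assumptions.

import numpy as np

# Theoretical ACF of y_t = mu + e_t + th1*e_{t-1} + th2*e_{t-2},
# using the formulas derived above (th1, th2 are illustrative values).
th1, th2 = 0.5, 0.3
gamma0 = 1 + th1**2 + th2**2          # variance divided by sigma^2
rho1 = (th1 + th1 * th2) / gamma0     # autocorrelation at lag 1
rho2 = th2 / gamma0                   # autocorrelation at lag 2

# Compare with the sample ACF of a long simulated series.
rng = np.random.default_rng(0)
e = rng.standard_normal(200_000)
y = e[2:] + th1 * e[1:-1] + th2 * e[:-2]
y = y - y.mean()
sample_rho = [np.dot(y[:-j], y[j:]) / np.dot(y, y) for j in (1, 2, 3)]

print(f"theory : rho1={rho1:.3f}, rho2={rho2:.3f}, rho3=0")
print("sample : " + ", ".join(f"rho{j}={r:.3f}" for j, r in zip((1, 2, 3), sample_rho)))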

iii. See notes on stationarity.


iv. The MA(∞) representation is given by

    y_t = μ + Σ_{s=0}^∞ ψ_s ε_{t-s}                                        (2)

    A. The absolute summability condition states that

       Σ_{s=0}^∞ |ψ_s| < ∞

    B. For ψ_1 = φ_1 + θ_1 and ψ_s = φ_1 ψ_{s-1}, s > 1, we have

       y_t         = μ + ε_t + ψ_1 ε_{t-1} + φ_1 ψ_1 ε_{t-2} + φ_1^2 ψ_1 ε_{t-3} + ...
       φ_1 y_{t-1} = φ_1 μ + φ_1 ε_{t-1} + φ_1 ψ_1 ε_{t-2} + φ_1^2 ψ_1 ε_{t-3} + ...

       so that

       y_t - φ_1 y_{t-1} = μ(1 - φ_1) + ε_t + (ψ_1 - φ_1) ε_{t-1}
                         = μ(1 - φ_1) + ε_t + θ_1 ε_{t-1}

       as required.
    C. Substituting repeatedly, ψ_s = φ_1 ψ_{s-1} implies

       ψ_s = φ_1^{s-1} ψ_1,    s > 1

    D. Therefore

       Σ_{s=0}^∞ |ψ_s| = 1 + Σ_{s=1}^∞ |φ_1^{s-1} ψ_1|
                       = 1 + |ψ_1| Σ_{s=1}^∞ |φ_1|^{s-1}

       which has a finite sum only if |φ_1| < 1. Consequently, the stationarity
       condition for the ARMA(1,1) is |φ_1| < 1.
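
       The convergence argument in D is easy to see numerically. A minimal Python
       sketch of the ψ-weight recursion (the parameter values are illustrative
       assumptions):

import numpy as np

def arma11_psi_weights(phi1: float, theta1: float, n: int = 50) -> np.ndarray:
    """MA(inf) weights of an ARMA(1,1): psi_0 = 1, psi_1 = phi1 + theta1,
    psi_s = phi1 * psi_{s-1} for s > 1 (the recursion derived above)."""
    psi = np.empty(n)
    psi[0] = 1.0
    psi[1] = phi1 + theta1
    for s in range(2, n):
        psi[s] = phi1 * psi[s - 1]
    return psi

# Partial sums of |psi_s|: bounded when |phi1| < 1, growing without limit otherwise.
for phi1 in (0.5, 0.9, 1.0):
    psi = arma11_psi_weights(phi1, theta1=0.4, n=200)
    print(f"phi1={phi1:.1f}: sum of first 200 |psi_s| = {np.abs(psi).sum():.2f}")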
(b) Seasonal model

    y_t = α_0 + α_1 D1_t + α_2 D2_t + α_3 D3_t + φ_4 y_{t-4} + ε_t         (3)

    where Dj_t is a dummy variable equal to one in quarter j and zero otherwise.

    i. The quarterly dummy variables imply that a different intercept applies to each
       quarter of the year, specifically (α_0 + α_1), (α_0 + α_2), (α_0 + α_3), and α_0
       for quarters 1, 2, 3 and 4 respectively. The model therefore allows a different
       E[y_t | q] for each quarter q = 1, 2, 3, 4.
    ii. Prediction.

        A. Applying the model with period t in the fourth quarter (so that t+1 falls in
           quarter 1), the forecasts are built up recursively; see the numerical sketch
           after part C:

           ŷ_{t+1} = α_0 + α_1 + φ_4 y_{t-3}
           ŷ_{t+2} = α_0 + α_2 + φ_4 y_{t-2}
           ŷ_{t+3} = α_0 + α_3 + φ_4 y_{t-1}
           ŷ_{t+4} = α_0 + φ_4 y_t
           ŷ_{t+5} = α_0 + α_1 + φ_4 ŷ_{t+1}

        B. By comparing these expressions with (3), it is clear that

           MSE(i) = E[y_{t+i} - ŷ_{t+i}]^2 = E[ε_{t+i}^2] = σ^2,    i = 1, 2, 3, 4

           For a horizon of i = 5,

           MSE(5) = E[y_{t+5} - ŷ_{t+5}]^2 = E[ε_{t+5} + φ_4 ε_{t+1}]^2 = σ^2 (1 + φ_4^2)

        C. As the horizon i → ∞, the prediction for each quarter approaches the
           respective unconditional mean E[y_t | q] for q = 1, 2, 3, 4.
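
        A minimal Python sketch of the recursive forecasts in A. All data and
        parameter values are illustrative assumptions (φ_4 = 0.68 is borrowed from
        part iv below):

import numpy as np

def seasonal_ar_forecast(y_last4, alpha, phi4, horizon):
    """Recursive forecasts from y_t = a0 + a1*D1 + a2*D2 + a3*D3 + phi4*y_{t-4} + e_t,
    where y_last4 = [y_{t-3}, y_{t-2}, y_{t-1}, y_t] and period t is a fourth quarter.
    alpha = (a0, a1, a2, a3)."""
    a0, a1, a2, a3 = alpha
    intercept = {1: a0 + a1, 2: a0 + a2, 3: a0 + a3, 4: a0}  # quarter -> intercept
    history = list(y_last4)
    forecasts = []
    for i in range(1, horizon + 1):
        quarter = (i - 1) % 4 + 1          # t is Q4, so t+1 is Q1, t+2 is Q2, ...
        yhat = intercept[quarter] + phi4 * history[-4]
        forecasts.append(yhat)
        history.append(yhat)               # forecasts feed later steps, e.g. yhat_{t+5}
    return forecasts

print(seasonal_ar_forecast([0.1, 0.2, 0.3, 1.5], alpha=(0.5, -1.0, -0.3, -0.2),
                           phi4=0.68, horizon=8))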
    iii. This is bookwork. Students should state that the model (3) is estimated and
         the residuals e_t retrieved. Then the auxiliary regression

         y_t = α_0 + α_1 D1_t + α_2 D2_t + α_3 D3_t + φ_4 y_{t-4}
               + ρ_1 e_{t-1} + ρ_2 e_{t-2} + ρ_3 e_{t-3} + ρ_4 e_{t-4} + u_t

         is estimated. [Students may equivalently use e_t as the dependent variable
         here.] The null and alternative hypotheses are

         H_0: ρ_1 = ρ_2 = ρ_3 = ρ_4 = 0
         H_A: at least one ρ_j ≠ 0, j = 1, ..., 4

         and the test is implemented as a standard F-test (asymptotically valid). The
         model is designed to capture seasonality in quarterly data; four lags of the
         residuals provide information about whether the whole pattern, including
         seasonality, has been captured.
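
         A sketch of this auxiliary-regression test in Python using statsmodels. The
         function and its inputs are hypothetical stand-ins (y is the dependent
         variable, X the regressor matrix from model (3) including the constant,
         dummies and y_{t-4}); zero-padding the initial lagged residuals is one
         common convention, not the only one:

import numpy as np
import statsmodels.api as sm

def lm_serial_correlation_test(y, X, nlags=4):
    """Regress y on the original regressors X plus nlags lagged residuals,
    then F-test that all lagged-residual coefficients are zero."""
    e = sm.OLS(y, X).fit().resid
    # Lagged residual columns; initial values padded with zeros.
    lags = np.column_stack([np.concatenate([np.zeros(j), e[:-j]])
                            for j in range(1, nlags + 1)])
    aux = sm.OLS(y, np.column_stack([X, lags])).fit()
    k = X.shape[1]
    R = np.zeros((nlags, k + nlags))
    R[:, k:] = np.eye(nlags)            # restrictions: rho_1 = ... = rho_nlags = 0
    return aux.f_test(R)                # asymptotically valid F-test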
    iv. Application.

        A. The graph indicates that the series DTCF is highly seasonal, with a
           persistent pattern in which the fourth quarter shows the highest growth of
           sales, followed by a large drop in the first quarter and typically small
           positive values in the second and third quarters. This provides evidence
           that the means vary over the quarters; otherwise the series appears to be
           stationary, with the overall level approximately constant and no clear-cut
           change in variance over time.
           The correlogram after removal of the seasonal means continues to show
           evidence of seasonality, apparently of a seasonal AR form, with roughly
           geometric decline at lags 4, 8, 12, 16; these are generally significant
           according to the white noise confidence bands. Some serial correlation is
           also suggested at nonseasonal lags, especially lags 1 and 3.

        B. The estimated model reproduces the characteristics of the graph, with
           significant and negative seasonal dummy variable coefficients for quarters
           1, 2 and 3, compared to the base of quarter 4. In line with the correlogram,
           there is significant stochastic seasonality, with φ̂_4 = 0.68, positive as
           anticipated. The residual serial correlation test shows no significant
           autocorrelation at conventional levels of significance (p-value 0.11).
           Despite this, the model cannot capture nonseasonal dynamics, which the
           correlogram discussed in A indicated may be present. Therefore, although
           the serial correlation test is satisfactory, it may be worth investigating
           whether the addition of nonseasonal AR or MA term(s) improves the model.
(c) Bivariate VAR(1)

    y_t = μ + Φ_1 y_{t-1} + ε_t                                            (4)

    i. VARMA representation.

       A. Write the system as

          [I_2 - Φ_1 L] y_t = μ + ε_t

          where

          I_2 - Φ_1 L = [ 1 - φ_11 L     -φ_12 L    ]
                        [ -φ_21 L        1 - φ_22 L ]

          Assuming stationarity, Φ(L) = I_2 - Φ_1 L is nonsingular, and

          [Φ(L)]^{-1} = (1 / |Φ(L)|) [ 1 - φ_22 L    φ_12 L     ]
                                     [ φ_21 L        1 - φ_11 L ]

          Multiplying through by the determinant |Φ(L)|, this becomes

          |Φ(L)| y_t = [ 1 - φ_22    φ_12     ] [ μ_1 ]   [ 1 - φ_22 L    φ_12 L     ] [ ε_1t ]
                       [ φ_21        1 - φ_11 ] [ μ_2 ] + [ φ_21 L        1 - φ_11 L ] [ ε_2t ]

          where

          |Φ(L)| = (1 - φ_11 L)(1 - φ_22 L) - φ_12 φ_21 L^2

          This is the VARMA representation: each element of y_t follows an ARMA
          process with the common AR polynomial |Φ(L)| and an MA component on the
          right-hand side.

       B. A univariate ARMA process φ(L) y_t = μ + θ(L) ε_t is stationary if all roots
          of the characteristic equation corresponding to φ(L) are less than one in
          absolute value. In the VARMA representation all elements of y_t have the
          same AR polynomial |Φ(L)|, and the VAR is stationary if all characteristic
          roots corresponding to |Φ(L)| are less than one in absolute value.
    ii. Specific case with

        Φ_1 = [ 1.0    -0.2 ]
              [ 0.3     0.5 ]

        A. Then

           |Φ(L)| = | 1 - L     0.2L     |
                    | -0.3L     1 - 0.5L |

                  = (1 - L)(1 - 0.5L) + 0.06 L^2
                  = 1 - 1.5L + 0.56 L^2

           with characteristic equation

           λ^2 - 1.5λ + 0.56 = (λ - 0.8)(λ - 0.7) = 0

           and roots λ_1 = 0.8, λ_2 = 0.7. The VAR is stationary as both |λ_i| < 1,
           i = 1, 2.
        B. The required impulse response functions are given by the VMA coefficient
           matrices Ψ_s for s = 0, 1, 2, where Ψ_0 = I_2, Ψ_1 = Φ_1 and Ψ_2 = Φ_1^2.
           We need to compute

           Φ_1^2 = [ 1.0    -0.2 ] [ 1.0    -0.2 ]   [ 0.94    -0.30 ]
                   [ 0.3     0.5 ] [ 0.3     0.5 ] = [ 0.45     0.19 ]

           The impulse responses for the effects of a unit shock to y_1t are, for
           s = 0, 1, 2:

           Effect on y_{1,t+s}:   1, 1.0, 0.94
           Effect on y_{2,t+s}:   0, 0.3, 0.45

           The impulse responses for the effects of a unit shock to y_2t are, for
           s = 0, 1, 2:

           Effect on y_{1,t+s}:   0, -0.2, -0.30
           Effect on y_{2,t+s}:   1, 0.5, 0.19
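
           These calculations are easy to reproduce with NumPy; the following sketch
           checks the characteristic roots and the Ψ_s matrices (non-orthogonalised
           responses):

import numpy as np

# VAR(1) from part (c)ii: Phi_1 as reconstructed above.
Phi1 = np.array([[1.0, -0.2],
                 [0.3,  0.5]])

# Stationarity check: the characteristic roots are the eigenvalues of Phi_1.
print("roots:", np.linalg.eigvals(Phi1))          # expect 0.8 and 0.7

# Non-orthogonalised impulse responses: Psi_s = Phi_1^s, s = 0, 1, 2.
for s in range(3):
    P = np.linalg.matrix_power(Phi1, s)
    print(f"Psi_{s} =\n{P}")   # column j gives the response to a unit shock to y_j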

iii. See notes on Cholesky Method.


    iv. A. The process is the random walk with drift y_t = μ + y_{t-1} + ε_t.
           Repeated substitution yields

           y_t = μ + y_{t-1} + ε_t
               = (y_{t-2} + μ + ε_{t-1}) + μ + ε_t
               = y_{t-2} + 2μ + ε_t + ε_{t-1}
               = (y_{t-3} + μ + ε_{t-2}) + 2μ + ε_t + ε_{t-1}
               = y_{t-3} + 3μ + ε_t + ε_{t-1} + ε_{t-2}
               = y_0 + tμ + Σ_{i=1}^t ε_i

        B. From the expression in A we have

           E[y_t] = y_0 + tμ

           var[y_t] = E[y_t - E(y_t)]^2
                    = E[ Σ_{i=1}^t ε_i ]^2
                    = Σ_{i=1}^t E[ε_i^2]    as E[ε_i ε_j] = 0, i ≠ j
                    = t σ^2

           cov[y_t, y_{t-j}] = E[y_t - E(y_t)][y_{t-j} - E(y_{t-j})]
                             = E[ Σ_{i=1}^t ε_i ][ Σ_{s=1}^{t-j} ε_s ]
                             = E[ε_t + ε_{t-1} + ... + ε_1][ε_{t-j} + ε_{t-j-1} + ... + ε_1]
                             = (t - j) σ^2

           since ε_t is white noise.

        C. From B,

           cor[y_t, y_{t-j}] = cov[y_t, y_{t-j}] / sqrt( var[y_t] var[y_{t-j}] )
                             = (t - j) σ^2 / sqrt( t σ^2 (t - j) σ^2 )
                             = sqrt( (t - j)/t )
                             → 1 as t → ∞, for all finite j > 0
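
           A Monte Carlo sketch in Python of the results in B and C (the drift, σ,
           t, j and the replication count are illustrative assumptions):

import numpy as np

# Check var[y_t] = t*sigma^2 and cor[y_t, y_{t-j}] = sqrt((t-j)/t) by simulation.
rng = np.random.default_rng(1)
mu, sigma, t, j, nrep = 0.1, 1.0, 400, 5, 20_000

eps = rng.normal(0.0, sigma, size=(nrep, t))
y = np.cumsum(mu + eps, axis=1)            # y_t - y_0 across nrep replications

yt, ytj = y[:, t - 1], y[:, t - j - 1]
print("var[y_t] : sim", yt.var().round(1), " theory", t * sigma**2)
print("cor      : sim", np.corrcoef(yt, ytj)[0, 1].round(4),
      " theory", np.sqrt((t - j) / t).round(4))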

D.
        E. Cointegration exists when there is a linear combination

           z_1t = y_1t - β_2 y_2t - β_3 y_3t ~ I(0)

        F. For three series, there could be two distinct linear combinations which
           are I(0), such as

           z_1t = y_1t - β_2 y_2t
           z_2t = y_1t - β_3 y_3t

        G. For three variables that are each I(1), the Engle-Granger test examines
           whether there exists a linear combination of them which is stationary. The
           intuition is that if such a linear combination exists, then the residuals
           from a regression such as

           y_3t = δ_0 + δ_1 t + β_1 y_1t + β_2 y_2t + u_3t

           should be stationary. A trend is included in the cointegrating regression,
           as in the EViews output, to take account of linear trends in the individual
           series. Denoting the residuals as e_t, an ADF test is applied to them as

           Δe_t = ρ e_{t-1} + Σ_{j=1}^p γ_j Δe_{t-j} + v_t

           and the hypotheses tested are

           H_0: ρ = 0
           H_A: ρ < 0

           The augmentation order p should be sufficient to account for serial
           correlation in the residuals; no deterministic terms are required, since
           these are taken account of through the initial regression.
           For the particular case, the ADF test statistic is -3.529. Using the
           MacKinnon critical values for k = 3 regressors and with a trend in the
           original equation, this statistic is not significant even at the 10% level.
           Therefore, the null hypothesis is not rejected and it is concluded that the
           evidence does not support cointegration between these series.
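
           For reference, a sketch of the Engle-Granger procedure in Python using
           statsmodels' coint function; the three random walks are simulated
           placeholders for the actual series, so the test should (correctly) fail
           to find cointegration here:

import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(2)
n = 200
y1 = np.cumsum(rng.standard_normal(n))     # three independent random walks:
y2 = np.cumsum(rng.standard_normal(n))     # I(1) and not cointegrated
y3 = np.cumsum(rng.standard_normal(n))     # by construction

# trend='ct' puts a constant and linear trend in the cointegrating regression,
# matching the specification above; critical values are MacKinnon's.
t_stat, p_value, crit = coint(y3, np.column_stack([y1, y2]), trend="ct")
print(f"ADF t-stat = {t_stat:.3f}, p-value = {p_value:.3f}")
print("1%/5%/10% critical values:", crit)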
