
Review: Stationary Time Series Models

White noise process, covariance stationary process, AR(p), MA(q), ARMA and ARIMA processes, stationarity conditions, diagnostic checks.
White noise process

A sequence {x_t} is a white noise process, x_t ~ i.i.d. N(0, σ²), if each value in the sequence has

1. zero mean: E(x_t) = E(x_{t-1}) = ... = 0
2. constant variance: E(x_t²) = E(x²_{t-1}) = ... = σ² = Var(x_{t-i})
3. zero correlation with all other realizations: E(x_t x_{t-s}) = E(x_{t-j} x_{t-j-s}) = ... = 0 = Cov(x_{t-j}, x_{t-j-s})

Properties 1 & 3: absence of serial correlation or predictability. Property 2: conditional homoscedasticity (constant conditional variance).

Covariance stationarity (weak stationarity)

A sequence {x_t} is covariance stationary if the mean, variance and autocovariances do not grow over time, i.e. it has

1. finite mean: E(x_t) = E(x_{t-1}) = ... = μ
2. finite variance: E[(x_t - μ)²] = E[(x_{t-1} - μ)²] = ... = σ_x² = Var(x)
3. finite autocovariances: E[(x_t - μ)(x_{t-s} - μ)] = E[(x_{t-j} - μ)(x_{t-j-s} - μ)] = ... = γ_s = Cov(x_{t-j}, x_{t-j-s})

Ex: the autocorrelation between x_t and x_{t-s} is r_s = γ_s / γ_0.

But a white noise process cannot capture macro variables characterized by persistence, so we need AR and MA features.

Application: plot wn and lyus. Program: whitenoise.prg; workfile: usuk.wf, page 1. The program generates and plots the log of US real disposable income and a white noise process WN based on the sample mean and standard deviation of log US income (so WN is normal with mean 8.03 and SD 0.36; nrnd is a standard normal random draw):

lyus=log(uspdispid)
WN=8.03+0.36*nrnd
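For readers working outside EViews, the same exercise is a two-liner in Python. A minimal numpy sketch; the sample length and seed are assumptions, and lyus itself would come from your own data file:

import numpy as np

rng = np.random.default_rng(0)
T = 160                                      # e.g. quarterly data, 1960-1999
wn = 8.03 + 0.36 * rng.standard_normal(T)    # WN = 8.03 + 0.36*nrnd
print(wn.mean(), wn.std())                   # close to 8.03 and 0.36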

[Figure: line plot of WN and LYUS, 1960-1995]

AR(1): x_t = ρ x_{t-1} + e_t, e_t ~ i.i.d.(0, σ²)   (random walk: ρ = 1)
MA(1): x_t = e_t + θ e_{t-1}

More generally:
AR(p): x_t = φ_1 x_{t-1} + φ_2 x_{t-2} + ... + φ_p x_{t-p} + e_t
MA(q): x_t = e_t + θ_1 e_{t-1} + θ_2 e_{t-2} + ... + θ_q e_{t-q}
ARMA(p,q): x_t = φ_1 x_{t-1} + ... + φ_p x_{t-p} + e_t + θ_1 e_{t-1} + ... + θ_q e_{t-q}

Using the lag operator:
AR(1): (1 - ρL) x_t = e_t
MA(1): x_t = (1 + θL) e_t
AR(p): (1 - φ_1 L - φ_2 L² - ... - φ_p L^p) x_t = φ(L) x_t = e_t
MA(q): x_t = (1 + θ_1 L + θ_2 L² + ... + θ_q L^q) e_t = θ(L) e_t
ARMA(p,q): a(L) x_t = b(L) e_t
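These lag polynomials map directly onto statsmodels' ArmaProcess, for readers who want to experiment in Python rather than EViews. A minimal sketch; the coefficient values are illustrative, and note that statsmodels expects the full polynomials a(L) and b(L), so the AR coefficients enter with a minus sign:

import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

ar = np.array([1, -0.5])        # a(L) = 1 - 0.5L   -> AR part with phi = 0.5
ma = np.array([1, 0.4])         # b(L) = 1 + 0.4L   -> MA part with theta = 0.4
proc = ArmaProcess(ar, ma)      # ARMA(1,1): (1 - 0.5L) x_t = (1 + 0.4L) e_t
print(proc.isstationary, proc.isinvertible)
x = proc.generate_sample(nsample=200)   # one simulated path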

1. AR process

Stationarity conditions for an AR(1) process: (1 - ρL) x_t = e_t with φ(L) = 1 - ρL, and substituting z for L: φ(z) = 1 - ρz. The process is stable if φ(z) ≠ 0 for all numbers satisfying |z| ≤ 1, i.e. if |ρ| < 1. Then we can write

x_t = (1 - ρL)^(-1) e_t = Σ_{i=0}^∞ ρ^i e_{t-i}

If x is stable, it is covariance stationary:

1. E(x_t) = μ or 0 — finite
2. Var(x) = E[x_t²] = E[(Σ_{i=0}^∞ ρ^i e_{t-i})²] = σ_e² Σ_{i=0}^∞ ρ^{2i} = σ_e²/(1 - ρ²) — finite
3. covariances:
γ_1 = E(x_t x_{t-1}) = E[(ρ x_{t-1} + e_t) x_{t-1}] = ρ σ_x²
γ_2 = E(x_t x_{t-2}) = E[(ρ(ρ x_{t-2} + e_{t-1}) + e_t) x_{t-2}] = ρ² E(x²_{t-2}) = ρ² σ_x²
γ_s = E(x_t x_{t-s}) = ρ^s σ_x² = ρ^s Var(x)

Autocorrelations between x_t and x_{t-s}: r_s = γ_s/γ_0 = ρ^s.

The plot of r_s against s is the autocorrelation function (ACF) or correlogram. For a stationary series, the ACF should converge to 0: lim_{s→∞} r_s = 0 if |ρ| < 1.
ρ > 0: direct convergence; ρ < 0: dampened oscillatory path around 0.
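A quick numerical check of r_s = ρ^s, as a Python sketch (assuming numpy and statsmodels; sample size and seed are arbitrary):

import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(1)
rho, T = 0.5, 5000
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.standard_normal()   # AR(1) recursion

print(np.round(acf(x, nlags=5), 3))      # sample ACF
print(rho ** np.arange(6))               # theoretical ACF: 1, .5, .25, ...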

Partial autocorrelation (PAC) — Ref: Enders Ch. 2

In AR(p) processes all x's are correlated even if they don't appear in the regression equation. Ex: AR(1), where γ_s = ρ^s Var(x_t):

r_1 = γ_1/γ_0 = ρ;  r_2 = γ_2/γ_0 = ρ² = r_1²;  r_3 = γ_3/γ_0 = ρ³ = r_2 r_1

We want to see the direct autocorrelation between x_{t-j} and x_t by controlling for all x's between the two. For this, construct the demeaned series x*_t = x_t - μ and form regressions to get the PACs from the ACs.

1st PAC: x*_t = φ_11 x*_{t-1} + e_t, with φ_11 = r_1

2nd PAC: x*_t = φ_21 x*_{t-1} + φ_22 x*_{t-2} + e_t, with

φ_22 = (r_2 - r_1²)/(1 - r_1²)

In general, for s ≥ 3, the sth PAC is

φ_ss = (r_s - Σ_{j=1}^{s-1} φ_{s-1,j} r_{s-j}) / (1 - Σ_{j=1}^{s-1} φ_{s-1,j} r_j)

with φ_sj = φ_{s-1,j} - φ_ss φ_{s-1,s-j}.

Ex: for s = 3,

φ_33 = (r_3 - (φ_21 r_2 + φ_22 r_1)) / (1 - (φ_21 r_1 + φ_22 r_2)).
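The recursion above translates directly into code. A minimal Python sketch (the function name and the AR(1) test values are mine, not from the notes):

import numpy as np

def pacf_from_acf(r):
    # r[0] = 1, r[1..S] = autocorrelations; returns phi_ss for s = 1..S
    S = len(r) - 1
    phi = np.zeros((S + 1, S + 1))
    phi[1, 1] = r[1]
    out = [phi[1, 1]]
    for s in range(2, S + 1):
        num = r[s] - sum(phi[s - 1, j] * r[s - j] for j in range(1, s))
        den = 1 - sum(phi[s - 1, j] * r[j] for j in range(1, s))
        phi[s, s] = num / den
        for j in range(1, s):                       # update phi_sj
            phi[s, j] = phi[s - 1, j] - phi[s, s] * phi[s - 1, s - j]
        out.append(phi[s, s])
    return np.array(out)

rho = 0.5
r = rho ** np.arange(6)          # AR(1): r_s = rho**s
print(pacf_from_acf(r))          # [0.5, 0, 0, 0, 0] -- cutoff after lag 1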

Identification for an AR(p) process: PACF for s > p: φ_ss = 0. Hence for an AR(1):

φ_22 = (r_2 - φ_11 r_1)/(1 - φ_11 r_1) = (r_2 - r_1²)/(1 - r_1²) = (ρ² - ρ²)/(1 - ρ²) = 0

φ_33 = (r_3 - (φ_21 r_2 + φ_22 r_1)) / (1 - (φ_21 r_1 + φ_22 r_2))

To evaluate it, use the relation φ_sj = φ_{s-1,j} - φ_ss φ_{s-1,s-j}: φ_21 = φ_11 - φ_22 φ_11 = φ_11 = r_1. Substituting,

φ_33 = (r_3 - r_1 r_2)/(1 - r_1²) = (ρ³ - ρ·ρ²)/(1 - ρ²) = 0

Stability condition for an AR(p) process

(1 - φ_1 L - φ_2 L² - ... - φ_p L^p) x_t = φ(L) x_t = e_t

The process is stable if φ(z) ≠ 0 for all z satisfying |z| ≤ 1, or, equivalently, if the roots of the characteristic polynomial lie outside the unit circle. Then we can write

x_t = φ(L)^(-1) e_t = a(L) e_t = Σ_{i=0}^∞ a_i e_{t-i}

Then we have the usual moment conditions:

1. E(x_t) = μ or 0 — finite
2. Var(x) = E[x_t²] = E[(Σ_{i=0}^∞ a_i e_{t-i})²], with a_0 = 1 and E(e_{t-i} e_{t-j}) = 0 for i ≠ j, so

Var(x) = σ_e² Σ_{i=0}^∞ a_i² — finite, hence time independent.

3. covariances:

γ_s = E(x_t x_{t-s}) = E[(e_t + a_1 e_{t-1} + a_2 e_{t-2} + ...)(e_{t-s} + a_1 e_{t-s-1} + ...)]
    = (a_s + a_1 a_{1+s} + a_2 a_{2+s} + ...) σ_e² = σ_e² Σ_{i=0}^∞ a_i a_{i+s} — finite and time independent.

r_1 = γ_1/γ_0 = (a_1 + a_1 a_2 + a_2 a_3 + a_3 a_4 + ...)/(1 + a_1² + a_2² + ...)

and in general

r_s = γ_s/γ_0 = Σ_i a_i a_{i+s} / Σ_i a_i²
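For a concrete AR(p), the MA(∞) weights a_i can be computed with statsmodels' arma2ma, and the variance sum checked numerically. A sketch with an illustrative stationary AR(2); the coefficients are assumptions and the infinite sums are truncated at 200 terms:

import numpy as np
from statsmodels.tsa.arima_process import arma2ma

ar = np.array([1, -0.6, -0.2])          # (1 - 0.6L - 0.2L^2) x_t = e_t
a = arma2ma(ar, np.array([1.0]), 200)   # a_0 = 1, a_1, a_2, ...
print(a[:5])                            # leading MA(infinity) weights
print((a ** 2).sum())                   # Var(x)/sigma_e^2 = sum of a_i^2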

If the process is nonstationary, what do we do? Then there is a unit root, i.e. the polynomial has a root at z = 1: φ(1) = 0. We can thus factor out the difference operator and transform the process into a first-difference stationary series:

φ(L) = (1 - φ*_1 L - φ*_2 L² - ... - φ*_{p-1} L^{p-1})(1 - L)

(1 - φ*_1 L - φ*_2 L² - ... - φ*_{p-1} L^{p-1}) Δx_t = e_t — an AR(p-1) model for Δx_t.

If φ*(L) = (1 - φ*_1 L - ... - φ*_{p-1} L^{p-1}) has all its roots outside the unit circle, Δx_t is stationary: x_t ~ I(1). If φ*(L) still has a unit root, we must difference further until we obtain a stationary process: x_t ~ I(d).

An integrated process = a unit root process: the unconditional mean is still finite, but the variance and the covariances are time dependent (more later).

Application: generate an AR(1) series. Program: ARMA.prg. Workfile: USUK.wf, page 2 (undated).
Program:

smpl 1 1
genr x=0
smpl 2 200
series x=0.5*x(-1)+NRND

'nrnd = standard normal random variable (mean 0, SD 1)

Go to the workfile and click on the series: graph - line.


[Figure: line plot of X, observations 1-200]

On the series, click View - Correlogram - Level - OK.


Date: 09/04/07  Time: 19:43  Sample: 2 200  Included observations: 199

 Lag     AC      PAC     Q-Stat   Prob
  1     0.537   0.537    58.316   0.000
  2     0.257  -0.045    71.715   0.000
  3     0.074  -0.065    72.832   0.000
  4     0.003  -0.002    72.834   0.000
  5    -0.078  -0.087    74.074   0.000
  6    -0.087  -0.007    75.654   0.000
  7    -0.059   0.015    76.369   0.000
  8    -0.029  -0.002    76.549   0.000
  9     0.013   0.036    76.586   0.000
 10    -0.081  -0.156    77.970   0.000
 11    -0.015   0.113    78.017   0.000
 12     0.010   0.000    78.039   0.000
 13     0.076   0.071    79.294   0.000
 14     0.101   0.051    81.484   0.000

The ACF is insignificant from lag 3 and the PAC from lag 2, hence AR(1). In EViews, the Q-stats are Ljung-Box Q statistics with their p-values. H0: there is no autocorrelation up to lag k; the statistic is asymptotically distributed as χ²(q), q = # of autocorrelations. The dotted lines are 2-SE bounds, calculated approximately as ±2/√T, T = # of observations. Here T = 199, hence the SE bounds are ±0.14.

Program:

smpl 1 1
genr xa=0
smpl 2 200
series xa=-0.5*xa(-1)+NRND

rho = -0.5: dampened oscillatory path.

smpl 1 1
genr w=0
smpl 2 200
series w=w(-1)+NRND

rho = 1: random walk — unit root.
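The same correlogram columns (AC, PAC, Ljung-Box Q-stat, p-value) can be reproduced in Python; a sketch assuming statsmodels, with an arbitrary seed:

import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(2)
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()

ac, q, pval = acf(x, nlags=14, qstat=True)   # AC, Q-Stat, Prob columns
pc = pacf(x, nlags=14)                       # PAC column
print(np.round(ac[1:], 3))
print(np.round(pc[1:], 3))
print(np.round(q, 2), np.round(pval, 3))
print(2 / np.sqrt(len(x)))                   # approximate 2-SE band (~0.14)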

2. MA process

x_t = e_t + θ_1 e_{t-1} + θ_2 e_{t-2} + ... + θ_q e_{t-q}, with e_t a zero-mean white noise error term:

x_t = (1 + θ_1 L + θ_2 L² + ... + θ_q L^q) e_t = θ(L) e_t

If θ(z) ≠ 0 for |z| ≤ 1, the process is invertible and has an AR(∞) representation:

θ(L)^(-1) x_t = e_t, i.e. x_t = Σ_{j=1}^∞ π_j x_{t-j} + e_t

Invertibility condition for an MA(1) process x_t = e_t + θ_1 e_{t-1}: invertibility requires |θ_1| < 1. Then the AR representation is

(1 + θ_1 L)^(-1) x_t = e_t, i.e. x_t = Σ_{i=1}^∞ (-1)^{i+1} θ_1^i x_{t-i} + e_t

Moments:

E(x_t) = 0 — finite
Var(x_t) = γ(0) = Var(e_t + θ_1 e_{t-1}) = (1 + θ_1²) σ_e² — finite
γ(1) = θ_1 E(e²_{t-1}) = θ_1 σ_e², so r_1 = γ(1)/γ(0) = θ_1/(1 + θ_1²)
γ(2) = γ(3) = ... = 0, so r_2 = r_3 = ... = 0: the autocorrelations cut off after lag 1.

More generally, for an MA(q) the AC is 0 for all lags greater than q.

PAC:

φ_11 = r_1 = θ_1/(1 + θ_1²)

φ_22 = (r_2 - r_1²)/(1 - r_1²) = -θ_1²/(1 + θ_1² + θ_1⁴)

φ_33 = (r_3 - (φ_21 r_2 + φ_22 r_1))/(1 - (φ_21 r_1 + φ_22 r_2)) = θ_1³/(1 + θ_1² + θ_1⁴ + θ_1⁶)

and, in general, for an MA(1), φ_ss = -(-θ_1)^s (1 - θ_1²)/(1 - θ_1^{2(s+1)}): the PACF tapers off geometrically instead of cutting off.

To summarize:
For AR: the AC depends on the AR coefficient (ρ) and thus tapers off; the PAC depends on r_s (or ρ^s) and cuts off to 0 after lag p (AR(1): cutoff after lag 1).
For MA: the AC depends on the variances of the error terms and cuts off abruptly (after lag q); the PAC depends on the MA coefficient θ and thus tapers off.

3. ARMA process

ARMA(p,q): φ(L) x_t = θ(L) e_t
(1 - φ_1 L - φ_2 L² - ... - φ_p L^p) x_t = (1 + θ_1 L + θ_2 L² + ... + θ_q L^q) e_t
If q = 0: pure AR(p) process. If p = 0: pure MA(q) process.

If all characteristic roots of x_t = Σ_{i=1}^p φ_i x_{t-i} + Σ_{i=0}^q θ_i e_{t-i} lie inside the unit circle, this is a stationary ARMA(p,q) process. If one or more characteristic roots lies on the unit circle (a unit root), the process is an integrated ARIMA(p,d,q) process.

Stability condition for an ARMA(1,1) process (Favero, p. 37):

x_t = c_1 x_{t-1} + e_t + a_1 e_{t-1}
(1 - c_1 L) x_t = (1 + a_1 L) e_t
x_t = [(1 + a_1 L)/(1 - c_1 L)] e_t

If |c_1| < 1, then we can write

x_t = (1 + a_1 L)(1 + c_1 L + (c_1 L)² + ...) e_t
    = (1 + c_1 L + (c_1 L)² + ... + a_1 L + a_1 c_1 L² + ...) e_t
    = (1 + (a_1 + c_1) L + c_1 (c_1 + a_1) L² + c_1² (c_1 + a_1) L³ + ...) e_t

— an MA(∞) representation.

E(x_t) = 0 — finite

Var(x_t) = γ(0) = [1 + (a_1 + c_1)²/(1 - c_1²)] σ_e² — finite

Covariances — finite:
γ(1) = c_1 Var(x_t) + a_1 σ_e²
γ(2) = c_1 (c_1 Var(x_t) + a_1 σ_e²) = c_1 γ(1)
γ(j) = c_1 γ(j-1) for j ≥ 2

Autocorrelation functions:
r_1 = γ(1)/γ(0)
r_2 = c_1 r_1
r_3 = c_1 r_2 = c_1² r_1
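A quick check of the decay pattern r_2 = c_1 r_1 using statsmodels' theoretical ACF; the parameter values c_1 = 0.7, a_1 = 0.4 match the application below:

import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

proc = ArmaProcess(np.array([1, -0.7]), np.array([1, 0.4]))
r = proc.acf(lags=5)                    # r[0] = 1, then r_1, r_2, ...
print(np.round(r, 4))
print(np.isclose(r[2], 0.7 * r[1]))     # True: geometric decay after lag 1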

Any stationary time series can be represented with an ARMA model:

AR(p) ⇔ MA(∞);  MA(q) ⇔ AR(∞)

Application: plot an ARMA(1,1) and an AR(1) model with rho = 0.7 and theta = 0.4, and look at the AC and PAC functions. Prog: ARMA.prg. File: USUK, page 2.

smpl 1 1
genr zarma=0
genr zar=0
smpl 1 200
genr u=NRND
smpl 2 200
series zarma=0.7*zarma(-1)+u+0.4*u(-1)
series zar=0.7*zar(-1)+u
plot zarma zar
zarma.correl(36)
zar.correl(36)
[Figure: line plot of ZARMA and ZAR, observations 1-200]

Summary of results (Enders Table 2.1)

WN         ACF: r_s = 0 for s ≠ 0.
           PACF: φ_ss = 0 for s ≠ 0.
AR(1)      ACF: exponential decay, r_s = ρ^s; direct decay for ρ > 0, oscillating for ρ < 0.
           PACF: spike at lag 1 (at lags 1,...,p for AR(p)); φ_11 = r_1, φ_ss = 0 for s ≥ 2.
MA(1)      ACF: positive (negative) spike at lag 1 for θ > 0 (θ < 0); r_s = 0 for s ≥ 2.
           PACF: oscillating (geometric) decay for φ_11 > 0 (φ_11 < 0).
ARMA(1,1)  ACF: exponential (oscillating) decay from lag 1 if ρ > 0 (ρ < 0); decay after lag q for ARMA(p,q).
           PACF: oscillating (exponential) decay from lag 1, φ_11 = r_1; decay after lag p for ARMA(p,q).

Stationary Time Series II


Model Specification and Diagnostic Tests (E + L&K Ch. 2.5, 2.6)

So far we have seen the theoretical properties of stationary time series. To decide which type of model to use, we need to choose the order of the lag polynomials (# of lags), the deterministic terms, etc. We therefore specify a model and then run tests on its specification, i.e. on whether it represents the DGP adequately.

1. Order specification criteria

i. The t/F-statistics approach: start from a large number of lags and re-estimate, reducing by one lag each time. Stop when the last lag is significant. With monthly data, look at the t-stats for the last lag and the F-stats for the last quarter. Then check whether the error term is white noise.

ii. Information criteria: AIC, HQ or SBC. Definitions in Enders:

Akaike: AIC = T·log(SSR) + 2n
Schwarz: SBC = T·log(SSR) + n·log(T) (also attributed to Rissanen)
Hannan-Quinn: HQC = T·log(SSR) + 2n·log(log(T))

where T = # of observations and n = # of parameters estimated, including the constant term.

Adding lags reduces the SSR; these criteria penalize the loss of degrees of freedom that comes with additional lags. The goal is to pick the # of lags that minimizes the information criterion. Since ln(T) > 2 (for T ≥ 8), SBC selects a more parsimonious model than AIC. In large samples, HQC and in particular SBC are better than AIC since they have better large-sample properties; in small samples, AIC performs better.
- If you go with SBC, verify that the error term is white noise.
- If you go with AIC, verify that the t-stats are significant.
This is valid whether the processes are stationary or integrated.

Note: various software packages and authors use modified versions of these criteria. As long as you use the criteria consistently among themselves, any version will give the same results. For example, the definitions in L&K:

AIC = log(SSR) + 2(n/T)
SC = log(SSR) + n·(log T)/T
HQ = log(SSR) + 2n·log(log T)/T

The definitions used in EViews:

AIC = -2(l/T) + 2n/T
SC = -2(l/T) + n·(log T)/T
HQ = -2(l/T) + 2n·log(log T)/T

where l is the log likelihood, given by

l = -(T/2)·(1 + log(2π) + log(SSR/T)),

and (1 + log(2π)) is the constant of the log likelihood, often omitted.

Denote the order selected by each criterion as p̂(AIC), p̂(HQ) and p̂(SC). The following holds independently of the size of the sample:

p̂(SC) ≤ p̂(HQ) ≤ p̂(AIC)
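The Enders-style criteria are simple to compute by hand; a minimal Python sketch (the SSR values and orders below are illustrative, not taken from a real fit):

import numpy as np

def info_criteria(ssr, n, T):
    # Enders' definitions: T*log(SSR) plus a penalty increasing in n
    aic = T * np.log(ssr) + 2 * n
    sbc = T * np.log(ssr) + n * np.log(T)
    hqc = T * np.log(ssr) + 2 * n * np.log(np.log(T))
    return aic, sbc, hqc

print(info_criteria(ssr=98.0, n=2, T=167))   # e.g. an AR(1) with constant
print(info_criteria(ssr=96.5, n=3, T=167))   # e.g. an AR(2) with constant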

Word of warning: it is difficult to distinguish between AR, MA and mixed ARMA processes based on sample information. The theoretical ACF and PACF are not usually replicated in real economic data, which have a complicated DGP. Look into other tests that analyze the residuals once a model is fitted.

2. Plotting the residuals

Check for outliers, structural breaks, non-homogeneous variances. Look at the standardized residuals (subtract the mean and divide by the standard deviation):

û^s_t = (û_t - ū)/σ̂_u,

where ū = (1/T) Σ_t û_t is the mean and σ̂_u = sqrt((1/T) Σ_t (û_t - ū)²) is the standard deviation. If u_t ~ N(0, σ²), then û^s_t will in general stay within a ±2 band around the zero line.

Look at the AC and PAC of the residuals to check for remaining serial correlation, and at the AC of the squared residuals to check for conditional heteroscedasticity. If the ACs and PACs at the early lags are not in general within a ±2/√T band around 0, there is probably leftover serial dependence in the residuals, or conditional heteroscedasticity.

3. Diagnostic tests for residuals

i. Test of whether the kth-order autocorrelation is significantly different from zero. The null hypothesis: there is no residual autocorrelation up to order s. The alternative: there is at least one nonzero autocorrelation:

H0: r_k = 0 for k = 1, ..., s
H1: r_k ≠ 0 for at least one k = 1, ..., s

where r_k is the kth autocorrelation. If the null is rejected, at least one r is significantly different from zero; the null is rejected for large values of Q. If there is any remaining residual autocorrelation, a higher lag order must be used.

The Box-Pierce Q-statistic (portmanteau test for residual autocorrelation):

Q = T Σ_{k=1}^s r_k² ~ χ²(s),

T = # of observations. It is not reliable in small samples and has reduced power for large s. Instead, use the Ljung-Box Q-statistic:

Q_LB = T(T + 2) Σ_{k=1}^s r_k²/(T - k),

with similar null and alternative hypotheses. It can also be used to check whether the residuals from an estimated ARMA(p,q) model are white noise (adjusting for the lags in AR(p) and MA(q)): Q ~ χ²(s - p - q), or χ²(s - p - q - 1) with a constant.
with similar null and alternative hypotheses. It can also be used to check if the residuals from an estimated ARMA(p,q) model are white noise (adjust for the lags in AR(p) and MA(q)): Q~ 2 ( s p q ) or 2 ( s p q 1) with a constant. ii. Breusch-Godfrey (LM) test for autocorrelation for AR models for residuals: It considers an AR(h) model for residuals. Suppose the model you estimate is an AR(p): yt = + p j =1 j yt j + u t You fit an auxiliary equation

(*)

t u is the OLS residual from t = j =1 j yt j +i =1 i u t i + errort where u u


p h

the AR(p) model for y.


H 0 : 1 = .. = h = 0 H 1 : 1 0 or

2 0,....

The LM statistics for the null: LM h = T .R 2 ~ 2 ( h) , where R 2 is obtained from fitting (*). For better small sample properties use an F version:
FMLh = R2 T p h 1 ~ F ( h, T p h 1) h 1 R2
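A sketch of the same test in Python, assuming statsmodels' acorr_breusch_godfrey; the simulated AR(1) data and h = 2 are illustrative:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(4)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.5 * y[t - 1] + rng.standard_normal()

X = sm.add_constant(y[:-1])              # AR(1) regressors: constant, y_{t-1}
res = sm.OLS(y[1:], X).fit()
lm, lm_pval, f, f_pval = acorr_breusch_godfrey(res, nlags=2)
print(lm, lm_pval)                       # TR^2 and its chi^2(h) p-value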

iii. Jarque-Bera test for nonnormality. It tests whether the standardized residuals are normally distributed, based on the third and fourth moments, by measuring the difference of the skewness and kurtosis of the series from those of the normal distribution.

H0: E[(u^s_t)³] = 0 (skewness) and E[(u^s_t)⁴] = 3 (kurtosis)
H1: E[(u^s_t)³] ≠ 0 or E[(u^s_t)⁴] ≠ 3

JB ~ χ²(2), and the null is rejected if JB is large; in that case the residuals are considered nonnormal.

Note:
- most of the asymptotic results are also valid for nonnormal residuals.
- rejection may be due to nonlinearities; then you should look into ARCH effects or structural changes.
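A Python sketch, assuming statsmodels' jarque_bera helper (the simulated residuals are illustrative):

import numpy as np
from statsmodels.stats.stattools import jarque_bera

rng = np.random.default_rng(5)
resid = rng.standard_normal(500)
jb, jb_pval, skew, kurt = jarque_bera(resid)
print(jb, jb_pval, skew, kurt)    # skewness ~ 0 and kurtosis ~ 3 under H0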

iv. ARCH-LM test for conditional heteroscedasticity. Fit an ARCH(q) model to the estimated residuals,

û²_t = α_0 + Σ_{i=1}^q α_i û²_{t-i} + error_t

and test

H0: α_1 = ... = α_q = 0 (no conditional heteroscedasticity)
H1: α_1 ≠ 0 or α_2 ≠ 0, ...

ARCH-LM(q) = T·R² ~ χ²(q)

Large values of ARCH-LM indicate that the null is rejected and there are ARCH effects in the residuals. Then fit an ARCH or GARCH model.
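A Python sketch, assuming statsmodels' het_arch, which runs exactly this auxiliary regression; q = 4 and the simulated residuals are illustrative:

import numpy as np
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(6)
resid = rng.standard_normal(300)              # stand-in for model residuals
lm, lm_pval, f, f_pval = het_arch(resid, nlags=4)
print(lm, lm_pval)                            # large TR^2 / small p-value -> ARCH effects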

v. RESET. Tests a model specification against (nonlinear) alternatives. Ex: you are estimating the model

y_t = a x_t + u_t

but the actual model is

y_t = a x_t + b′z_t + u_t

where z_t can be missing variable(s) or a multiplicative relation. The test checks whether powers of the predicted values of y are significant; these consist of the powers and cross-product terms of the explanatory variables:

z_t = {ŷ², ŷ³, ...}

H0: b = 0 — no misspecification.

The test statistic has an F(h-1, T) distribution; the null is rejected if the test statistic is too large.
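A Python sketch of the test, assuming statsmodels' linear_reset; the data-generating process is illustrative, chosen so that the linear fit is misspecified:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import linear_reset

rng = np.random.default_rng(7)
x = rng.standard_normal(200)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.standard_normal(200)   # true model is nonlinear

res = sm.OLS(y, sm.add_constant(x)).fit()       # misspecified linear fit
print(linear_reset(res, power=3, use_f=True))   # adds yhat^2, yhat^3; small p -> misspecification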

The test statistics has an F(h-1,T) distribution. The null is rejected if the test statistics is too large. vi. Stability analysis Recursive plot of residuals, of estimated coefficients; CUSUM test (cumulative sum of recursive residuals): if the plot diverges significantly from the zero line, it suggests structural instability. CUSUMSQR (square of CUSUM) in case there are several shifts in different directions. Chow test: for exogenous break points. Example: Workfile: LKEdata.wf page--Enders Series: US PPI Plot the data:
[Figure: line plot of PPI, 1960-2002]

Positive trend in the series indicates nonstationarity.

Correlogram:
Date: 09/19/07  Time: 10:00  Sample: 1960Q1 2002Q2  Included observations: 169

 Lag     AC      PAC     Q-Stat   Prob
  1     0.990   0.990    168.44   0.000
  2     0.978  -0.044    334.05   0.000
  3     0.966  -0.071    496.37   0.000
  4     0.952  -0.056    655.05   0.000
  5     0.937  -0.036    809.87   0.000
  6     0.923   0.003    960.86   0.000
  7     0.908  -0.003    1108.0   0.000
  8     0.894  -0.004    1251.5   0.000
  9     0.880   0.009    1391.2   0.000
 10     0.866  -0.003    1527.5   0.000
 11     0.852  -0.011    1660.1   0.000
 12     0.838   0.001    1789.4   0.000

The confidence interval is [-2/√T, +2/√T] = [-2/√169, +2/√169] = [-0.154, +0.154]. The Q-statistics are highly significant: the series is not white noise, so the model needs to be respecified. The ACF dies out very slowly; the PACF (the autocorrelation conditional on the in-between values of the time series) has a single spike. Observation of the data suggests a unit root, hence we cannot use the models developed so far.

First difference: dppi. The plot seems stationary, though the variance does not look constant; it increases in the second half of the sample.
[Figure: line plot of DPPI, 1960-2002]

The ACF dies out quickly after the 4th lag, and the PACF has a large spike at lag 1 and dies out oscillating. Note the significant AC and PAC at lag 6 and the PAC at lag 8. This can be an AR(p) or an ARMA(p,q) process.
Included observations: 168

 Lag     AC      PAC     Q-Stat   Prob
  1     0.553   0.553    52.210   0.000
  2     0.335   0.043    71.571   0.000
  3     0.319   0.170    89.229   0.000
  4     0.216  -0.042    97.389   0.000
  5     0.086  -0.081    98.682   0.000
  6     0.153   0.149    102.82   0.000
  7     0.082  -0.096    104.02   0.000
  8    -0.078  -0.149    105.10   0.000
  9    -0.080  -0.007    106.23   0.000
 10     0.023   0.121    106.33   0.000
 11    -0.008   0.004    106.34   0.000
 12    -0.006   0.007    106.35   0.000

In EViews, run: equation dppi c ar(1). (You can also run equation dppi c dppi(-1), but with the first specification EViews estimates the model with a nonlinear algorithm, which helps control for nonlinearities if they exist in the DGP.)
Dependent Variable: DPPI
Method: Least Squares
Date: 09/19/07  Time: 10:13
Sample (adjusted): 1960Q3 2002Q1
Included observations: 167 after adjustments
Convergence achieved after 3 iterations

Variable    Coefficient   Std. Error   t-Statistic   Prob.
C             0.464364     0.133945     3.466835     0.0007
AR(1)         0.554760     0.064902     8.547697     0.0000

R-squared           0.306907    Mean dependent var     0.466826
Adjusted R-squared  0.302706    S.D. dependent var     0.922923
S.E. of regression  0.770679    Akaike info criterion  2.328814
Sum squared resid   98.00107    Schwarz criterion      2.366155
Log likelihood     -192.4560    F-statistic            73.06313
Durbin-Watson stat  2.048637    Prob(F-statistic)      0.000000

Inverted AR Roots: .55

Check the optimal lag length for an AR(p) process, and check the roots: if the inverted root is < 1 in modulus, stability is confirmed.
- View - ARMA structure - Roots: shows the root(s) visually.
- View - ARMA structure - Correlogram: compare the estimated correlations with the theoretical ones.
- View - ARMA structure - Impulse response: shows how the DGP changes with a shock of the size of one standard deviation.
- View - Residual tests: check whether any residual autocorrelation is left.

Optimal lag length:

lags        1          2          3          10
AIC     2.328814*  2.565575   2.575278   2.735557
SBC     2.366155*  2.603069   2.612925   2.774324

AR(1) dominates the rest in terms of the information criteria. Root of AR(1): 0.55 < 1 — the stability condition is satisfied, since the inverted root is inside the unit circle.
[Figure: inverse roots of the AR/MA polynomial — the AR root (0.55) lies inside the unit circle]

But serial correlation is a problem:

Breusch-Godfrey Serial Correlation LM Test:
F-statistic     3.478636    Prob. F(2,163)        0.033159
Obs*R-squared   6.836216    Prob. Chi-Square(2)   0.032774

High Q-stats (correlogram), and the Breusch-Godfrey test, TR² = 6.84 > χ²(2) = 5.99, rejects the null at the 5% significance level; thus there is at least one β_i ≠ 0, i.e. there is serial correlation in the error terms. Let's try an ARMA(p,q):
lags      (1,1)     (1,2)     (2,1)
AIC      2.3305*   2.3366    2.345
SBC      2.386*    2.3926    2.402

Information criteria favor an ARMA(1,1) specification although the marginal change is very small. So we should consider all options. ARMA (1,1) -- Eviews: equation dppi c ar(1) ma(1)
Included observations: 167 after adjustments
Convergence achieved after 9 iterations
Backcast: 1960Q2

Variable    Coefficient   Std. Error   t-Statistic   Prob.
C             0.455286     0.163873     2.778280     0.0061
AR(1)         0.730798     0.100011     7.307163     0.0000
MA(1)        -0.261703     0.139158    -1.880622     0.0618

R-squared           0.313995    Mean dependent var     0.466826
Adjusted R-squared  0.305629    S.D. dependent var     0.922923
S.E. of regression  0.769062    Akaike info criterion  2.330510
Sum squared resid   96.99884    Schwarz criterion      2.386522
Log likelihood     -191.5976    F-statistic            37.53259
Durbin-Watson stat  1.935301    Prob(F-statistic)      0.000000

Inverted AR Roots: .73
Inverted MA Roots: .26

Both inverted roots are inside the unit circle, and the AR and MA coefficients are significant. Some of the ACs at larger lags are significant, but the LM test, TR² = 5.29 < χ²(2) = 5.99, does not reject the null of no serial correlation in the error terms.

Notice that if you estimate an ARMA(1,2), the MA(2) coefficient is insignificant. Together with the information criteria, we can rule out this specification. The LM test results do not support this specification either.
Variable    Coefficient   Std. Error   t-Statistic   Prob.
C             0.457550     0.176196     2.596826     0.0103
AR(1)         0.807616     0.120152     6.721624     0.0000
MA(1)        -0.280755     0.148664    -1.888526     0.0607
MA(2)        -0.152639     0.116160    -1.314038     0.1907

R-squared           0.322970    Mean dependent var     0.466826
Adjusted R-squared  0.310509    S.D. dependent var     0.922923
S.E. of regression  0.766355    Akaike info criterion  2.329317
Sum squared resid   95.72980    Schwarz criterion      2.404000
Log likelihood     -190.4980    F-statistic            25.91910
Durbin-Watson stat  2.011448    Prob(F-statistic)      0.000000

Inverted AR Roots: .81
Inverted MA Roots: .56, -.27

ARMA(2,1) is better than ARMA(1,2) but does not dominate ARMA(1,1): the estimated coefficients are significant, but there is residual serial correlation. We can perform additional tests to check the stability of the coefficients, seasonal effects, etc.

Seasonality

Cyclical movements observed in daily, monthly or quarterly data. It is often useful to remove seasonality when it is visible at lags s, 2s, 3s, ... of the ACF and the PACF. You add an AR or MA coefficient at the appropriate lag. Possibilities:

Quarterly seasonality with an MA term at lag 4:

x_t = a_1 x_{t-1} + e_t + b_1 e_{t-1} + b_4 e_{t-4}

Quarterly seasonality with an AR term at lag 4:

x_t = a_1 x_{t-1} + a_4 x_{t-4} + e_t + b_1 e_{t-1}

Multiplicative seasonality accounts for the interaction of the ARMA and seasonal effects. MA term at lag 1 interacting with the seasonal MA term at lag 4:

(1 - a_1 L) x_t = (1 + b_1 L)(1 + b_4 L⁴) e_t
x_t = a_1 x_{t-1} + e_t + b_1 e_{t-1} + b_4 e_{t-4} + b_1 b_4 e_{t-5}

AR term at lag 1 interacting with the seasonal AR term at lag 4:

(1 - a_1 L)(1 - a_4 L⁴) x_t = (1 + b_1 L) e_t
x_t = a_1 x_{t-1} + a_4 x_{t-4} - a_1 a_4 x_{t-5} + e_t + b_1 e_{t-1}

One solution to remove strong seasonality is to transform the data by seasonal differencing. For quarterly data:

y_t = (1 - L⁴) x_t

If the data are also nonstationary, first-difference as well:

y_t = (1 - L)(1 - L⁴) x_t
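In Python/pandas both transformations are one-liners; a sketch with an assumed quarterly index and random-walk data:

import numpy as np
import pandas as pd

idx = pd.period_range("1960Q1", periods=120, freq="Q")
x = pd.Series(np.random.default_rng(8).standard_normal(120).cumsum(), index=idx)

y_seasonal = x.diff(4)        # y_t = (1 - L^4) x_t
y_both = x.diff(4).diff(1)    # y_t = (1 - L)(1 - L^4) x_t
print(y_both.dropna().head())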

EViews offers several methodologies for seasonal adjustment (Series - Proc - Seasonal Adjustment):
- Census X12
- Census X11
- Moving Average Methods
- Tramo/Seats

The first two options allow additive or multiplicative specifications and control for holidays, trading days, and outliers.

Homework: due September 26, 2007. Data: Enders, Question 12.
