
WORKSHOP on Introductory Econometrics with EViews

Asst. Prof. Dr. Kemal Bağzıbağlı
Department of Economics

Res. Asst. Pejman Bahramian
PhD Candidate, Department of Economics

Res. Asst. Gizem Uzuner
MSc Student, Department of Economics

EViews Workshop Series Agenda

1. Introductory Econometrics with EViews

2. Advanced Time Series Econometrics with EViews

a. Unit root test and cointegration

b. Vector Autoregressive (VAR) models

c. Structural Vector Autoregressive (SVAR) models

d. Vector Error Correction Models (VECM)

e. Autoregressive Distributed Lag processes

3. Forecasting and Volatility Models with EViews

a. Forecasting

b. Volatility models

c. Regime Switching Models


Part 1 - Outline

1. Violation of Classical Linear Multiple Regression (CLMR) Assumptions

a. Heteroskedasticity

b. Multicollinearity

c. Model Misspecification

d. Autocorrelation

2. “Stationarity is Job 1!”

3. Univariate Time Series Modelling

a. Autoregressive Integrated Moving Average (ARIMA) model

1. Violation of Classical Linear Multiple Regression (CLMR) Assumptions

Multiple Regression Model

y_t = α + β_2 x_2t + β_3 x_3t + … + β_k x_kt + u_t

● Deterministic component: α + β_2 x_2t + … + β_k x_kt
● Stochastic component: u_t
● n observations on y and the x’s
● α & β_i: unknown parameters

Assumptions

1) The error term (u_t) is a random variable with E(u_t) = 0.
2) Common (constant) variance: var(u_t) = σ² for all t.
3) Independence: u_t and u_s are independent for all t ≠ s.
4) Independence of x_j: u_t and x_j are independent for all t and j.
5) Normality: u_t is normally distributed for all t.

● In conjunction with assumptions 1, 2 and 3:

u_t ~ IN(0, σ²)

Violation of Basic Model Assumptions

HETEROSKEDASTICITY (nonconstant variance)

Homoskedasticity: var(u_t) = E(u_t²) = σ² for all t (similar distribution)

● σ_1² = σ_2² = … = σ_n²
● Constant dispersion of the error terms around their mean of zero

Heteroskedasticity (cont.)

Rapidly increasing or decreasing dispersion → heteroskedasticity? The variances differ because of the changing dispersion:

σ_1² ≠ σ_2² ≠ … ≠ σ_n²

var(u_t) = σ_t² → one of the assumptions is violated!

Heteroskedasticity (cont.)

Residuals increasing with x → heteroskedasticity?

Consequences of Heteroskedasticity

● The ordinary least squares (OLS) estimators are still unbiased but inefficient.
● Inefficiency: it is possible to find an alternative unbiased linear estimator with a lower variance than the OLS estimator.

Consequences of Heteroskedasticity (cont.)

Effect on the Tests of Hypotheses

● The estimated variances and covariances of the OLS estimators are biased and inconsistent,
○ invalidating the tests of hypotheses (significance).

Effect on Forecasting

● Forecasts based on the OLS estimators will still be unbiased.
● But since the estimators are inefficient,
○ the forecasts will also be inefficient.

Lagrange Multiplier (LM) Tests for Heteroskedasticity

1. Park Test is a two-stage procedure.

Stage 1:
● Run an OLS regression disregarding the heteroskedasticity question.
● Obtain the residuals û_t from this regression.

Stage 2:
● Regress the log of the squared residuals on the log of the explanatory variable:
ln(û_t²) = α + β ln(X_t) + v_t
● If β is statistically significant, there is heteroskedasticity.

(A consolidated command sketch is given below.)
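The following slides execute these two stages click by click. As a minimal consolidated command sketch (assuming a workfile containing the series compensation and productivity; the equation names eq01 and eq02 are placeholders, not from the slides):

equation eq01.ls compensation c productivity   ' Stage 1: OLS ignoring heteroskedasticity
eq01.makeresid u                               ' residuals u_t
genr u2 = u^2                                  ' squared residuals
genr lnu2 = log(u2)
genr lnproductivity = log(productivity)
equation eq02.ls lnu2 c lnproductivity         ' Stage 2: test significance of the slope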

Park Test in EViews

ls compensation c productivity

Park Test in EViews (cont.)

Generate the residual series u from the estimated equation; the residuals fluctuate around their zero mean (u = 0).

Park Test in EViews (cont.)

u2=u^2

Park Test in EViews (cont.)

lnu2=log(u2)

lnproductivity=log(productivity)

Park Test in EViews (cont.)

● The probability value (p-value) of lnproductivity (0.5257) is greater than the critical value of 0.05.
● Statistically insignificant → homoskedasticity.

Detection of Heteroskedasticity (cont.)

2. Glejser Test is similar in spirit to the Park test.

● Glejser (1969) suggested estimating regressions of the type:

|û_t| = α + βX_t
|û_t| = α + β/X_t
|û_t| = α + β√X_t
… and so on

● Testing the hypothesis β = 0.

Glejser Test in EViews

genr au=@abs(u)

Glejser Test in EViews (cont.)

ls au c productivity

Heteroskedasticity?

Glejser Test in EViews (cont.)

ls au c 1/productivity

ls au c @sqrt(productivity)

Glejser Test in EViews (cont.)

ls compensation c productivity

Glejser Test in EViews (cont.)


Detection of Heteroskedasticity (cont.)

3. White’s Test

● Recommended over all the previous tests.

Step 1: Estimate the model (e.g. Y_t = β_1 + β_2 X_2t + β_3 X_3t + u_t) by OLS.

Step 2: Compute the residuals and square them (û_t²).

Detection of Heteroskedasticity (cont.)

3. White’s Test (cont.)

Step 3: Regress the squared residuals against a constant, X_2t, X_3t, their squares and their cross product (the auxiliary equation).

Step 4: Compute the statistic nR²

● n: sample size, R²: unadjusted R² from Step 3.

Detection of Heteroskedasticity (cont.)

3. White’s Test (cont.)

Step 5: Reject the null hypothesis of homoskedasticity
● if nR² exceeds the critical value,
○ the upper α per cent point of the chi-square distribution with 5 d.f. (for the two-regressor auxiliary equation above).
● If the null hypothesis is not rejected,
○ the residuals are homoskedastic.

(The statistic is written out below.)
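Written out for the two-regressor model of Step 1, the auxiliary regression and test statistic are (standard White-test algebra, not specific to the workshop data):

\[
\hat{u}_t^{2} = a_1 + a_2 X_{2t} + a_3 X_{3t} + a_4 X_{2t}^{2} + a_5 X_{3t}^{2} + a_6 X_{2t} X_{3t} + v_t,
\qquad nR^{2} \sim \chi^{2}(5)
\]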

White Test in EViews


Solutions to the Heteroskedasticity Problem

➔ Take the logarithm of Y_t and X_t
◆ the variance becomes smaller.
➔ Use weighted least squares (WLS)
◆ better than the first solution;
◆ guarantees homoskedasticity (see the sketch below).
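A minimal illustration of why WLS restores homoskedasticity, under the assumed (illustrative) weighting scheme var(u_t) = σ²x_t²: dividing the regression through by x_t leaves an error term with constant variance.

\[
y_t = \alpha + \beta x_t + u_t
\;\;\Rightarrow\;\;
\frac{y_t}{x_t} = \alpha \frac{1}{x_t} + \beta + \frac{u_t}{x_t},
\qquad
\operatorname{var}\!\left(\frac{u_t}{x_t}\right) = \frac{\sigma^{2} x_t^{2}}{x_t^{2}} = \sigma^{2}
\]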

Solutions to the Heteroskedasticity Problem (cont.)

Graphical Method

● Check the residuals (i.e. the error variance)
○ linearly increasing with x_t
● → WLS

Solutions to the Heteroskedasticity Problem (cont.)

● Not linearly but quadratically increasing error variance

Solutions to the Heteroskedasticity Problem (cont.)

● Error variance decreasing linearly

Applications with EViews

ls foodexp c totalexp

eq01.makeresid u

Applications with EViews (cont.)

Command: scat totalexp u

Heteroskedasticity?

Applications with EViews (cont.)


Applications with EViews (cont.)

lnfoodexp=log(foodexp)

lntotalexp=log(totalexp)

Applications with EViews (cont.)

Command: ls lnfoodexp c lntotalexp

Applications with EViews (cont.)


Multicollinearity

OLS: Ordinary Least Squares

● Under the classical normal linear regression assumptions, the OLS estimators are BLUE (Best Linear Unbiased Estimators).
● One of these assumptions: the independent variables in the regression model are not correlated.

What is Multicollinearity?

● The problem of multicollinearity arises when the explanatory variables have approximate linear relationships,
○ i.e. the explanatory variables move closely together.
● In this situation, it is difficult to isolate the partial effect of a single variable. WHY?

Multicollinearity (cont.)

1. Exact (or Perfect) Multicollinearity

a. An exact linear relationship among the independent variables

2. Near Multicollinearity

a. Explanatory variables are approximately linearly related

For example, if one regressor is an exact linear function of another (say x_2t = λx_3t) ➡ exact; if the relationship holds only approximately (x_2t = λx_3t + v_t) ➡ near.

Theoretical Consequences of Multicollinearity

Unbiasedness & Forecasts

● OLS estimators are still BLUE and MLE, and hence are unbiased, efficient and consistent.
● Forecasts are still unbiased, and confidence intervals are valid.
● Although the standard errors and t-statistics of the regression coefficients are numerically affected,
○ tests based on them are still valid.

Theoretical Consequences of Multicollinearity (cont.)

Standard Errors

● Standard errors tend to be higher,
○ making t-statistics lower,
○ thus making coefficients less significant (and possibly even insignificant).

Identifying Multicollinearity

● High R² with low values for t-statistics
● High values for correlation coefficients
● Regression coefficients sensitive to specification

Formal tests for multicollinearity:

● Eigenvalues and condition index (CI)
○ k = max eigenvalue / min eigenvalue
○ CI = √k
○ if k is between 100 and 1000 ➡ multicollinearity?
● High variance inflation factor (VIF)
○ if VIF > 10 ➡ multicollinearity is suspected (see the formulas below).
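For reference, the textbook definitions behind these two diagnostics (R_j² is the R² from regressing x_j on the remaining regressors; the λ are the eigenvalues referred to above):

\[
\mathrm{VIF}_j = \frac{1}{1 - R_j^{2}},
\qquad
\mathrm{CI} = \sqrt{k} = \sqrt{\frac{\lambda_{\max}}{\lambda_{\min}}}
\]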

Solutions to the Multicollinearity Problem

● Benign Neglect

○ appropriate when we are less interested in interpreting individual coefficients and more interested in forecasting

● Eliminating Variables

○ The surest way to eliminate or reduce the effects of multicollinearity


Solutions to the Multicol. Problem (cont.)

● Reformulating the Model

○ In many situations, respecifying the model can reduce multicollinearity.

Solutions to the Multicol. Problem (cont.)

● Using Extraneous Information

○ Often used in the estimation of demand functions.

○ High correlation between time series data on real income and the price level
■ makes the estimation of income and price elasticities difficult.

○ Estimate the income elasticity from cross-section studies,
■ then use that information in the time series model to estimate the price elasticity.

Solutions to the Multicol. Problem (cont.)

● Increasing the Sample Size

○ reduces the adverse effects of multicollinearity.

○ If R², including the new sample,

■ goes down or remains approximately the same,

● the variances of the coefficients will indeed decrease and counteract the effects of multicollinearity;

■ goes up substantially,

● there may be no benefit to adding to the sample size.

Applications with EViews

● Overall statistically significant, but one by one statistically insignificant

➡ multicollinearity problem

Applications with EViews (cont.)

Command: eq01.varinf

Applications with EViews (cont.)

Command: scalar ci= @sqrt(66795998/3.44E-06)

CI: Condition Index

Applications with EViews (cont.)

● The highest correlation is between the price of cars and the general price level.

● Even if we drop these variables one by one from the model, we still have a multicollinearity problem.

Applications with EViews (cont.)

● When we drop both the general price level and the price of cars, the multicollinearity problem is solved,

○ but R² is low.

● So we check the second-highest correlation, between disposable income and the price level.

Applications with EViews (cont.)

DROP: general price level and disposable income.

● After removing the variables, the problem is solved.

● Loss of valuable information?

● It is better to try solving the problem by increasing the sample size.

Model Misspecification

1. Omitting Influential or Including Non-Influential Explanatory Variables

2. Various Functional Forms

3. Measurement Errors

4. Tests for Misspecification

5. Approaches in Choosing an Appropriate Model


The Ramsey RESET Test

RESET: Regression Specification Error Test

Step 1: Estimate the model that you think is correct and obtain the fitted values of Y; call them Ŷ_t.

Step 2: Estimate the model in Step 1 again, this time including powers of the fitted values (Ŷ_t², Ŷ_t³) as additional explanatory variables.

The Ramsey RESET Test (cont.)

Step 3: The model in Step 1 is the restricted model and the model in Step 2 is the unrestricted model. Calculate the F-statistic for these two models,

● i.e. carry out a Wald F-test for the omission of the two new variables in Step 2.

● If the null hypothesis (H_0: the new variables have no effect) is rejected,

○ this is an indication of a specification error.

(A command sketch follows.)
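A minimal EViews command sketch of these steps, using hypothetical series names y, x2, x3 and placeholder equation names (EViews also offers this test directly as an equation view under Stability Diagnostics):

equation eq_r.ls y c x2 x3              ' Step 1: the model you think is correct
eq_r.fit yhat                           ' fitted values of y
genr yhat2 = yhat^2                     ' Step 2: powers of the fitted values
genr yhat3 = yhat^3
equation eq_u.ls y c x2 x3 yhat2 yhat3  ' unrestricted model
eq_u.wald c(4)=0, c(5)=0                ' Step 3: Wald F-test on the two new terms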

Autocorrelation

● In the presence of autocorrelation, cov(u_t, u_s) ≠ 0 for t ≠ s: the error for period t is correlated with the error for period s.

● -1 < ρ < 1

○ ρ approaching 0 → no correlation
○ ρ approaching +1 → positive correlation
○ ρ approaching -1 → negative correlation

Autocorrelation (cont.)


Causes of Autocorrelation

DIRECT
● Inertia or Persistence
● Spatial Correlation
● Cyclical Influences

INDIRECT
● Omitted Variables
● Functional Form
● Seasonality

Consequences of Autocorrelation

● OLS estimates are still unbiased and consistent.

● OLS estimates are inefficient → not BLUE

○ Forecasts will also be inefficient.

● The same as the case of ignoring heteroskedasticity:

● The usual formulas give incorrect standard errors for the OLS estimates.

● Confidence intervals and hypothesis tests based on the usual standard errors are not valid.

Detecting Autocorrelation

❖ Runs Test: investigate the signs of the residuals. Are they moving randomly? If the (+) and (-) signs come randomly, there is no need to suspect an autocorrelation problem.

❖ Durbin-Watson (DW) d Test: the ratio of the sum of squared differences in successive residuals to the residual sum of squares.

❖ Breusch-Godfrey LM Test: a more general test which does not assume the disturbances to be AR(1).

Durbin-Watson d Test

STEP 1: Estimate the model by OLS and compute the residuals û_t.

STEP 2: Compute the Durbin-Watson d statistic (see below).
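The d statistic announced in Step 2 (an image in the original slides) is, in standard textbook notation, together with its familiar approximation in terms of the first-order residual autocorrelation:

\[
d = \frac{\sum_{t=2}^{T} (\hat{u}_t - \hat{u}_{t-1})^{2}}{\sum_{t=1}^{T} \hat{u}_t^{2}} \approx 2(1 - \hat{\rho})
\]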

Durbin-Watson d Test (cont.)

STEP 3: Construct the table with the calculated DW statistic and the d_U, d_L, 4-d_U and 4-d_L critical values.

STEP 4: Conclude.

Resolving Autocorrelation

The Cochrane-Orcutt Iterative Procedure

Step 1: Estimate the regression and obtain the residuals.

Step 2: Estimate the first-order serial correlation coefficient (ρ̂) by regressing the residuals on their lagged values.

Step 3: Transform the original variables as follows:

y*_t = y_t - ρ̂y_{t-1},  x*_t = x_t - ρ̂x_{t-1}

Resolving Autocorrelation (cont.)

Step 4: Run the regression again with the transformed variables and obtain a new set of residuals.

Step 5 and on: Continue repeating Steps 2 to 4 for several rounds until the following stopping rule applies:

● the estimates of ρ from two successive iterations differ by no more than some preselected small value, such as 0.001.

(A command sketch follows.)
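A minimal sketch of one round of the procedure in EViews commands, written for the copper-price application on the next slides (the series names and the estimate ρ̂ = 0.52 come from those slides; the equation names are placeholders):

equation eq01.ls lnco c lnin lnlon lnhs lnal   ' Step 1: original regression
eq01.makeresid u                               ' residuals
equation eq02.ls u c u(-1)                     ' Step 2: slope estimates rho (0.52 here)
genr y = lnco - 0.52*lnco(-1)                  ' Step 3: transformed variables
genr x2 = lnin - 0.52*lnin(-1)
genr x3 = lnlon - 0.52*lnlon(-1)
genr x4 = lnhs - 0.52*lnhs(-1)
genr x5 = lnal - 0.52*lnal(-1)
ls y c x2 x3 x4 x5                             ' Step 4: rerun and check DW again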

Applications with EViews

Variables in natural logarithms:

● LNCO: copper price
● LNIN: industrial production
● LNLON: London stock exchange
● LNHS: housing price
● LNAL: aluminium price

DW critical values d_L = 1.143 and d_U = 1.739 → AUTOCORRELATION?

Applications with EViews (cont.)

H_0: No autocorrelation

Applications with EViews (cont.)

To fix it!

Applications with EViews (cont.)

To fix it!

Obtain the residual series u and regress it on its own lag to estimate ρ̂ (≈ 0.52 here).

Applications with EViews (cont.)

To fix it!

Generate series:

● y = lnco-0.52*lnco(-1)
● x2 = lnin-0.52*lnin(-1)
● x3 = lnlon-0.52*lnlon(-1)
● x4 = lnhs-0.52*lnhs(-1)
● x5 = lnal-0.52*lnal(-1)

Applications with EViews (cont.)

To fix it!

Command: ls y c x2 x3 x4 x5

DW critical values d_L = 1.124 and d_U = 1.743

Applications with EViews (cont.)

To fix it!

Summary

Problem: Heteroskedasticity
● Source: nonconstant variance
● Detection: Park test, Glejser test, White test
● Remedy: taking logarithms, weighted least squares

Problem: Autocorrelation
● Source: E(u_t, u_{t-1}) ≠ 0
● Detection: Durbin-Watson d test, runs test, Breusch-Godfrey LM test
● Remedy: Cochrane-Orcutt iterative procedure and GLS

Problem: Multicollinearity
● Source: interdependence of the x_j
● Detection: high R² but few significant t ratios; high pairwise correlation between independent variables; eigenvalues and condition index; high VIF; auxiliary regressions
● Remedy: reformulating the model; dropping variables; additional new data; factor analysis; principal components analysis

2. “Stationarity is Job 1!”

What is Stationarity?

● A stationary series can be defined as one with a constant mean, constant variance and constant autocovariances for each given lag.

● The mean and/or variance of a nonstationary series are time dependent.

● The correlation between a series and its lagged values depends only on the length of the lag and not on when the series started.

● A stationary series is integrated of order zero, i.e. I(0). (See the conditions below.)
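In symbols, the three conditions listed above for (weak) stationarity, in standard notation:

\[
E(y_t) = \mu,
\qquad
\operatorname{var}(y_t) = \sigma^{2} < \infty,
\qquad
\operatorname{cov}(y_t, y_{t-s}) = \gamma_s \quad \text{for all } t
\]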

Example of a white noise process

Time plot of a random walk vs. a random walk with drift

Example

PDI: Personal Disposable Income

What is Stationarity? (cont.)

● If the series in a regression model are not stationary,

○ the usual “t-ratios” will not follow a t-distribution.

● The use of nonstationary data can lead to spurious regressions:

○ the results of the regression do not reflect the real relationship, unless the variables are cointegrated.

3. Univariate Time Series Modelling

Some Stochastic Processes

● Random Walk

● Moving Average Process

● Autoregressive Process

● Autoregressive Moving Average Process

(Standard representations are sketched below.)
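For reference, standard textbook representations of the four processes named above, with u_t white noise (generic notation, not taken from the slides):

\[
\begin{aligned}
\text{Random walk:} \quad & y_t = y_{t-1} + u_t \\
\text{MA}(q): \quad & y_t = \mu + u_t + \theta_1 u_{t-1} + \dots + \theta_q u_{t-q} \\
\text{AR}(p): \quad & y_t = c + \phi_1 y_{t-1} + \dots + \phi_p y_{t-p} + u_t \\
\text{ARMA}(p,q): \quad & y_t = c + \textstyle\sum_{i=1}^{p} \phi_i y_{t-i} + u_t + \textstyle\sum_{j=1}^{q} \theta_j u_{t-j}
\end{aligned}
\]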

Autoregressive Integrated MA Process

● Most time series are nonstationary.

● Successive differencing → stationarity.

● A series that becomes stationary after differencing d times, and can then be expressed by an ARMA(p, q) model, can be represented by an ARIMA(p, d, q) model. (See the example below.)
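As a concrete instance, the ARIMA(1,1,1) specification estimated later in the workshop, written for the first difference Δy_t = y_t - y_{t-1}:

\[
\Delta y_t = c + \phi_1 \Delta y_{t-1} + u_t + \theta_1 u_{t-1}
\]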

Estimation and Forecasting with an ARIMA Model

The Box and Jenkins (1970) Approach

● Identification

● Fitting (Estimation), usually OLS

● Diagnostics

● Refitting if necessary

● Forecasting


Identification

● The process of specifying the orders of differencing, AR modeling, and MA modeling

● What do the data look like?

● What pattern do the data show?

- Are the data stationary?

- Specification of p, d, and q?

● Tools

- Plots of data

- Autocorrelation Function (ACF)

- Partial ACF (PACF)


Identification (cont.)

● To determine the values of p and q we use the graphical properties of the autocorrelation function (ACF) and the partial autocorrelation function (PACF).

● Again recall the standard patterns:

○ AR(p): ACF tails off; PACF cuts off after lag p.
○ MA(q): ACF cuts off after lag q; PACF tails off.
○ ARMA(p, q): both ACF and PACF tail off.

Model Fitting

● Model parameters are estimated by OLS.

● Output includes:

○ Parameter estimates
○ Test statistics
○ Goodness-of-fit measures
○ Residuals
○ Diagnostics

Diagnostics

● Determine whether the model fits the data adequately.

○ The aim is to extract all information and ensure that the residuals are white noise.

● Key measures (the Q statistic is written out below):

○ ACF of residuals

○ PACF of residuals

○ Ljung-Box Q statistic
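The Q statistic named above, in its standard Ljung-Box form (T observations, \hat{\rho}_k the lag-k residual autocorrelation, m lags tested; for an ARMA(p, q) fit it is referred to a chi-square with m - p - q degrees of freedom):

\[
Q = T(T+2) \sum_{k=1}^{m} \frac{\hat{\rho}_k^{2}}{T-k} \;\sim\; \chi^{2}(m-p-q)
\]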

Preliminary Analysis with EViews

Select the series “dividends” in the workfile, then select [Quick/Graph/Line graph].

Preliminary Analysis with EViews (cont.)

[Quick/Generate Series]:

ddividends=d(dividends)

Preliminary Analysis: Identification

Correlogram

● The graph of the autocorrelation function against s, for s = 0, 1, 2, …, t-1.

● A useful diagram for identifying patterns in correlation among series.

● A useful guide for determining how correlated the error term (u_t) is with the past errors u_{t-1}, u_{t-2}, …

(A command sketch follows.)
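A minimal command sketch for calling up correlograms in EViews (the series names come from this example; the lag count 24 is an arbitrary illustrative choice):

dividends.correl(24)    ' correlogram of the level series
ddividends.correl(24)   ' correlogram of the first difference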

Preliminary Analysis: Identification

Interpretation of Correlogram

● If ⍴ is high, the correlogram for an AR(1) declines slowly over time.

○ First differencing is indicated.

Preliminary Analysis: Identification

Interpretation of Correlogram

● The function quickly decreases to zero (a low ⍴).

Correlogram and Stationarity


Preliminary Analysis: Estimation

ARIMA(1,1,1)

Command: ls ddividends c AR(1) MA(1)

(Residual diagnostics are sketched below.)
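Diagnostics can then be run on the estimated equation's residuals; a minimal sketch, assuming the equation is saved under the placeholder name eq_arima:

equation eq_arima.ls ddividends c ar(1) ma(1)
eq_arima.makeresid res    ' residuals of the ARIMA(1,1,1) fit
res.correl(24)            ' residual ACF/PACF with Ljung-Box Q statistics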

Empirical Example

Forecasting Monthly Electricity Sales

Total System Energy Demand

Empirical Example (cont.)

Forecasting Monthly Electricity Sales

Correlogram for Monthly Electricity Sales Data

Empirical Example (cont.)

Forecasting Monthly Electricity Sales

Correlogram for the 12-Month Differenced Data (X_t - X_{t-12})

Empirical Example (cont.)

Forecasting Monthly Electricity Sales

Box-Jenkins Forecast of System Energy

ARMA Order   AIC     RMSE
(1, 1)       1,930   320
(4, 1)       1,927   312
(1, 4)       1,926   311
(0, 4)       1,924   311

RMSE: root mean squared error
Superior model: ARIMA(0, 1, 4)

Bibliography

● Brooks, C. (2008). Introductory Econometrics for Finance. Cambridge University Press.

● Gujarati, D.N. and Porter, D.C. (2004). Basic Econometrics. The McGraw-Hill Companies.

● Maddala, G.S. (2002). Introduction to Econometrics.

● Ramanathan, R. (2002). Introductory Econometrics with Applications. Thomson Learning, Mason, Ohio, USA.

● Wooldridge, J. (2000). Introductory Econometrics: A Modern Approach. South-Western College Publishing.