
# STATIONARITY

## Introduction

If a random variable $X$ is indexed by time, usually denoted by $t$, the observations $\{X_t,\ t \in T\}$ are called a time series.


Let's begin with a wonderful quote.

Thomson (1994), as quoted by G.P. Nason:

> "Experience with real-world data, however, soon convinces one that both stationarity and Gaussianity are fairy tales invented for the amusement of undergraduates."
## Stationarity

- In the IID world, the distribution of each observation is the same as the distribution of every other data point. It would be nice to have something like this for time series. This property is called stationarity.
- Stationarity does not mean that the time series never changes, but that its distribution does not.
- Two notions of stationarity:
  - Strict (strong) stationarity
  - Weak (wide-sense) stationarity
## Wide-sense Stationarity (Second-order Stationarity)

- Definition 2: The time series $\{X_t,\ t \in \mathbb{Z}\}$ ($\mathbb{Z}$ is the set of integers) is said to be (weakly) stationary if:

  (i) $E[X_t^2] < \infty$ for all $t \in \mathbb{Z}$;
  (ii) $E[X_t] = \mu$ for all $t \in \mathbb{Z}$;
  (iii) $\gamma_X(t, s) = \gamma_X(t+h, s+h)$ for all $t, s, h \in \mathbb{Z}$.

- In other words, a stationary time series $\{X_t\}$ must have three features: a finite second moment, a constant first moment, and a second moment $\gamma(t, s)$ that depends only on $(s - t)$, not on $s$ or $t$ separately.
- Weak stationarity means that the mean and the variance of a stochastic process do not depend on $t$ (that is, they are constant).
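The distinction can be seen numerically. Below is a minimal Python/NumPy sketch (not part of the original notes; the simulated series and the half-sample check are my own choices) contrasting white noise, which is weakly stationary, with a random walk, which is not:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Weakly stationary: Gaussian white noise has constant mean and variance.
wn = rng.normal(loc=0.0, scale=1.0, size=n)

# Non-stationary: a random walk's variance grows with t.
rw = np.cumsum(rng.normal(size=n))

def half_variances(x):
    """Sample variance of the first and second halves of a series."""
    h = len(x) // 2
    return np.var(x[:h]), np.var(x[h:])

# For white noise the two halves have nearly identical variance;
# for the random walk they typically differ substantially.
print(half_variances(wn))
print(half_variances(rw))
```

This is only an informal diagnostic, not a formal test of stationarity.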
## Strict Stationarity

- Definition 3: The time series $\{X_t,\ t \in \mathbb{Z}\}$ is said to be strictly stationary if the joint distribution of $(X_{t_1}, X_{t_2}, \ldots, X_{t_k})$ is the same as that of $(X_{t_1+h}, X_{t_2+h}, \ldots, X_{t_k+h})$ for every $k$, every choice of time points $t_1, \ldots, t_k$, and every shift $h$.
- In other words, the joint distribution does not depend on the lag $h$: it is invariant under time shifts.
## Autocovariance Functions

- In modeling a finite number of random variables, a covariance matrix is usually computed to summarize the dependence between these variables. For a time series $\{X_t,\ t \in \mathbb{Z}\}$, we need an analogous summary of the dependence among infinitely many random variables.
- Definition 1 (Autocovariance function): The autocovariance function of a time series $\{X_t\}$ with $\mathrm{Var}(X_t) < \infty$ is defined by

$$\gamma_k = \gamma(t, t+k) = \mathrm{Cov}(X_t, X_{t+k}) = E\left[(X_t - \mu_t)(X_{t+k} - \mu_{t+k})\right]$$

- For a stationary process $\{X_t\}$, we have mean $E(X_t) = E(X_{t+k}) = \mu$ and variance $\mathrm{Var}(X_t) = \sigma^2$. Hence, the autocovariance reduces to

$$\gamma_k = \gamma(t, t+k) = \mathrm{Cov}(X_t, X_{t+k}) = E\left[(X_t - \mu)(X_{t+k} - \mu)\right]$$
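To make the definition concrete, here is a short Python sketch (the function name is mine) of the usual divisor-$n$ sample estimate of $\gamma_k$, checked on white noise where $\gamma_0 = 1$ and $\gamma_k = 0$ for $k > 0$:

```python
import numpy as np

def sample_autocovariance(x, k):
    """Biased (divisor n) sample estimate of gamma_k = Cov(X_t, X_{t+k})."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    # Sum of (X_t - xbar)(X_{t+k} - xbar) over the n - k available pairs,
    # divided by n (not n - k); this keeps the estimates positive semi-definite.
    return np.sum((x[: n - k] - xbar) * (x[k:] - xbar)) / n

rng = np.random.default_rng(1)
x = rng.normal(size=5000)            # white noise
print(sample_autocovariance(x, 0))   # close to 1
print(sample_autocovariance(x, 1))   # close to 0
```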
## Autocorrelation Function

- The autocorrelation function (ACF) is

$$\rho_k = \mathrm{Corr}(X_t, X_{t+k}) = \frac{\gamma_k}{\gamma_0}$$

for integers $k$, where $\gamma_0 = \mathrm{Cov}(X_t, X_t) = \mathrm{Var}(X_t)$, so that

$$\rho_k = \frac{E\left[(X_t - \mu)(X_{t+k} - \mu)\right]}{E\left[(X_t - \mu)^2\right]}$$
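As an illustration (my own sketch, not part of the notes), the sample ACF can be computed from the formulas above and checked against an AR(1) process, whose theoretical ACF is $\rho_k = \phi^k$:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_k = gamma_k / gamma_0 for k = 0..max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xd = x - x.mean()
    gamma = np.array([np.sum(xd[: n - k] * xd[k:]) / n for k in range(max_lag + 1)])
    return gamma / gamma[0]

# Simulate an AR(1) process X_t = 0.6 X_{t-1} + eps_t.
rng = np.random.default_rng(2)
n = 20_000
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + eps[t]

acf = sample_acf(x, 3)
print(acf)  # approximately [1.0, 0.6, 0.36, 0.216]
```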
## Properties of the Autocovariance and Autocorrelation Functions

1. $\gamma_0 = \mathrm{Var}(X_t)$; $\rho_0 = 1$.
2. $|\gamma_k| \le \gamma_0$; $|\rho_k| \le 1$.
3. $\gamma_k = \gamma_{-k}$ and $\rho_k = \rho_{-k}$ for all $k$ (symmetry).
4. (Necessary condition) $\gamma_k$ and $\rho_k$ are positive semi-definite:

$$\sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j \gamma_{|t_i - t_j|} \ge 0, \qquad \sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j \rho_{|t_i - t_j|} \ge 0$$

for any set of time points $t_1, t_2, \ldots, t_n$ and any real numbers $\alpha_1, \alpha_2, \ldots, \alpha_n$.
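Properties 3 and 4 can be verified numerically: with divisor-$n$ sample autocovariances, the matrix with entries $\Gamma_{ij} = \gamma_{|t_i - t_j|}$ is a symmetric, positive semi-definite Toeplitz matrix. A quick sketch with my own simulated data:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
n = len(x)
xd = x - x.mean()

# Divisor-n sample autocovariances gamma_0, ..., gamma_19.
m = 20
gamma = np.array([np.sum(xd[: n - k] * xd[k:]) / n for k in range(m)])

# Property 3 (symmetry): gamma_{-k} = gamma_k, so Gamma[i, j] = gamma_{|i-j|}
# is a symmetric Toeplitz matrix.
idx = np.arange(m)
Gamma = gamma[np.abs(idx[:, None] - idx[None, :])]

# Property 4 (positive semi-definiteness): all eigenvalues >= 0 up to rounding.
eigs = np.linalg.eigvalsh(Gamma)
print(eigs.min())
```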
## The Partial Autocorrelation Function (PACF)

- The PACF is the correlation between $X_t$ and $X_{t-k}$ after their mutual linear dependence on the intervening variables $X_{t-1}, X_{t-2}, \ldots, X_{t-k+1}$ has been removed.
- The conditional correlation

$$\mathrm{Corr}(X_t, X_{t-k} \mid X_{t-1}, X_{t-2}, \ldots, X_{t-k+1}) = \phi_{kk}$$

is usually referred to as the partial autocorrelation in time series.

- e.g., $\phi_{11} = \mathrm{Corr}(X_t, X_{t-1}) = \rho_1$ and $\phi_{22} = \mathrm{Corr}(X_t, X_{t-2} \mid X_{t-1})$.
## Calculation of PACF

1. Regression approach: Consider a model

$$X_{t+k} = \phi_{k1} X_{t+k-1} + \phi_{k2} X_{t+k-2} + \cdots + \phi_{kk} X_t + e_{t+k}$$

from a zero-mean stationary process, where $\phi_{ki}$ denotes the coefficient of $X_{t+k-i}$ and $e_{t+k}$ is a zero-mean error term uncorrelated with $X_{t+k-j}$, $j = 1, 2, \ldots, k$. Multiplying both sides by $X_{t+k-j}$ gives

$$X_{t+k} X_{t+k-j} = \phi_{k1} X_{t+k-1} X_{t+k-j} + \cdots + \phi_{kk} X_t X_{t+k-j} + e_{t+k} X_{t+k-j}$$
Taking expectations of both sides yields

$$\gamma_j = \phi_{k1} \gamma_{j-1} + \phi_{k2} \gamma_{j-2} + \cdots + \phi_{kk} \gamma_{j-k}$$

and dividing both sides by $\gamma_0$ gives the PACF equations

$$\rho_j = \phi_{k1} \rho_{j-1} + \phi_{k2} \rho_{j-2} + \cdots + \phi_{kk} \rho_{j-k}$$
For $j = 1, 2, \ldots, k$, we have the following system of equations:

$$\begin{aligned}
\rho_1 &= \phi_{k1} + \phi_{k2}\,\rho_1 + \cdots + \phi_{kk}\,\rho_{k-1} \\
\rho_2 &= \phi_{k1}\,\rho_1 + \phi_{k2} + \cdots + \phi_{kk}\,\rho_{k-2} \\
&\;\vdots \\
\rho_k &= \phi_{k1}\,\rho_{k-1} + \phi_{k2}\,\rho_{k-2} + \cdots + \phi_{kk}
\end{aligned}$$
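This system is linear in $(\phi_{k1}, \ldots, \phi_{kk})$, so given the ACF it can be solved directly. A sketch (the function name is mine), using the theoretical AR(1) ACF $\rho_j = 0.6^j$, for which the PACF is known to cut off after lag 1:

```python
import numpy as np

def pacf_from_acf(rho, k):
    """Solve the order-k system for (phi_k1, ..., phi_kk); return phi_kk.

    rho[j] must hold rho_j for j = 0..k (so rho[0] = 1).
    """
    # Coefficient matrix R[i, j] = rho_{|i-j|}, right-hand side (rho_1, ..., rho_k).
    R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
    r = np.asarray(rho[1 : k + 1])
    return np.linalg.solve(R, r)[-1]

rho = 0.6 ** np.arange(10)            # theoretical AR(1) ACF with phi = 0.6
print(pacf_from_acf(rho, 1))          # phi_11 = rho_1 = 0.6
print(pacf_from_acf(rho, 2))          # ~0: PACF of AR(1) cuts off after lag 1
```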
Using Cramer's rule successively for $k = 1, 2, \ldots$:

$$\phi_{11} = \rho_1$$

$$\phi_{22} = \frac{\begin{vmatrix} 1 & \rho_1 \\ \rho_1 & \rho_2 \end{vmatrix}}{\begin{vmatrix} 1 & \rho_1 \\ \rho_1 & 1 \end{vmatrix}} = \frac{\rho_2 - \rho_1^2}{1 - \rho_1^2}$$
In general,

$$\phi_{kk} = \frac{\begin{vmatrix}
1 & \rho_1 & \rho_2 & \cdots & \rho_{k-2} & \rho_1 \\
\rho_1 & 1 & \rho_1 & \cdots & \rho_{k-3} & \rho_2 \\
\vdots & \vdots & \vdots & & \vdots & \vdots \\
\rho_{k-1} & \rho_{k-2} & \rho_{k-3} & \cdots & \rho_1 & \rho_k
\end{vmatrix}}{\begin{vmatrix}
1 & \rho_1 & \rho_2 & \cdots & \rho_{k-2} & \rho_{k-1} \\
\rho_1 & 1 & \rho_1 & \cdots & \rho_{k-3} & \rho_{k-2} \\
\vdots & \vdots & \vdots & & \vdots & \vdots \\
\rho_{k-1} & \rho_{k-2} & \rho_{k-3} & \cdots & \rho_1 & 1
\end{vmatrix}}$$
2. Levinson and Durbin's recursive formula:

$$\phi_{kk} = \frac{\rho_k - \displaystyle\sum_{j=1}^{k-1} \phi_{k-1,j}\,\rho_{k-j}}{1 - \displaystyle\sum_{j=1}^{k-1} \phi_{k-1,j}\,\rho_j}$$

where $\phi_{kj} = \phi_{k-1,j} - \phi_{kk}\,\phi_{k-1,k-j}$ for $j = 1, 2, \ldots, k-1$.
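The recursion translates directly into code. A minimal sketch (function name mine), validated on the theoretical AR(1) ACF $\rho_j = 0.6^j$, whose PACF cuts off after lag 1:

```python
import numpy as np

def levinson_durbin_pacf(rho, max_lag):
    """Return (phi_11, phi_22, ..., phi_{max_lag,max_lag}) via Levinson-Durbin.

    rho[j] must hold rho_j for j = 0..max_lag (so rho[0] = 1).
    """
    rho = np.asarray(rho, dtype=float)
    pacf = [rho[1]]                    # phi_11 = rho_1
    phi = np.array([rho[1]])           # current row (phi_{k,1}, ..., phi_{k,k})
    for k in range(2, max_lag + 1):
        # phi_kk = (rho_k - sum_j phi_{k-1,j} rho_{k-j}) / (1 - sum_j phi_{k-1,j} rho_j)
        num = rho[k] - phi @ rho[k - 1:0:-1]
        den = 1.0 - phi @ rho[1:k]
        phi_kk = num / den
        # Update: phi_{k,j} = phi_{k-1,j} - phi_kk * phi_{k-1,k-j}, j = 1..k-1
        phi = np.append(phi - phi_kk * phi[::-1], phi_kk)
        pacf.append(phi_kk)
    return np.array(pacf)

p = levinson_durbin_pacf(0.6 ** np.arange(6), 5)
print(p)  # approximately [0.6, 0, 0, 0, 0]
```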
## References

- STAT 497_LN2.
- G.P. Nason, Stationary and non-stationary time series (Chapter 26).
