Christian Brownlees
(Brownlees) 0/1
Introduction
In these slides...
Roadmap
Basic Concepts
Network techniques for the analysis of economic and financial panels
Network for Static Data
Partial Correlation Network
Basic Concepts
What is a Network?
Mathematically, a network is a graph.
Graphs
A graph G = (V, E) consists of a set of vertices V and a set of edges E.
In mathematics, graphs turn out to be convenient for representing and analysing a number of problems.
One of the early examples of graph theory applications is the Königsberg Bridge Problem. The
problem consists of finding out whether there exists a path that crosses the 7 bridges exactly once
and begins and finishes on the same vertex. In 1736, Leonhard Euler used graph theory to show
that no such path exists.
Types of Graphs
Undirected Graphs
If the edges do not have a directionality the graph is undirected
(i.e. an edge from i to j is the same as an edge from j to i )
Example:
V = {A, B, C , D, E }
E = {{A, B}, {B, C }, {A, D}, {D, E }}
Directed Graphs
If the edges have a directionality the graph is directed
(i.e. the edge from i to j is different from an edge from j to i )
Example:
V = {A, B, C , D, E }
E = {(A, B), (B, C ), (A, D), (D, E )}
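The distinction can be illustrated with plain Python containers (a minimal sketch using the vertex and edge sets from the two examples above): undirected edges as frozensets, directed edges as ordered tuples.

```python
# Undirected vs directed edges, following the two example graphs above
V = {"A", "B", "C", "D", "E"}

# Undirected: an edge {i, j} is the same as {j, i}, so use frozensets
E_undirected = {frozenset(e) for e in [("A", "B"), ("B", "C"), ("A", "D"), ("D", "E")]}
assert frozenset(("A", "B")) in E_undirected
assert frozenset(("B", "A")) in E_undirected  # same edge

# Directed: the edge (i, j) differs from (j, i), so use ordered tuples
E_directed = {("A", "B"), ("B", "C"), ("A", "D"), ("D", "E")}
assert ("A", "B") in E_directed
assert ("B", "A") not in E_directed
```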
Network Representation
A graph G can be represented by three matrices: the adjacency matrix AG , whose (i, j)
entry is 1 if there is an edge between i and j and 0 otherwise; the degree matrix DG , a
diagonal matrix with the vertex degrees on the diagonal; and the Laplacian LG :
LG = DG − AG .
Adjacency Matrix
For the example graph (vertex order A, B, C, D, E):

AG =
0 1 0 1 0
1 0 1 0 0
0 1 0 0 0
1 0 0 0 1
0 0 0 1 0
Degree Matrix
For the same graph:

DG =
2 0 0 0 0
0 2 0 0 0
0 0 1 0 0
0 0 0 2 0
0 0 0 0 1
Laplacian Matrix
LG =
 2 −1  0 −1  0
−1  2 −1  0  0
 0 −1  1  0  0
−1  0  0  2 −1
 0  0  0 −1  1
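The three matrices for the example graph can be checked numerically; a minimal NumPy sketch (vertex order A, B, C, D, E):

```python
import numpy as np

# Adjacency matrix of the example graph (vertex order A, B, C, D, E)
A_G = np.array([
    [0, 1, 0, 1, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
])

# Degree matrix: diagonal matrix of the row sums of A_G
D_G = np.diag(A_G.sum(axis=1))

# Laplacian: L_G = D_G - A_G
L_G = D_G - A_G

assert np.array_equal(np.diag(D_G), [2, 2, 1, 2, 1])
assert (L_G.sum(axis=1) == 0).all()  # each row of the Laplacian sums to zero
```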
We will not have time to dig into the properties of these matrices
It turns out that in real world networks a number of patterns are commonly encountered
Hubs
In many real world networks there are typically vertices that are
“more important” than others.
Community Structure
Networks for Panels of Economic and Financial Time Series
[Figure: a panel of time series y1 t , . . . , yn t represented as a network]
Dimensionality Reduction
Analysing and understanding the properties of a large dimensional
system is challenging. The network representation can be used as a
dimensionality reduction technique that can enhance interpretation.
Regularised Estimation
It turns out that (most) network estimation techniques boil down
to the estimation of a large dimensional model subject to
appropriate regularization constraints. In a large dimensional
setting regularization can enhance efficiency.
(cf. Ledoit and Wolf, 2004)
Network Definitions
linear, dynamic:
Granger Network (Billio et al., 2012), Connectedness Table (Diebold and Yilmaz,
2014), NETS (Barigozzi and Brownlees, 2016)
nonlinear, contemporaneous:
SKEPTIC (Liu et al., 2012), Tail Networks (Hautsch et al., 2012)
Partial Correlation Network
yt ∼ D(0, Σ)
y1 t = c + θ12 y2 t + θ13 y3 t + θ14 y4 t + θ15 y5 t + u1 t
Let kij denote the (i, j) element of K. Then the relation between θij
and σ²(−i) is given by the following relations

θij = − kij / kii = ρij √( kjj / kii )

and

σ²(−i) = 1 / kii
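These identities can be verified numerically for an arbitrary covariance matrix; a small NumPy check (the 4-variable Σ below is a hypothetical example):

```python
import numpy as np

rng = np.random.default_rng(0)
# A hypothetical positive-definite covariance matrix for 4 variables
B = rng.standard_normal((4, 4))
Sigma = B @ B.T + 4 * np.eye(4)
K = np.linalg.inv(Sigma)  # concentration (precision) matrix

# Regress variable 0 on the others: theta = Sigma_22^{-1} Sigma_21
i, others = 0, [1, 2, 3]
theta = np.linalg.solve(Sigma[np.ix_(others, others)],
                        Sigma[np.ix_(others, [i])]).ravel()
resid_var = Sigma[i, i] - Sigma[np.ix_([i], others)] @ np.linalg.solve(
    Sigma[np.ix_(others, others)], Sigma[np.ix_(others, [i])]
)

# theta_ij = -k_ij / k_ii and sigma^2_(-i) = 1 / k_ii
assert np.allclose(theta, -K[i, others] / K[i, i])
assert np.allclose(float(resid_var), 1.0 / K[i, i])

# theta_ij = rho_ij * sqrt(k_jj / k_ii), with rho_ij the partial correlation
for idx, j in enumerate(others):
    rho_ij = -K[i, j] / np.sqrt(K[i, i] * K[j, j])
    assert np.isclose(theta[idx], rho_ij * np.sqrt(K[j, j] / K[i, i]))
```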
A Deeper Look...

It is interesting and straightforward to show this result.
Proof: Step 1 of 3
Partition yt into two subvectors

yt = [ y1t ; y2t ]

Define

Σ = [ Σ11 Σ12 ; Σ21 Σ22 ]    K = Σ⁻¹ = [ Σ^11 Σ^12 ; Σ^21 Σ^22 ]

Finally, let y1t = yi t and y2t = y(−i) t = (y1 t , . . . , y(i−1) t , y(i+1) t , . . . , yN t )′
Proof: Step 2 of 3
θ as a function of Σ

0 = Cov(ui t , y(−i) t )
  = Cov(yi t − θ′y(−i) t , y(−i) t )
  = Cov(yi t , y(−i) t ) − θ′ Cov(y(−i) t , y(−i) t )
  = Σ12 − θ′Σ22

so that θ = Σ22⁻¹ Σ21

σ²(−i) as a function of Σ

Var(ui t ) = Cov(yi t − θ′y(−i) t , yi t − θ′y(−i) t )
          = Cov(yi t , yi t − θ′y(−i) t )
          = Cov(yi t , yi t ) − Cov(yi t , y(−i) t ) θ

so that σ²(−i) = Σ11 − Σ12 Σ22⁻¹ Σ21
Proof: Step 3 of 3
σ²(−i) as a function of K

By the partitioned inverse formula,

Σ^11 = ( Σ11 − Σ12 Σ22⁻¹ Σ21 )⁻¹ = ( σ²(−i) )⁻¹

so that

σ²(−i) = 1 / kii

θ as a function of K

By the partitioned inverse formula,

Σ^21 = −Σ22⁻¹ Σ21 Σ^11 = −θ Σ^11

so that

θ = −Σ^21 [ Σ^11 ]⁻¹ ,   i.e.   θj = − kij / kii
Sparse Estimation
The workhorse of sparse estimation is the LASSO
Yt = θ′Xt + et ,   et ∼ N(0, σ²) ,   t = 1, . . . , T ,   Xt ∈ R^P
The (classic) LASSO estimator of this model is defined as
θ̂λL = arg min_θ  Σ_{t=1}^{T} (Yt − θ′Xt )² + λ Σ_{j=1}^{P} |θj | ,   λ ≥ 0
Remarks:
LASSO is a shrinkage type estimator. When λ = 0 the estimator
coincides with least squares. When λ is large the effect of the
penalty is to shrink estimates towards zero (like Ridge regression).
Equivalently, the estimator can be written in constrained form:
least squares subject to (in the P = 2 case)
|θ1 | + |θ2 | ≤ rλ
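The shrinkage and sparsity effects can be seen with scikit-learn's Lasso on simulated data (a sketch; note sklearn's objective divides the residual sum of squares by 2T, so its alpha corresponds to λ only up to scaling):

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
T, P = 200, 10
X = rng.standard_normal((T, P))
theta_true = np.zeros(P)
theta_true[:3] = [2.0, -1.5, 1.0]        # only 3 of the 10 coefficients are nonzero
y = X @ theta_true + 0.5 * rng.standard_normal(T)

ols = LinearRegression(fit_intercept=False).fit(X, y)
lasso = Lasso(alpha=0.1, fit_intercept=False).fit(X, y)

# Least squares keeps all P coefficients nonzero ...
assert np.count_nonzero(ols.coef_) == P
# ... while the L1 penalty shrinks estimates and sets some exactly to zero
assert np.count_nonzero(lasso.coef_) < P
assert abs(lasso.coef_[0]) < abs(ols.coef_[0])
```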
LASSO Estimator
Highlights of LASSO:
1 Sparsity Detection. The effect of the absolute value penalty is to
shrink some of the estimated θ coefficients to exactly zero. Under
appropriate conditions, the LASSO can asymptotically detect the true
nonzero parameters of the model.
LASSO Properties
Selection Consistency.
P( sign(θ̂λL i ) = sign(θ0 i ) ) → 1
There are several variants of the LASSO which tackle the issues of
the baseline (for instance, the Adaptive LASSO).
LASSO Computation
Shooting Algorithm
Initialize θ̂L with the least squares estimator.
For k in 1, . . . , P, 1, . . . , P, . . . until convergence:
1 Define
Y(−k) t = Yt − Σ_{j≠k} θ̂jL Xj t
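A complete sketch of the algorithm: the slide shows step 1 (the partial residual); the coordinate update below, which soft-thresholds the univariate regression on that residual, is the standard completion of the step and is an assumption here.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding: the closed-form solution of the 1-d LASSO problem."""
    return np.sign(z) * max(abs(z) - gamma, 0.0)

def shooting_lasso(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent ('shooting') for
    min_theta sum_t (y_t - theta'X_t)^2 + lam * sum_j |theta_j|."""
    T, P = X.shape
    theta = np.linalg.lstsq(X, y, rcond=None)[0]   # initialize at least squares
    for _ in range(n_sweeps):
        for k in range(P):
            # Step 1 on the slide: partial residual removing all terms but k
            y_mk = y - X @ theta + X[:, k] * theta[k]
            # Step 2 (assumed): univariate LASSO update for coordinate k
            theta[k] = soft_threshold(X[:, k] @ y_mk, lam / 2.0) / (X[:, k] @ X[:, k])
    return theta

# Quick check on simulated data (hypothetical coefficients)
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 5))
theta_true = np.array([1.0, 0.0, 0.0, 2.0, 0.0])
y = X @ theta_true + 0.1 * rng.standard_normal(100)
theta_hat = shooting_lasso(X, y, lam=20.0)
assert abs(theta_hat[0] - 1.0) < 0.3 and abs(theta_hat[3] - 2.0) < 0.3
```

With lam = 0 the iterations stay at the least squares solution, and for very large lam every coefficient is thresholded to zero, matching the remarks above.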
Regression Approach
Regression approaches are based on the regression representation
of the series in the panel

yi t = Σ_{j≠i} θij yj t + ui t ,   i = 1, . . . , n,

with Var(ui t ) = σ²(−i) and

kij = −θij kii
Neighborhood Selection
Estimate K by optimizing
K̂ = arg min_{K ∈ Sn}  tr(Σ̂ K) − log det(K) + λ Σ_{i≠j} |kij |
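This penalized likelihood problem is the graphical lasso; scikit-learn ships an implementation (GraphicalLasso, where the penalty λ is called alpha). A simulated sketch showing that a larger penalty yields a sparser estimated K:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(3)
# Simulate from a sparse precision matrix with a chain (tridiagonal) structure
P = 5
K_true = np.eye(P) + np.diag(0.4 * np.ones(P - 1), 1) + np.diag(0.4 * np.ones(P - 1), -1)
Sigma_true = np.linalg.inv(K_true)
X = rng.multivariate_normal(np.zeros(P), Sigma_true, size=1000)

def n_edges(alpha):
    """Number of nonzero off-diagonal entries (edges) in the estimated K."""
    K_hat = GraphicalLasso(alpha=alpha).fit(X).precision_
    return np.count_nonzero(np.abs(K_hat[np.triu_indices(P, 1)]) > 1e-6)

# A larger penalty gives a sparser estimated network
assert n_edges(0.5) <= n_edges(0.01)
```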
Choosing λ
Fitting a network involves choosing a value of λ.
The BIC is often preferred because it penalizes more heavily and
hence delivers a sparser network.
Empirical Illustration
Focus is on estimating the network of idiosyncratic
interconnections of stock returns. We consider a sample of daily
log returns for 93 U.S. blue chips between 2000 and 2013.

rt = β rm t + εt ,   εt ∼ N(0, Σ)
Estimation:
1 Estimate ε̂t as least squares residuals
2 Estimate the partial correlation network of εt using SPACE
[Figure: estimated partial correlation network of the 93 U.S. blue chips, with tickers as vertices]
Networks For Time Series
Proposals:
(Pairwise) Granger Network
Billio, Getmansky, Lo and Pelizzon (2012)
Connectedness Table
Diebold and Yilmaz (2014)
NETS
Barigozzi and Brownlees (2016)
Granger Networks
Granger Causality
Let x and y be two time series. We say that x does not Granger
cause y if a forecast of y based on the past of y and x has the
same MSE as a forecast based on the past of y only:

MSE(Ê(yt+s |yt , yt−1 , . . .)) = MSE(Ê(yt+s |yt , yt−1 , . . . , xt , xt−1 , . . .))
Granger Networks: Definition
where
yA t is the return of firm A in period t
yB t is the return of firm B in period t
ym t is the return of the market in period t
Granger Networks: Estimation
The model

yA t = βA ym t + γA yA t−1 + γAB yB t−1 + εA t ,   εA t ∼ N(0, σ²A )
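Estimation is plain OLS equation by equation; an edge from B to A is drawn when γAB is significantly different from zero. A self-contained simulation sketch (the data-generating coefficients below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 1000
# Simulated returns in which B Granger-causes A (true gamma_AB = 0.3)
y_m = rng.standard_normal(T)
eps_A, eps_B = rng.standard_normal(T), rng.standard_normal(T)
y_B = 0.5 * y_m + eps_B
y_A = np.empty(T)
y_A[0] = eps_A[0]
for t in range(1, T):
    y_A[t] = 0.5 * y_m[t] + 0.3 * y_B[t - 1] + eps_A[t]

# OLS of y_At on (y_mt, y_At-1, y_Bt-1), as in the model above
Z = np.column_stack([y_m[1:], y_A[:-1], y_B[:-1]])
coef, *_ = np.linalg.lstsq(Z, y_A[1:], rcond=None)
resid = y_A[1:] - Z @ coef
sigma2 = resid @ resid / (len(resid) - Z.shape[1])
se = np.sqrt(sigma2 * np.linalg.inv(Z.T @ Z).diagonal())
t_stat_AB = coef[2] / se[2]

# gamma_AB is recovered and significant: draw the edge B -> A
assert abs(coef[2] - 0.3) < 0.1
assert abs(t_stat_AB) > 4
```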
Granger Networks: Empirical Illustration
Billio et al. (2012) have the merit of being the first to introduce
network analysis into the field. A great tool for exploratory analysis.
Some caveats:
Networks change a lot over time!
Connectedness Table
Connectedness Table: Definition
Variance Decomposition
Assume yt has an infinite MA representation
yt = µ + εt + Ψ1 εt−1 + Ψ2 εt−2 + . . . ,   Var(εt ) = Σ
Consider the orthogonalized representation of the shocks εt , that is
εt = A ut with Var(ut ) diagonal. Then

Σ = Var(εt ) = E(εt ε′t ) = E(A ut u′t A′ ) = A E(ut u′t ) A′
  = a1 a′1 Var(u1t ) + a2 a′2 Var(u2t ) + . . . + aN a′N Var(uNt )

where aj denotes the j-th column of A.
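A common concrete choice for A is the Cholesky factor of Σ (so that Var(ut ) = I); a quick NumPy check that the outer products of its columns add up to Σ (the 3×3 Σ below is hypothetical):

```python
import numpy as np

# Orthogonalization via Cholesky: eps_t = A u_t with Var(u_t) = I,
# so Sigma decomposes into the contributions of N orthogonal shocks
Sigma = np.array([[1.0, 0.4, 0.2],
                  [0.4, 1.0, 0.1],
                  [0.2, 0.1, 1.0]])
A = np.linalg.cholesky(Sigma)            # Sigma = A A'
cols = [A[:, [j]] for j in range(3)]     # the columns a_j
assert np.allclose(sum(a @ a.T for a in cols), Sigma)  # a_1 a_1' + ... = Sigma
```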
The forecast error variance of the s-step ahead forecast can then be
decomposed as

Σ_{j=1}^{N} Var(ujt ) [ aj a′j + Ψ1 aj a′j Ψ′1 + Ψ2 aj a′j Ψ′2 + . . . + Ψs−1 aj a′j Ψ′s−1 ]

Thus, the contribution of shock j to the forecast error variance is

Var(ujt ) [ aj a′j + Ψ1 aj a′j Ψ′1 + Ψ2 aj a′j Ψ′2 + . . . + Ψs−1 aj a′j Ψ′s−1 ]
GVD
δijH = σjj⁻¹ Σ_{h=0}^{H} (e′i Ψh Σ ej )²  /  Σ_{h=0}^{H} (e′i Ψh Σ Ψ′h ei )

and

dijH = δijH / Σ_{j=1}^{N} δijH
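For a VAR(1), Ψh = A^h, so the table can be computed directly; a NumPy sketch (the VAR coefficients below are hypothetical). The normalization by the row sum is needed precisely because the rows of the generalized decomposition need not sum to one:

```python
import numpy as np

def gvd(A, Sigma, H):
    """Generalized variance decomposition table d_ij^H for a VAR(1),
    evaluating the delta_ij^H formula above with Psi_h = A^h."""
    N = A.shape[0]
    Psis = [np.linalg.matrix_power(A, h) for h in range(H + 1)]
    delta = np.empty((N, N))
    for i in range(N):
        denom = sum(Psi[i] @ Sigma @ Psi[i] for Psi in Psis)  # e_i' Psi_h Sigma Psi_h' e_i
        for j in range(N):
            num = sum((Psi[i] @ Sigma[:, j]) ** 2 for Psi in Psis)
            delta[i, j] = num / (Sigma[j, j] * denom)
    return delta / delta.sum(axis=1, keepdims=True)           # normalize each row

A = np.array([[0.5, 0.2], [0.1, 0.4]])       # hypothetical VAR(1) coefficients
Sigma = np.array([[1.0, 0.3], [0.3, 1.0]])   # innovation covariance
D = gvd(A, Sigma, H=10)
assert np.allclose(D.sum(axis=1), 1.0)       # each row of the table sums to one
assert (D >= 0).all()
```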
Connectedness Table: Estimation
Connectedness Table: Empirical Illustration
NETS
NETS: Definition
VAR Approximation
The networks can be characterized in terms of the autoregressive matrices
Ak and the covariance matrix of the VAR innovations Σ:

1 Granger network: the directed network in which the set of edges EG is such that

(i, j) ∈ EG ⇔ [Ak ]ij ≠ 0 for some k

2 Contemporaneous network: the undirected network in which the set of edges EC is such that

(i, j) ∈ EC ⇔ [Σ⁻¹ ]ij ≠ 0
NETS: Estimation
Sparse Estimation
NETS Steps
It can be shown that the loss function for the estimation of the
model parameters can be written as
LT (θ) = Σ_{t=1}^{T} Σ_{i=1}^{n} ( yit − Σ_{j=1}^{n} ( aij − Σ_{k≠i} ρik √(ckk /cii ) akj ) yjt−1 − Σ_{k≠i} ρik √(ckk /cii ) ykt )²
NETS: Empirical Application
“Reading” Network
Volatility Network
[Figure: estimated volatility network, Granger edges, with tickers as vertices]
[Figure: estimated volatility network, contemporaneous edges, with tickers as vertices]
Out–of–sample Validation