Suppose that you have a sample of $n$ i.i.d. observations from the following model:

$$Y = X\beta + \varepsilon$$

where $E(\varepsilon|X) = 0$, $E(\varepsilon\varepsilon'|X) = \sigma^2 I_n$, $X$ is an $n \times k$ matrix, and $\beta$ is a $k \times 1$ vector of unknown parameters to estimate. Assume that $E[X'\varepsilon] = w \neq 0$, but that you have at hand an $n \times r$ matrix of instruments $Z$ such that $E(Z'\varepsilon) = 0$ and $r > k$:

$$\hat\beta_{OLS} = (X'X)^{-1}X'Y$$
Then, in the traditional case, we have

$$\hat\varepsilon'\hat\varepsilon = (Y - X\hat\beta)'(Y - X\hat\beta)$$
$$\hat\varepsilon'\hat\varepsilon = Y'Y - Y'X\hat\beta - (X\hat\beta)'Y + (X\hat\beta)'(X\hat\beta)$$

Substituting $Y = X\beta + \varepsilon$ and $X\hat\beta = X(X'X)^{-1}X'Y$ and collecting terms, the cross terms cancel and we are left with

$$\hat\varepsilon'\hat\varepsilon = \varepsilon'\varepsilon - \varepsilon'X(X'X)^{-1}X'\varepsilon$$

Taking expectations and dividing by $n$,

$$\frac{1}{n}E[\hat\varepsilon'\hat\varepsilon] = \sigma^2 - \frac{1}{n}E[\varepsilon'X(X'X)^{-1}X'\varepsilon]$$
$$\frac{1}{n}E[\hat\varepsilon'\hat\varepsilon] \approx \sigma^2 - \frac{1}{n}w'(X'X)^{-1}w$$
$$\hat\beta - \beta = (X'X)^{-1}X'\varepsilon$$

So that,

$$\operatorname{plim}_{n\to\infty}(\hat\beta - \beta) = \operatorname{plim}_{n\to\infty}\left(\frac{X'X}{n}\right)^{-1}\operatorname{plim}_{n\to\infty}\frac{X'\varepsilon}{n}$$
$$\operatorname{plim}_{n\to\infty}(\hat\beta - \beta) = Q^{-1}\operatorname{plim}_{n\to\infty}\frac{X'\varepsilon}{n}$$

However, here both $Q^{-1}$ and $\operatorname{plim} X'\varepsilon/n = w$ exist and are nonzero, and we therefore arrive at

$$\operatorname{plim}_{n\to\infty}(\hat\beta - \beta) = Q^{-1}w \neq 0$$

so OLS is inconsistent.
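The inconsistency above can be checked numerically. A minimal simulation sketch (the sample size, coefficient value, and shared-shock design are illustrative assumptions, not part of the problem):

```python
import numpy as np

# Simulated check that plim(beta_OLS - beta) = Q^{-1} w when E[x*eps] = w != 0.
rng = np.random.default_rng(0)
n = 200_000
beta = 2.0

u = rng.normal(size=n)                 # shared shock: makes x endogenous
x = rng.normal(size=n) + u             # Q = E[x^2] = 2
eps = rng.normal(size=n) + u           # w = E[x * eps] = Var(u) = 1
y = beta * x + eps

X = x.reshape(-1, 1)
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)[0]

# Theoretical limit: beta + Q^{-1} w = 2 + 1/2, so beta_ols drifts to ~2.5
print(beta_ols)
```

With this design the estimate concentrates near $2.5$ rather than the true $\beta = 2$, matching $Q^{-1}w = 0.5$.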
b) Find the GMM (IV) estimator and show that it is a consistent estimator of the parameter β. Is the GMM estimator of β unbiased?

Answer
We know that in GMM the estimator minimizes the criterion

$$\hat\beta_{GMM} = \arg\min_\beta\left(\frac{1}{n}\sum_i \gamma_i(w_i,\beta)\right)' A_n \left(\frac{1}{n}\sum_i \gamma_i(w_i,\beta)\right)$$
Here we know that the first-stage regression is

$$X = Z\Pi' + \vartheta$$

so that the fitted values are

$$\hat X = Z\hat\Pi' = Z(Z'Z)^{-1}Z'X$$

and, since $Z(Z'Z)^{-1}Z'$ is idempotent,

$$\hat X'\hat X = X'Z(Z'Z)^{-1}Z'X$$

Finally, the second-stage regression would be

$$Y = \hat X\hat\beta_{GMM} + \hat\varepsilon$$
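The projection identity used above can be verified numerically. A short sketch (the dimensions below are illustrative assumptions):

```python
import numpy as np

# Check that X_hat'X_hat = X'Z(Z'Z)^{-1}Z'X = X_hat'X, which follows from the
# idempotency of the projection matrix Z(Z'Z)^{-1}Z'.
rng = np.random.default_rng(1)
n, k, r = 500, 2, 4                    # r > k: more instruments than regressors
Z = rng.normal(size=(n, r))
X = rng.normal(size=(n, k))

X_hat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)   # first-stage fitted values

lhs = X_hat.T @ X_hat
rhs = X.T @ Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)
print(np.allclose(lhs, rhs), np.allclose(X_hat.T @ X, rhs))
```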
In the traditional case, we then have

$$\hat\beta_{GMM} = (\hat X'\hat X)^{-1}\hat X'Y$$
$$\hat\beta_{GMM} = (\hat X'\hat X)^{-1}\hat X'(X\beta + \varepsilon)$$
$$\hat\beta_{GMM} = (\hat X'\hat X)^{-1}\hat X'X\beta + (\hat X'\hat X)^{-1}\hat X'\varepsilon$$

Replacing $\hat X = Z(Z'Z)^{-1}Z'X$ we arrive at

$$\hat\beta_{GMM} = \left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}Z'X\beta + \left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}Z'\varepsilon$$
$$\hat\beta_{GMM} = \beta + \left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}Z'\varepsilon$$
So that,

$$\operatorname{plim}_{n\to\infty}\left(\hat\beta_{GMM} - \beta\right) = \operatorname{plim}_{n\to\infty}\left(\frac{X'Z}{n}\left(\frac{Z'Z}{n}\right)^{-1}\frac{Z'X}{n}\right)^{-1}\frac{X'Z}{n}\left(\frac{Z'Z}{n}\right)^{-1}\operatorname{plim}_{n\to\infty}\frac{Z'\varepsilon}{n}$$

The first factor converges to a finite, nonsingular matrix (the moments involved exist and are nonzero), while the second factor converges to $E[Z'\varepsilon] = 0$ by the validity of the instruments, so we arrive at

$$\operatorname{plim}_{n\to\infty}\left(\hat\beta_{GMM} - \beta\right) = 0$$

and the GMM (IV) estimator is consistent. It is, however, generally biased in finite samples: the expectation of $\left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}Z'\varepsilon$ is not zero in general, because $X$ is correlated with $\varepsilon$, so the expectation cannot be factored as in the exogenous case.
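The consistency result can be illustrated with the same kind of simulation (instrument design and values are illustrative assumptions):

```python
import numpy as np

# IV/GMM with r = 2 instruments and k = 1 endogenous regressor: the GMM
# estimate should be near the true beta while OLS is biased upward.
rng = np.random.default_rng(2)
n = 200_000
beta = 2.0

u = rng.normal(size=n)
z1, z2 = rng.normal(size=n), rng.normal(size=n)
x = z1 + 0.5 * z2 + u + rng.normal(size=n)   # correlated with Z and with u
eps = rng.normal(size=n) + u                 # endogenous: E[x*eps] != 0
y = beta * x + eps

X = x.reshape(-1, 1)
Z = np.column_stack([z1, z2])

# beta_GMM = (X'Z (Z'Z)^{-1} Z'X)^{-1} X'Z (Z'Z)^{-1} Z'Y
XZ = X.T @ Z
A = np.linalg.inv(Z.T @ Z)
beta_gmm = np.linalg.solve(XZ @ A @ XZ.T, XZ @ A @ (Z.T @ y))[0]
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)[0]
print(beta_gmm, beta_ols)
```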
c) Obtain $A_0$, $D_0$ and $V_0$ and their sample counterparts $\hat A_N$, $\hat D$ and $\hat V$, considering both heteroskedastic and homoskedastic errors.

Answer
The problem is now

$$S_n(\beta) = \left(\frac{1}{n}Z'(Y - X\beta)\right)' \hat A_N \left(\frac{1}{n}Z'(Y - X\beta)\right)$$

So we can find the first-order conditions for $S_n(\beta)$:

$$\frac{\partial S_n(\beta)}{\partial\beta} = -\frac{2}{n^2}(X'Z)\hat A_N Z'\left(Y - X\hat\beta_{GMM}\right) = 0$$
$$(X'Z)\hat A_N Z'\left(Y - X\hat\beta_{GMM}\right) = 0$$
Under homoskedasticity

We have

$$\hat V_o = E\left(\frac{Z'\varepsilon\varepsilon'Z}{n}\right) = \sigma^2 E\left(\frac{Z'Z}{n}\right)$$

So we replace $\hat V_o$ in $\hat A_N = k\hat V_N^{-1}$:

$$\hat A_N = k\left(\frac{Z'\varepsilon\varepsilon'Z}{n}\right)^{-1} = k\,\frac{1}{\sigma^2}\left(\frac{Z'Z}{n}\right)^{-1}$$

If $k = \sigma^2$, we have

$$\hat A_o = \left(\frac{Z'Z}{n}\right)^{-1}$$
$$\hat\beta_{OIV} = \left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}(X'Z)(Z'Z)^{-1}Z'Y$$

Now it remains to analyze $\hat D_o$; we know that

$$\hat D_o = E\left(\frac{\partial\left(\frac{1}{n}\sum_i \gamma_i(w_i,\beta)\right)}{\partial\beta}\right) = E\left(\frac{\partial\left(\frac{1}{n}Z'(Y - X\beta)\right)}{\partial\beta}\right) = \frac{1}{n}E(-Z'X) = -\frac{1}{n}E(Z'X)$$

with sample counterpart $\hat D = -\frac{1}{n}Z'X$. Finally, for the asymptotic variance matrix $\hat W_o$ we have

$$\hat W_o = \left(\hat D_o'\hat V_o^{-1}\hat D_o\right)^{-1}$$
$$\hat W_o = \left(\left(-\frac{1}{n}Z'X\right)'\left(\sigma^2\frac{Z'Z}{n}\right)^{-1}\left(-\frac{1}{n}Z'X\right)\right)^{-1}$$
$$\hat W_o = \left(\frac{1}{n\sigma^2}X'Z(Z'Z)^{-1}Z'X\right)^{-1} = n\sigma^2\left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}$$
$$\frac{\hat W_o}{n} = \sigma^2\left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}$$
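The collapse of the sandwich $(\hat D_o'\hat V_o^{-1}\hat D_o)^{-1}$ into the closed form above can be checked numerically (sizes and the value of $\sigma^2$ are illustrative assumptions):

```python
import numpy as np

# Verify (D'V^{-1}D)^{-1} = n * sigma^2 * (X'Z(Z'Z)^{-1}Z'X)^{-1}
# under homoskedasticity, using the sample counterparts D and V.
rng = np.random.default_rng(3)
n, k, r = 400, 2, 3
sigma2 = 1.7
Z = rng.normal(size=(n, r))
X = rng.normal(size=(n, k))

D = -(Z.T @ X) / n                 # sample counterpart of D_o
V = sigma2 * (Z.T @ Z) / n         # V_o under homoskedasticity

W = np.linalg.inv(D.T @ np.linalg.solve(V, D))
closed = n * sigma2 * np.linalg.inv(X.T @ Z @ np.linalg.solve(Z.T @ Z, Z.T @ X))
print(np.allclose(W, closed))
```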
Under heteroskedasticity

In the case where it arises, we know that:

$$E(\varepsilon\varepsilon') = \sigma^2\Omega$$

so that

$$\hat V_o = E\left(\frac{Z'\varepsilon\varepsilon'Z}{n}\right) = \sigma^2\frac{Z'\Omega Z}{n}$$

If $k = \sigma^2$, we have

$$\hat A_o = \left(\frac{Z'\Omega Z}{n}\right)^{-1}$$

So we replace this in the estimator,

$$\hat\beta_{GMM} = \left((X'Z)\hat A_N Z'X\right)^{-1}(X'Z)\hat A_N Z'Y$$
$$\hat\beta_{OIV}^{***} = \left(X'Z(Z'\Omega Z)^{-1}Z'X\right)^{-1}(X'Z)(Z'\Omega Z)^{-1}Z'Y$$

The matrix $\hat D_o$ is unchanged:

$$\hat D_o = E\left(\frac{\partial\left(\frac{1}{n}Z'(Y - X\beta)\right)}{\partial\beta}\right) = -\frac{1}{n}E(Z'X)$$

Finally, for the asymptotic variance matrix,

$$\hat W_o = \left(\hat D_o'\hat V_o^{-1}\hat D_o\right)^{-1} = \left(\left(-\frac{1}{n}Z'X\right)'\left(\sigma^2\frac{Z'\Omega Z}{n}\right)^{-1}\left(-\frac{1}{n}Z'X\right)\right)^{-1}$$
$$\hat W_o = n\sigma^2\left(X'Z(Z'\Omega Z)^{-1}Z'X\right)^{-1}$$
$$\frac{\hat W_o}{n} = \sigma^2\left(X'Z(Z'\Omega Z)^{-1}Z'X\right)^{-1}$$
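A sketch of the efficient weighting when $\Omega$ is known and diagonal (the data-generating process, variance profile, and sample size below are all illustrative assumptions):

```python
import numpy as np

# Efficient IV with weighting A = (Z' Omega Z / n)^{-1}: errors are
# heteroskedastic (variance scaled by omega_i) and x is endogenous via u.
rng = np.random.default_rng(4)
n = 100_000
beta = 2.0

u = rng.normal(size=n)
z1, z2 = rng.normal(size=n), rng.normal(size=n)
x = z1 + z2 + u
omega = np.exp(0.5 * z1)                          # conditional variance profile
eps = np.sqrt(omega) * (rng.normal(size=n) + u)   # heteroskedastic, endogenous
y = beta * x + eps

X = x.reshape(-1, 1)
Z = np.column_stack([z1, z2])

# With diagonal Omega we can scale the rows of Z instead of forming an n x n matrix
A = np.linalg.inv(Z.T @ (Z * omega[:, None]) / n)
XZ = X.T @ Z
beta_give = np.linalg.solve(XZ @ A @ XZ.T, XZ @ A @ (Z.T @ y))[0]
print(beta_give)
```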
To correct the problem in $\hat\beta_{OIV}^{***}$ we can redefine the problem.

d) Find the GMM (IV) estimator and show that it is a consistent estimator of the parameter.

Answer
To find the GMM (IV) estimator we solve the following problem:

$$\hat\beta_{GMM} = \arg\min_\beta\left(\frac{1}{n}\sum_i z_i\varepsilon_i\right)'A_n\left(\frac{1}{n}\sum_i z_i\varepsilon_i\right)$$
$$S_n(\beta) = \left(\frac{1}{n}Z'\varepsilon\right)'\hat A_N\left(\frac{1}{n}Z'\varepsilon\right)$$
$$S_n(\beta) = \left(\frac{1}{n}Z'(Y - X\beta)\right)'\hat A_N\left(\frac{1}{n}Z'(Y - X\beta)\right)$$
$$(X'Z)\hat A_N Z'\left(Y - X\hat\beta_{GMM}\right) = 0$$

With

$$\hat A_o = \left(\frac{Z'Z}{n}\right)^{-1}$$

this gives

$$\hat\beta_{GMM} = \left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}Z'Y = \hat\beta_{OIV}$$

Knowing that

$$Y = X\beta + \varepsilon$$

we obtain

$$\hat\beta_{OIV} = \left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}Z'X\beta + \left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}Z'\varepsilon$$
$$\hat\beta_{OIV} = \beta + \left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}Z'\varepsilon$$

So that, since $\operatorname{plim} Z'\varepsilon/n = E[Z'\varepsilon] = 0$,

$$\operatorname{plim}_{n\to\infty}\hat\beta_{OIV} = \beta$$
With the notation

$$\Pi = X'Z(Z'Z)^{-1}$$

we arrive at the 2SLS residual quadratic form

$$\frac{1}{n}\left(Y - X\hat\beta_{2sls}\right)'\left(Y - X\hat\beta_{2sls}\right)$$

and the weighting matrix

$$A_o = kV_o^{-1}$$

with $k > 0$.

Answer

As we saw before, the asymptotic variance matrix $\hat W_o$ is

$$\hat W_o = (D_o'A_oD_o)^{-1}D_o'A_oV_oA_oD_o(D_o'A_oD_o)^{-1}$$

From the derivation in c) we have:

$$\hat V_o = E\left(\frac{Z'\varepsilon\varepsilon'Z}{n}\right) = \sigma^2 E\left(\frac{Z'Z}{n}\right)$$
$$\hat D_o = -\frac{1}{n}E(Z'X)$$
$$\hat A_o = \left(\frac{Z'Z}{n}\right)^{-1}$$

So we substitute into $\hat W_o$:

$$\hat W_o = \left(\left(-\frac{Z'X}{n}\right)'\left(\frac{Z'Z}{n}\right)^{-1}\left(-\frac{Z'X}{n}\right)\right)^{-1}\left(-\frac{Z'X}{n}\right)'\left(\frac{Z'Z}{n}\right)^{-1}\left(\sigma^2\frac{Z'Z}{n}\right)\left(\frac{Z'Z}{n}\right)^{-1}\left(-\frac{Z'X}{n}\right)\left(\left(-\frac{Z'X}{n}\right)'\left(\frac{Z'Z}{n}\right)^{-1}\left(-\frac{Z'X}{n}\right)\right)^{-1}$$

Since $\left(\frac{Z'Z}{n}\right)^{-1}\left(\sigma^2\frac{Z'Z}{n}\right)\left(\frac{Z'Z}{n}\right)^{-1} = \sigma^2\left(\frac{Z'Z}{n}\right)^{-1}$, the middle factor collapses:

$$\hat W_o = \sigma^2 n\left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}X'Z(Z'Z)^{-1}(Z'X)\left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}$$
$$\hat W_o = \sigma^2 n\left(X'Z(Z'Z)^{-1}Z'X\right)^{-1}$$

So, rewriting $\hat W_o$ with $\Pi$, we would have

$$\frac{\hat W_o}{n} = \sigma^2\left(\Pi Z'Z\Pi'\right)^{-1}$$

because we now define our $\hat X$ as

$$\hat X = Z\hat\Pi'$$
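The rewriting in terms of $\Pi$ can be verified numerically (dimensions and $\sigma^2$ below are illustrative assumptions):

```python
import numpy as np

# Check that sigma^2 (Pi Z'Z Pi')^{-1} equals sigma^2 (X'Z(Z'Z)^{-1}Z'X)^{-1}
# for Pi = X'Z(Z'Z)^{-1}, i.e. the W_o/n rewriting above.
rng = np.random.default_rng(5)
n, k, r = 300, 2, 4
sigma2 = 0.9
Z = rng.normal(size=(n, r))
X = rng.normal(size=(n, k))

Pi = X.T @ Z @ np.linalg.inv(Z.T @ Z)

lhs = sigma2 * np.linalg.inv(Pi @ (Z.T @ Z) @ Pi.T)
rhs = sigma2 * np.linalg.inv(X.T @ Z @ np.linalg.solve(Z.T @ Z, Z.T @ X))
print(np.allclose(lhs, rhs))
```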
g) Write Hausman's test statistic involving both 2SLS and OLS.

Answer
The Hausman test has the following hypotheses:

$$H_0: \operatorname{Cov}(\tilde Y, \varepsilon) = 0$$
$$H_1: \operatorname{Cov}(\tilde Y, \varepsilon) \neq 0$$

$$\hat d = \operatorname{var}(\hat\beta_{2sls}) - \operatorname{var}(\hat\beta_{ols})$$

So we now replace the variances:

$$\hat d = \sigma^2\left[\left(X'Z(Z'Z)^{-1}Z'X\right)^{-1} - (X'X)^{-1}\right]$$
$$\hat d = \frac{\sigma^2}{n}\left[\left(\frac{X'Z(Z'Z)^{-1}Z'X}{n}\right)^{-1} - \left(\frac{X'X}{n}\right)^{-1}\right]$$

Thus, using the projection matrix

$$I - M_z = Z(Z'Z)^{-1}Z'$$

we can rewrite this as

$$\hat d = \frac{\sigma^2}{n}\left[\left(\frac{X'[I - M_z]X}{n}\right)^{-1} - \left(\frac{X'X}{n}\right)^{-1}\right]$$
$$\hat d = \frac{\sigma^2}{n}\left[\left(\frac{X'X}{n} - \frac{X'M_zX}{n}\right)^{-1} - \left(\frac{X'X}{n}\right)^{-1}\right]$$
We now analyze the convergence (limit) in probability of $\hat d$:

$$\operatorname{plim}_{n\to\infty}\hat d = \operatorname{plim}_{n\to\infty}\frac{\sigma^2}{n}\left[\left(\operatorname{plim}_{n\to\infty}\frac{X'X}{n} - \operatorname{plim}_{n\to\infty}\frac{X'M_zX}{n}\right)^{-1} - \left(\operatorname{plim}_{n\to\infty}\frac{X'X}{n}\right)^{-1}\right]$$

By asymptotic theory we know that

$$\operatorname{plim}_{n\to\infty}\hat d = \operatorname{plim}_{n\to\infty}\frac{\sigma^2}{n}\left[\left(Q - \operatorname{plim}_{n\to\infty}\frac{X'M_zX}{n}\right)^{-1} - Q^{-1}\right]$$

Since $X'M_zX$ is positive semidefinite, the first inverse is at least as large as $Q^{-1}$, and therefore

$$\operatorname{Var}(\hat\beta_{2sls}) \geq \operatorname{Var}(\hat\beta_{ols})$$

The test statistic of the Hausman test is then

$$H = \left(\hat\beta_{2sls} - \hat\beta_{ols}\right)'\left[\operatorname{var}\left(\hat\beta_{2sls} - \hat\beta_{ols}\right)\right]^{-1}\left(\hat\beta_{2sls} - \hat\beta_{ols}\right)$$
Finally,

$$H = \frac{1}{\sigma^2}\left((\hat X'\hat X)^{-1}\hat X'Y - (X'X)^{-1}X'Y\right)'\left[\left(X'[I - M_z]X\right)^{-1} - (X'X)^{-1}\right]^{-1}\left((\hat X'\hat X)^{-1}\hat X'Y - (X'X)^{-1}X'Y\right)$$
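The statistic can be computed on simulated data where the regressor is deliberately endogenous, so the test should reject $H_0$ (design and values are illustrative assumptions; $\sigma^2$ is estimated from 2SLS residuals):

```python
import numpy as np

# Hausman statistic comparing 2SLS and OLS; under endogeneity it is large
# relative to a chi-squared(1) reference distribution.
rng = np.random.default_rng(6)
n = 5_000
beta = 2.0

u = rng.normal(size=n)
z1, z2 = rng.normal(size=n), rng.normal(size=n)
x = z1 + z2 + u
eps = rng.normal(size=n) + u           # endogenous regressor
y = beta * x + eps

X = x.reshape(-1, 1)
Z = np.column_stack([z1, z2])

b_ols = np.linalg.solve(X.T @ X, X.T @ y)
Xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)       # (I - M_z) X
b_2sls = np.linalg.solve(Xhat.T @ X, Xhat.T @ y)

resid = y - X @ b_2sls                              # 2SLS residuals, original X
s2 = resid @ resid / n

# d_hat = sigma^2 [ (X'(I - M_z)X)^{-1} - (X'X)^{-1} ]  (positive semidefinite)
d = s2 * (np.linalg.inv(Xhat.T @ X) - np.linalg.inv(X.T @ X))
diff = b_2sls - b_ols
H = diff @ np.linalg.inv(d) @ diff
print(H)
```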