and
We note that
Also
P′ = Iₙ − P. Also,
and
(ii) Since Iₙ − P is symmetric and idempotent, we have,
by A.6.2,
where
COROLLARY. If X has rank r (r < p), then Theorem 3.1 still holds, but with p replaced by r.
Proof. Let X₁ be an n × r matrix with r linearly independent columns and having the same column space as X [i.e., C(X₁) = Ω]. Then P = X₁(X₁′X₁)⁻¹X₁′, and (i) and (ii) follow immediately. We can find a matrix L such that X = X₁L, which implies that (cf. Exercises 3j, No. 2)

which is (iii).
EXERCISES 3a
Hence, by
The question now arises as to why we chose β̂ⱼ as our estimate of βⱼ and not some other estimate. We show below that, for a reasonable class of estimates, β̂ⱼ is the estimate of βⱼ with the smallest variance. Here β̂ⱼ can be extracted from β̂ = (β̂₀, β̂₁, …, β̂ₚ₋₁)′ simply by premultiplying by the row vector a′, which contains unity in the (j + 1)th position and zeros elsewhere. It transpires that this special property of β̂ⱼ can be generalized to the case of any linear combination a′β̂ using the following theorem.
THEOREM 3.2 Let θ̂ be the least squares estimate of θ = Xβ, where θ ∈ Ω = C(X) and X may not have full rank. Then among the class of linear unbiased estimates of c′θ, c′θ̂ is the unique estimate with minimum variance.
[We say that c′θ̂ is the best linear unbiased estimate (BLUE) of c′θ.]
Proof. From Section 3.1, θ̂ = PY, where Pθ = PXβ = Xβ = θ (Theorem 3.1, Corollary). Hence E[c′θ̂] = c′Pθ = c′θ for all θ ∈ Ω, so that c′θ̂ [= (Pc)′Y] is a linear unbiased estimate of c′θ. Let d′Y be any other linear unbiased estimate of c′θ. Then c′θ = E[d′Y] = d′θ, or (c − d)′θ = 0, so that (c − d) ⊥ Ω. Therefore, P(c − d) = 0 and Pc = Pd.
Now
so that
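The minimum-variance argument above can be checked numerically. The sketch below (illustrative only; the design matrix, c, and the competitor d are arbitrary choices, not from the text) builds the projection P, then constructs a second unbiased estimator d′Y with X′d = X′c and compares variances under Var[Y] = σ²Iₙ:

```python
import numpy as np

# Theorem 3.2 sketch: var[c'theta^] = sigma^2 ||Pc||^2 vs. var[d'Y] = sigma^2 ||d||^2.
rng = np.random.default_rng(5)
n, p = 6, 2
X = rng.normal(size=(n, p))                  # illustrative design matrix
P = X @ np.linalg.inv(X.T @ X) @ X.T         # orthogonal projection onto C(X)

c = rng.normal(size=n)
# Build d = Pc + z with z orthogonal to C(X); then X'd = X'c, so d'Y is also
# unbiased for c'theta, but d differs from Pc.
z = (np.eye(n) - P) @ rng.normal(size=n)
d = P @ c + z

print(np.allclose(X.T @ d, X.T @ c))         # True: d'Y unbiased for c'theta
print(d @ d >= (P @ c) @ (P @ c))            # True: var[d'Y] >= var[c'theta^]
```

Since ||d||² = ||Pc||² + ||z||², the variance of any competitor exceeds that of c′θ̂ unless z = 0, which is exactly the uniqueness claim.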
EXERCISES 3b
1. Let Yᵢ = β₀ + β₁xᵢ + εᵢ (i = 1, 2, …, n), where E[ε] = 0 and Var[ε] = σ²Iₙ. Find the least squares estimates of β₀ and β₁. Prove that they are uncorrelated if and only if x̄ = 0.
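As a numerical illustration of this exercise (a sketch with an arbitrary set of x-values, not from the text), the off-diagonal entry of the covariance matrix σ²(X′X)⁻¹ vanishes precisely when the x-values have mean zero:

```python
import numpy as np

def ols_cov(x, sigma2=1.0):
    """Covariance matrix sigma^2 * (X'X)^{-1} for the model Y = b0 + b1*x + e."""
    X = np.column_stack([np.ones_like(x), x])
    return sigma2 * np.linalg.inv(X.T @ X)

x_centered = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])   # x-bar = 0
x_shifted = x_centered + 3.0                          # x-bar = 3

cov_c = ols_cov(x_centered)
cov_s = ols_cov(x_shifted)

print(cov_c[0, 1])   # ~0: estimates uncorrelated when x-bar = 0
print(cov_s[0, 1])   # nonzero (about -0.3 here) when x-bar != 0
```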
2. In order to estimate two parameters θ and φ it is possible to make observations of three types: (a) the first type have expectation θ, (b) the second type have expectation θ + φ,
Also, E[η] = 0 and η has dispersion matrix
b₁′ = a′(B′B)⁻¹B′K⁻¹
var[a′β*] < var[(K′b)′Z] = var[b′Y].
Thus a′β* is the unique BLUE of a′β. Note that the ordinary least squares estimate a′β̂ will still be an unbiased estimate of a′β, but var[a′β̂] > var[a′β*].
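The inequality var[a′β̂] ≥ var[a′β*] can be illustrated numerically. A minimal sketch (the design X, the positive-definite V, and the vector a are arbitrary illustrative choices, with σ² = 1):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 8, 2
X = rng.normal(size=(n, p))                 # illustrative design
V = np.diag(np.linspace(1.0, 4.0, n))       # illustrative known V (unequal variances)
Vinv = np.linalg.inv(V)

# Covariance matrices of the two estimators when Var[Y] = V:
# OLS: (X'X)^{-1} X'VX (X'X)^{-1}    GLS: (X'V^{-1}X)^{-1}
XtX_inv = np.linalg.inv(X.T @ X)
cov_ols = XtX_inv @ X.T @ V @ X @ XtX_inv
cov_gls = np.linalg.inv(X.T @ Vinv @ X)

a = np.array([1.0, -1.0])
print(a @ cov_ols @ a >= a @ cov_gls @ a)   # True: GLS variance is no larger
```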
EXERCISES 3k
1. Let Yᵢ = βxᵢ + εᵢ (i = 1, 2), where ε₁ ~ N(0, σ²), ε₂ ~ N(0, 2σ²), and ε₁ and ε₂ are statistically independent. If x₁ = +1 and x₂ = −1, obtain the weighted least squares estimate of β and find the variance of your estimate.
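A quick numerical check of this exercise (a sketch, with σ² set to 1 for illustration): weighting each observation by the reciprocal of its variance gives β* = (2Y₁ − Y₂)/3, whose variance works out to 2σ²/3:

```python
import numpy as np

# Weighted least squares for Y_i = beta*x_i + e_i, Var(e_1) = s2, Var(e_2) = 2*s2.
# beta* = sum(w_i x_i Y_i) / sum(w_i x_i^2), with w_i proportional to 1/Var(e_i).
x = np.array([1.0, -1.0])
w = np.array([1.0, 0.5])            # proportional to 1 and 1/2
s2 = 1.0                            # sigma^2 (illustrative value)

denom = np.sum(w * x**2)            # = 1.5
coef = w * x / denom                # beta* = coef . Y, i.e. (2Y1 - Y2)/3
var_beta = np.sum(coef**2 * np.array([s2, 2 * s2]))
print(var_beta)                     # 2*sigma^2/3, about 0.6667
```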
2. Let Yᵢ (i = 1, 2, …, n) be independent random variables with a common mean θ and variances σ²/wᵢ (i = 1, 2, …, n). Find the linear unbiased estimate of θ with minimum variance, and find this minimum variance.
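This exercise has the classical weighted-mean answer θ* = Σwᵢ Yᵢ / Σwᵢ with variance σ²/Σwᵢ; a Monte Carlo sketch (the weights and θ are illustrative choices, not from the text):

```python
import numpy as np

# BLUE of a common mean theta when Var(Y_i) = sigma^2 / w_i:
# theta* = sum(w_i Y_i) / sum(w_i), Var(theta*) = sigma^2 / sum(w_i).
rng = np.random.default_rng(1)
w = np.array([1.0, 2.0, 4.0])       # illustrative weights
sigma2, theta = 1.0, 5.0

samples = rng.normal(theta, np.sqrt(sigma2 / w), size=(200000, 3))
theta_star = samples @ w / w.sum()
print(theta_star.var())             # close to sigma^2 / sum(w) = 1/7
```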
3. Let Y₁, Y₂, …, Yₙ be independent random variables, and let Yᵢ have a N(iθ, i²σ²) distribution for i = 1, 2, …, n. Find the weighted least squares estimate of θ and prove that its variance is σ²/n.
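A Monte Carlo sketch of this exercise, assuming the distribution is N(iθ, i²σ²) (θ, σ², and n here are illustrative values): the weighted least squares estimate reduces to θ* = (1/n) Σ Yᵢ/i, with variance σ²/n.

```python
import numpy as np

# Assuming Y_i ~ N(i*theta, i^2*sigma^2): weights 1/i^2 and x_i = i give
# theta* = (1/n) * sum(Y_i / i), and Var(theta*) = sigma^2 / n.
rng = np.random.default_rng(4)
n, theta, sigma2 = 5, 2.0, 1.0
i = np.arange(1, n + 1)

samples = rng.normal(i * theta, i * np.sqrt(sigma2), size=(200000, n))
theta_star = (samples / i).mean(axis=1)
print(theta_star.var())             # close to sigma^2 / n = 0.2
```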
4. Let Y₁, Y₂, …, Yₙ be random variables with common mean θ and with dispersion matrix σ²V, where vᵢᵢ = 1 (i = 1, 2, …, n) and vᵢⱼ = ρ (0 < ρ < 1; i, j = 1, 2, …, n; i ≠ j). Find the generalized least squares estimate of θ and show that it is the same as the ordinary least squares estimate. Hint: V⁻¹ takes the same form as V.
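For this exercise, the equicorrelated structure makes V⁻¹1 proportional to 1, so the generalized least squares estimate collapses to the sample mean; a numerical sketch (n and ρ are illustrative choices):

```python
import numpy as np

# V has 1's on the diagonal and rho elsewhere (equicorrelated).
# GLS estimate of a common mean: theta* = (1'V^{-1}Y) / (1'V^{-1}1).
n, rho = 6, 0.4
V = np.full((n, n), rho) + (1 - rho) * np.eye(n)
Vinv = np.linalg.inv(V)

rng = np.random.default_rng(2)
y = rng.normal(size=n)
ones = np.ones(n)

theta_gls = (ones @ Vinv @ y) / (ones @ Vinv @ ones)
print(np.isclose(theta_gls, y.mean()))   # True: GLS coincides with OLS here
```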
5. Let Y ~ N(Xβ, σ²V), where X is n × p of rank p and V is a known positive-definite n × n matrix. If β* is the generalized least squares estimate of β, prove that:
(a) Q = (Y − Xβ*)′V⁻¹(Y − Xβ*)/σ² ~ χ²ₙ₋ₚ.
(b) σ²Q is the quadratic nonnegative unbiased estimate of (n − p)σ² with minimum variance.
(c) If Ŷ* = Xβ* = P*Y, then P* is idempotent but not, in general, symmetric.
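Part (c) is easy to verify numerically; a sketch with arbitrary X and V (both illustrative random choices): P* = X(X′V⁻¹X)⁻¹X′V⁻¹ is an oblique projection onto C(X).

```python
import numpy as np

# P* = X (X'V^{-1}X)^{-1} X'V^{-1}: idempotent, but in general not symmetric.
rng = np.random.default_rng(3)
n, p = 6, 2
X = rng.normal(size=(n, p))
A = rng.normal(size=(n, n))
V = A @ A.T + n * np.eye(n)          # an illustrative positive-definite V
Vinv = np.linalg.inv(V)

P = X @ np.linalg.inv(X.T @ Vinv @ X) @ X.T @ Vinv
print(np.allclose(P @ P, P))         # True: idempotent
print(np.allclose(P, P.T))           # False: not symmetric
```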
6. Suppose that E[Y] = θ, Aθ = 0, and Var[Y] = σ²V, where A is a q × n matrix of rank q and V is a known n × n positive-definite matrix. Let θ* be the generalized least squares estimate of θ; that is, θ* minimizes (Y − θ)′V⁻¹(Y − θ) subject to Aθ = 0. Show that
Y − θ* = VA′γ*,