Remark: The MLE has the best asymptotic properties that we may achieve.

2. UMVUE (another type of estimator)

The UMVUE (uniformly minimum-variance unbiased estimator) is one of the most efficient estimators we may achieve.

Cramer-Rao Lower Bound (CRLB)

The Cramer-Rao lower bound provides a lower bound for the variance that may be achieved by an UNBIASED estimator. It is the bottom line for the variance of an unbiased estimator. Let $\hat{\tau} = T(X_1, \dots, X_n)$ be any unbiased estimator of $\tau(\theta)$; then
$$\mathrm{Var}(\hat{\tau}) \ge \frac{(\tau'(\theta))^2}{n I_X} = \frac{(\tau'(\theta))^2}{n\, E\!\left[\left(\frac{\partial}{\partial\theta} \ln f(X|\theta)\right)^2\right]}$$
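The bound can be checked numerically. The sketch below (not from the notes; the Bernoulli model and all parameter values are illustrative assumptions) uses the fact that for $X \sim \mathrm{Bernoulli}(p)$ the Fisher information is $I_X = 1/(p(1-p))$, so the CRLB for an unbiased estimator of $p$ from $n$ i.i.d. samples is $p(1-p)/n$; the sample mean is unbiased and attains it.

```python
import numpy as np

# Illustrative Monte Carlo check of the CRLB for X ~ Bernoulli(p):
# I_X = 1 / (p(1-p)), so the bound for estimating p from n samples
# is p(1-p)/n.  The sample mean is unbiased and attains the bound.
rng = np.random.default_rng(0)
p, n, reps = 0.3, 50, 200_000

samples = rng.binomial(1, p, size=(reps, n))
estimates = samples.mean(axis=1)      # unbiased estimator of p

crlb = p * (1 - p) / n                # Cramer-Rao lower bound
var_hat = estimates.var()             # Monte Carlo variance of the estimator

print(f"CRLB      = {crlb:.6f}")
print(f"Var(mean) = {var_hat:.6f}")   # should match the bound closely
```

Here the variance of the sample mean sits right at the bound; for an estimator that does not attain the CRLB, `var_hat` would be strictly larger.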
Remark: If $\tau(\theta) = \theta$, then it goes back to the simplest case: the numerator becomes 1.

An if and only if condition for equality in the CRLB:
$$\frac{\partial}{\partial\theta} \sum_{i=1}^n \ln f(x_i|\theta) = K(\theta, n)\,\big(T(x_1, \dots, x_n) - \tau(\theta)\big)$$
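As a worked instance of this equality condition (an example not in the notes): for $X_i \sim N(\theta, \sigma^2)$ with $\sigma^2$ known,

```latex
\frac{\partial}{\partial\theta} \sum_{i=1}^n \ln f(x_i \mid \theta)
  = \sum_{i=1}^n \frac{x_i - \theta}{\sigma^2}
  = \frac{n}{\sigma^2}\,(\bar{x} - \theta),
```

which has exactly the required form with $K(\theta, n) = n/\sigma^2$, $T(x_1, \dots, x_n) = \bar{x}$, and $\tau(\theta) = \theta$; hence $\bar{X}$ attains the CRLB for estimating $\theta$.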
There are two different ways of calculating the Fisher information:
$$I_X = E\!\left[\left(\frac{\partial}{\partial\theta} \ln f(X|\theta)\right)^2\right] = -E\!\left[\frac{\partial^2}{\partial\theta^2} \ln f(X|\theta)\right]$$
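The equivalence of the two forms can be sanity-checked by simulation. The sketch below (an illustrative example, not from the notes) uses $X \sim \mathrm{Exponential}(\theta)$ with density $f(x|\theta) = \theta e^{-\theta x}$, so $\ln f = \ln\theta - \theta x$, the score is $1/\theta - x$, and the second derivative is the constant $-1/\theta^2$; both forms give $I_X = 1/\theta^2$.

```python
import numpy as np

# Monte Carlo sketch of the two equivalent forms of the Fisher
# information for X ~ Exponential(rate = theta):
#   ln f(x|theta) = ln(theta) - theta * x
#   score:             d/dtheta   ln f = 1/theta - x
#   second derivative: d^2/dtheta^2 ln f = -1/theta**2  (constant in x)
rng = np.random.default_rng(1)
theta = 2.0
x = rng.exponential(scale=1 / theta, size=1_000_000)

I_first = np.mean((1 / theta - x) ** 2)   # E[(d/dtheta ln f)^2]
I_second = -(-1 / theta**2)               # -E[d^2/dtheta^2 ln f]

print(I_first, I_second)                  # both close to 1/theta^2 = 0.25
```

The second form is often the easier one to use by hand, since differentiating the log-density twice is usually simpler than squaring and integrating the score.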
CRLB is one way to CHECK whether an unbiased estimator is a UMVUE. Recalling the asymptotic properties of the MLE, we see that the MLE is asymptotically a UMVUE.

Ways of finding a UMVUE

Tools: sufficient statistic, complete statistic
sufficient statistic

A statistic $T(X)$ is sufficient for $\theta$ if the conditional distribution of $X$ given $T(X)$ does not depend on $\theta$. That is, $X \mid T(X) \sim G(x)$, and $G(x)$ must be free of $\theta$.

Ways of finding a sufficient statistic:
(a) By definition: find the conditional distribution, then check that it is free of $\theta$.
(b) By the factorization theorem (an if and only if condition!): $f(x|\theta) = g(T(x), \theta)\, h(x)$.

minimal sufficient statistic

A statistic $T(X)$ is called a minimal sufficient statistic if, for any other sufficient statistic $S(X)$, $T(X)$ is a function of $S(X)$.

complete statistic

A statistic $T(X)$ is a complete statistic if it satisfies the following statement for all $\theta$:
$$E_\theta[g(T)] = 0 \implies P_\theta(g(T) = 0) = 1$$

Ways to a UMVUE

Rao-Blackwell theorem: it is only a way to improve efficiency; the result is NOT necessarily a UMVUE!
Lehmann-Scheffe theorem: sufficient + complete + unbiased $\Rightarrow$ UMVUE

3. Remarks on independence

necessary and sufficient condition
$$F_{X,Y}(x, y) = F_X(x) F_Y(y) \iff X, Y \text{ are independent}$$
or
$$f_{X,Y}(x, y) = f_X(x) f_Y(y) \iff X, Y \text{ are independent}$$

ways to show independence ($\Rightarrow$ independent)
(a) $F_{X,Y} = F_X F_Y$ or $f_{X,Y} = f_X f_Y$
(b) $X_1, \dots, X_n$ independent and $Y_i = g(X_i)$ $\Rightarrow$ $Y_1, \dots, Y_n$ independent
(c) special case for the normal distribution: if $X_1, \dots, X_n$ follow a multivariate normal distribution and $\mathrm{Cov}(X_i, X_j) = 0$, then $X_i$ and $X_j$ are independent.

properties of independence (independent $\Rightarrow$)
(a) $F_{X,Y} = F_X F_Y$ or $f_{X,Y} = f_X f_Y$
(b) $M_{X+Y}(t) = M_X(t) M_Y(t)$
(c) $E[XY] = E[X]\,E[Y]$, so $\mathrm{Cov}(X, Y) = E[(X - EX)(Y - EY)] = E[XY] - E[X]E[Y] = 0$ and $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$.
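The normal special case (c) above can be illustrated numerically. In the sketch below (an illustrative example, not from the notes), $X, Y$ are i.i.d. $N(0,1)$ and we set $U = X + Y$, $V = X - Y$; then $(U, V)$ is jointly normal with $\mathrm{Cov}(U, V) = 0$, so $U$ and $V$ are independent, which we check through the joint CDF factorizing at a test point.

```python
import numpy as np

# Sketch of the normal special case: (U, V) jointly normal with
# Cov(U, V) = 0 implies U and V are independent.
# Here U = X + Y and V = X - Y with X, Y i.i.d. N(0, 1).
rng = np.random.default_rng(3)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)
u, v = x + y, x - y

print("Cov(U, V) ~", np.cov(u, v)[0, 1])   # approximately 0

# Independence check via the joint CDF at one test point:
# P(U <= 1, V <= 1) should match P(U <= 1) * P(V <= 1).
joint = np.mean((u <= 1) & (v <= 1))
prod = np.mean(u <= 1) * np.mean(v <= 1)
print(joint, prod)                          # approximately equal
```

Note that zero covariance alone does not imply independence in general; the joint normality of $(U, V)$ is what makes the implication valid here.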