
CALIFORNIA INSTITUTE OF TECHNOLOGY

Ma 2b KC Border
Introduction to Probability and Statistics February 2013
Notes on The Fréchet–Cramér–Rao Lower Bound
The Larsen and Marx textbook states the Cramér–Rao Lower Bound [7, Theorem 5.5.1,
p. 320], but does not derive it. In this note I present a slight generalization of their statement.
The argument is essentially that of B. L. van der Waerden [9, pp. 160–162], who points out
that Maurice Fréchet [5] seems to have beaten Harald Cramér [3], [4, §§ 32.3–32.8, pp. 477–497,
esp. p. 480] and C. Radhakrishna Rao [8] by a couple of years.
The FCR result puts a lower bound on the variance of estimators. Let $X_1, \dots, X_n$ be
independent and identically distributed random variables with parametric density function
$f(x; \theta)$. The joint density $f_n$ at $\mathbf{x} = (x_1, \dots, x_n)$ is given by
\[
  f_n(\mathbf{x}; \theta) = f(x_1; \theta) f(x_2; \theta) \cdots f(x_n; \theta).
\]
This is also the likelihood function for $\theta$.
A statistic is a random variable $T$ that is a function of $X_1, \dots, X_n$, say
\[
  T = T(X_1, \dots, X_n).
\]
The expectation of $T$ is the multiple integral
\[
  E_\theta T = \int T(\mathbf{x}) f_n(\mathbf{x}; \theta) \, d\mathbf{x},
\]
and it depends on the unknown parameter $\theta$. The variance of $T$ is given by
\[
  \operatorname{Var}_\theta T = E_\theta \bigl( T - E_\theta T \bigr)^2.
\]
We say that $T$ is an unbiased estimator of $\theta$ if for each $\theta$
\[
  E_\theta T = \theta.
\]
More generally, define the bias of $T$ as
\[
  b(\theta) = E_\theta T - \theta.
\]
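For example, if each $X_i$ has mean $\mu$ and variance $\sigma^2$, the familiar estimator
$\frac{1}{n}\sum_{i=1}^n (X_i - \bar X)^2$ of $\sigma^2$ has expectation $\frac{n-1}{n}\sigma^2$,
so its bias is $-\sigma^2/n$.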
1 Theorem (Fréchet–Cramér–Rao) Assume $f$ is continuously differentiable and assume
that the support $\{x : f(x; \theta) > 0\}$ does not depend on $\theta$. Let $T$ be an estimator of $\theta$. Then
$\operatorname{Var}_\theta T$ is bounded below, and:
\[
  \operatorname{Var}_\theta T \ge
  \frac{\bigl(1 + b'(\theta)\bigr)^2}{n \, E_\theta \Bigl[\bigl(\frac{\partial}{\partial\theta} \log f(X; \theta)\bigr)^2\Bigr]}.
\]
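To make the bound concrete, here is a worked illustration of my own (not in the theorem's
statement). Take the exponential density $f(x; \theta) = \theta e^{-\theta x}$ for $x > 0$. Then
\[
  \frac{\partial}{\partial\theta} \log f(x; \theta) = \frac{1}{\theta} - x,
  \qquad
  E_\theta\Bigl[\Bigl(\frac{1}{\theta} - X\Bigr)^2\Bigr] = \operatorname{Var}_\theta X = \frac{1}{\theta^2},
\]
so any unbiased estimator of $\theta$ based on $X_1, \dots, X_n$ has variance at least $\theta^2/n$.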
Proof: By definition of the bias,
\[
  \theta + b(\theta) = E_\theta T = \int T(\mathbf{x}) f_n(\mathbf{x}; \theta) \, d\mathbf{x}. \tag{1}
\]
Let $f'_n(\mathbf{x}; \theta)$ indicate the partial derivative with respect to $\theta$. Differentiate both sides of (1)
to get (differentiating under the integral sign):
\[
  1 + b'(\theta) = \int T(\mathbf{x}) f'_n(\mathbf{x}; \theta) \, d\mathbf{x}
  = \int T(\mathbf{x}) \frac{f'_n(\mathbf{x}; \theta)}{f_n(\mathbf{x}; \theta)} \, f_n(\mathbf{x}; \theta) \, d\mathbf{x}. \tag{2}
\]
Notice that the last term is an expected value. Let $L$ denote the log-likelihood,
\[
  L(\mathbf{x}; \theta) = \log f_n(\mathbf{x}; \theta),
\]
and observe that
\[
  \frac{f'_n(\mathbf{x}; \theta)}{f_n(\mathbf{x}; \theta)} = L'(\mathbf{x}; \theta).
\]
Okay, so now we can rewrite (2) as
\[
  1 + b'(\theta) = E_\theta \bigl[ T(\mathbf{x}) L'(\mathbf{x}; \theta) \bigr]. \tag{3}
\]
Take the fact that
\[
  1 = \int f_n(\mathbf{x}; \theta) \, d\mathbf{x},
\]
and differentiate both sides to get
\[
  0 = \int f'_n(\mathbf{x}; \theta) \, d\mathbf{x}
    = \int \frac{f'_n(\mathbf{x}; \theta)}{f_n(\mathbf{x}; \theta)} \, f_n(\mathbf{x}; \theta) \, d\mathbf{x}
    = E_\theta L'(\mathbf{x}; \theta). \tag{4}
\]
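As a quick check of (4) in a familiar case (my own example): for the normal density
$f(x; \theta) = \frac{1}{\sqrt{2\pi}} e^{-(x-\theta)^2/2}$ we get
\[
  \frac{\partial}{\partial\theta} \log f(x; \theta) = x - \theta,
\]
which indeed has $E_\theta$-expectation zero.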
Multiply both sides of this by $E_\theta T$ and subtract it from (3) to get
\[
  1 + b'(\theta) = E_\theta \bigl[ \bigl( T(\mathbf{x}) - E_\theta T \bigr) L'(\mathbf{x}; \theta) \bigr]. \tag{5}
\]
The right-hand side is the expectation of a product, so we can use the Schwarz Inequality
(Lemma 2 below) to get a bound on it. Square both sides of (5) to get
\[
  \bigl(1 + b'(\theta)\bigr)^2
  = \Bigl( E_\theta \bigl[ \bigl( T(\mathbf{x}) - E_\theta T \bigr) L'(\mathbf{x}; \theta) \bigr] \Bigr)^2
  \le \underbrace{E_\theta \bigl( T - E_\theta T \bigr)^2}_{= \operatorname{Var}_\theta T} \; E_\theta \bigl( L'^2 \bigr).
\]
Rearranging this gives
\[
  \operatorname{Var}_\theta T \ge
  \frac{\bigl(1 + b'(\theta)\bigr)^2}{E_\theta \Bigl[\bigl(\frac{\partial}{\partial\theta} \log f_n(\mathbf{x}; \theta)\bigr)^2\Bigr]}. \tag{6}
\]
The joint density $f_n$ is a product, so
\[
  \frac{\partial}{\partial\theta} \log f_n(\mathbf{x}; \theta)
  = \sum_{i=1}^{n} \frac{\partial}{\partial\theta} \log f(x_i; \theta). \tag{7}
\]
Now the same argument as in (4) shows that $E_\theta \frac{\partial}{\partial\theta} \log f(X_i; \theta) = 0$, so (7) is a sum of $n$
independent mean-zero variables. Thus its variance is just $n$ times the expected square of any
one of them. That is, (6) can be rewritten as
\[
  \operatorname{Var}_\theta T \ge
  \frac{\bigl(1 + b'(\theta)\bigr)^2}{n \, E_\theta \Bigl[\bigl(\frac{\partial}{\partial\theta} \log f(X; \theta)\bigr)^2\Bigr]}.
\]
When the bias is always zero, then $b'(\theta) = 0$, and this reduces to Theorem 5.5.1 in Larsen
and Marx [7].
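In that case the bound reads
\[
  \operatorname{Var}_\theta T \ge \frac{1}{n \, E_\theta \bigl[\bigl(\frac{\partial}{\partial\theta} \log f(X; \theta)\bigr)^2\bigr]},
\]
where the denominator is $n$ times what is usually called the Fisher information of a single
observation. Here is a quick numerical sanity check of the unbiased case, a minimal sketch of
my own (not part of the note; it assumes the NumPy library is available). For $X_i \sim N(\theta, 1)$
the score is $x - \theta$, its expected square is $1$, so any unbiased estimator has variance at
least $1/n$, and the sample mean attains this exactly.

    import numpy as np

    # Monte Carlo check of the unbiased Frechet-Cramer-Rao bound for
    # X_i ~ N(theta, 1): the Fisher information per observation is 1,
    # so the bound is 1/n, and the sample mean should attain it.
    rng = np.random.default_rng(0)
    theta, n, reps = 2.0, 50, 100_000

    samples = rng.normal(loc=theta, scale=1.0, size=(reps, n))
    xbar = samples.mean(axis=1)  # unbiased estimator of theta

    print("Monte Carlo Var(xbar):", xbar.var())  # close to 0.02
    print("FCR lower bound 1/n  :", 1.0 / n)     # exactly 0.02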
I did play fast and loose with some of the math. In particular, I assumed I could differentiate
under the integral sign, and I assumed that the bias was differentiable. See [2] for when this is
permissible. I also assumed that the denominator above was nonzero.
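The support condition is not a technicality either. A standard counterexample (not worked out
in this note, but easy to verify): for the uniform density $f(x; \theta) = 1/\theta$ on $[0, \theta]$,
the support depends on $\theta$ and the conclusion fails. The unbiased estimator
$T = \frac{n+1}{n} \max_i X_i$ has
\[
  \operatorname{Var}_\theta T = \frac{\theta^2}{n(n+2)},
\]
which shrinks like $1/n^2$, faster than the $1/n$ rate that the theorem would permit.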
A Appendix
You know this result, but the proof van der Waerden gave was so pretty, I reproduced it here.
2 Lemma (Schwarz Inequality) If $Y$ and $Z$ are random variables with finite second
moments, then
\[
  (E YZ)^2 \le (E Y^2)(E Z^2).
\]
Proof: (van der Waerden [9, p. 161]) The quadratic form in $(a, b)$ defined by
\[
  E(aY + bZ)^2 = (E Y^2) a^2 + 2 (E YZ) ab + (E Z^2) b^2
\]
is positive semidefinite, so its determinant is nonnegative (see, e.g., [1]). That is,
\[
  (E Y^2)(E Z^2) - (E YZ)^2 \ge 0.
\]
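If you do not want to chase down [1], here is a direct way to see the determinant step (my own
elaboration). Write $\alpha = E Y^2$, $\beta = E YZ$, $\gamma = E Z^2$ and evaluate the form at
$(a, b) = (\beta, -\alpha)$:
\[
  0 \le E(\beta Y - \alpha Z)^2 = \alpha\beta^2 - 2\alpha\beta^2 + \gamma\alpha^2
  = \alpha(\alpha\gamma - \beta^2).
\]
If $\alpha > 0$ this gives $\alpha\gamma - \beta^2 \ge 0$; if $\alpha = 0$, then $Y = 0$ almost
surely, so $\beta = 0$ and the inequality holds trivially.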
References

[1] K. C. Border. 2001. More than you wanted to know about quadratic forms. On-line note.
http://www.hss.caltech.edu/~kcb/Notes/QuadraticForms.pdf

[2] K. C. Border. 2013. Differentiating an integral. On-line note.
http://www.hss.caltech.edu/~kcb/Notes/DifferentiatingAnIntegral.pdf
[3] H. Cramér. 1946. A contribution to the theory of statistical estimation. Skandinavisk
Aktuarietidskrift 29:85–94.

[4] H. Cramér. 1946. Mathematical methods of statistics. Number 34 in Princeton Mathematical
Series. Princeton, New Jersey: Princeton University Press. Reprinted 1974.

[5] M. Fréchet. 1943. Sur l'extension de certaines évaluations statistiques au cas de petits
échantillons. Revue de l'Institut International de Statistique / Review of the International
Statistical Institute 11(3/4):182–205. http://www.jstor.org/stable/1401114

[6] J. L. Hodges, Jr. and E. L. Lehmann. 1951. Some applications of the Cramér–Rao inequal-
ity. In J. Neyman, ed., Proceedings of the Second Berkeley Symposium on Mathematical
Statistics and Probability II, Part I, pages 13–22, Berkeley. University of California Press.
http://projecteuclid.org/euclid.bsmsp/1200500213

[7] R. J. Larsen and M. L. Marx. 2012. An introduction to mathematical statistics and its
applications, fifth ed. Boston: Prentice Hall.

[8] C. R. Rao. 1945. Information and the accuracy attainable in the estimation of statistical
parameters. Bulletin of the Calcutta Mathematical Society 37(3):81–91.
http://bulletin.calmathsoc.org/article.php?ID=B.1945.37.14

[9] B. L. van der Waerden. 1969. Mathematical statistics. Number 156 in Grundlehren der
mathematischen Wissenschaften in Einzeldarstellungen mit besonderer Berücksichtigung
der Anwendungsgebiete. New York, Berlin, and Heidelberg: Springer-Verlag. Translated
by Virginia Thompson and Ellen Sherman from Mathematische Statistik, published by
Springer-Verlag in 1965, as volume 87 in the series Grundlehren der mathematischen Wis-
senschaften.

[10] J. Wolfowitz. 1947. The efficiency of sequential estimates and Wald's equation for sequen-
tial processes. Annals of Mathematical Statistics 18(2):215–230.
http://www.jstor.org/stable/2235780