
National University of Singapore

ST5215: Advanced Statistical Theory (I)

Semester 1: AY 2016-2017

Time Allowed: 2 Hours

INSTRUCTIONS TO CANDIDATES

1. Please write your matriculation number only. Do not write your name.

2. This examination paper contains FOUR (4) questions and comprises THREE (3) printed
pages.

3. Candidates must answer ALL questions. Each question carries 25 marks. The total mark
for this paper is 100.

4. Please start each question on a new page.

5. This is a CLOSED BOOK examination, but one A4-size help sheet is allowed.

6. Electronic devices such as calculators, hand phones, iPads and PCs are not allowed in
the examination.

1. Let X1, X2, . . . be a sequence of independent random variables that take only the three
values {−1, 0, 1} and satisfy
\[
P(X_n = 1) = \frac{1}{2n^{\theta}}, \qquad P(X_n = -1) = \frac{1}{2n^{\theta}}, \qquad P(X_n = 0) = 1 - \frac{1}{n^{\theta}},
\]
for n = 1, 2, . . . and a constant θ > 0 (see the simulation sketch following this question).

(i) Consider the statement “Xn → 0 almost surely as n → ∞”. Find the smallest constant
c > 0, such that this statement is true if θ > c. To justify your answer, show that the statement
could be false if 0 < θ ≤ c.

(ii) If 0 < θ ≤ 1, show that Lindeberg's condition holds for the sequence, i.e.
\[
\lim_{n \to \infty} \frac{1}{\sigma_n^2} \sum_{i=1}^{n} E\left[ (X_i - EX_i)^2 \, I\{|X_i - EX_i| > \varepsilon \sigma_n\} \right] = 0 \quad \text{for any } \varepsilon > 0,
\]
where $\sigma_n^2 = \sum_{i=1}^{n} \mathrm{Var}(X_i)$. State the central limit theorem for the sequence.

(iii) Suppose 0 < θ ≤ 1. Let $\bar{X}_n = n^{-1} \sum_{i=1}^{n} X_i$ and let Φ(·) be the standard normal cumulative
distribution function. Find a sequence of positive numbers {cn}, a constant µ ∈ R, and a
positive constant v > 0, such that
\[
c_n \left( \Phi(\bar{X}_n) - \mu \right) \xrightarrow{d} N(0, v) \quad \text{as } n \to \infty,
\]
where $\xrightarrow{d}$ denotes convergence in distribution, and N(0, v) is the normal distribution with
mean 0 and variance v.
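
The following is a minimal simulation sketch for Question 1, written in Python under illustrative, hypothetical choices (θ = 0.5, n = 10 000, 2 000 replications; none of these values come from the paper). It draws the sequence, standardizes the partial sum by σn, and compares the result with the standard normal cdf, in the spirit of the central limit theorem in part (ii).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

theta = 0.5          # illustrative value in (0, 1] (hypothetical choice)
n = 10_000           # length of the sequence
reps = 2_000         # number of Monte Carlo replications

idx = np.arange(1, n + 1)
p = 1.0 / idx**theta          # P(X_i != 0) = 1 / i^theta
sigma2_n = np.sum(p)          # Var(X_i) = 1 / i^theta, so sigma_n^2 is their sum

# Draw `reps` independent copies of (X_1, ..., X_n):
# X_i is +1 or -1 with probability 1/(2 i^theta) each, and 0 otherwise.
u = rng.random((reps, n))
signs = rng.choice([-1.0, 1.0], size=(reps, n))
x = np.where(u < p, signs, 0.0)

# Standardized sums; by the CLT of part (ii) these should be close to N(0, 1).
z = x.sum(axis=1) / np.sqrt(sigma2_n)
print("empirical P(Z <= 1):", np.mean(z <= 1.0))
print("Phi(1):             ", norm.cdf(1.0))
```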

2. Let X1 , . . . , Xn be independent and identically distributed random variables with the uniform
distribution on the interval (0, θ ), for θ > 0. Let

\[
T_{1n} = \frac{2}{n} \sum_{i=1}^{n} X_i, \qquad T_{2n} = X_{(n)} = \max(X_1, \ldots, X_n)
\]

be two different estimators of θ (see the simulation sketch following this question).

(i) Find the asymptotic distribution of T1n .

(ii) Show that $n(\theta - T_{2n}) \xrightarrow{d} \mathrm{Exp}(\theta)$ as n → ∞, where $\xrightarrow{d}$ denotes convergence in distribution,
and Exp(θ) is the exponential distribution with density $f_\theta(x) = \theta^{-1} e^{-x/\theta}$ for x > 0.

(iii) Find the mean squared error (MSE) of T1n and T2n respectively, for each finite n ≥ 1. Which
estimator has a smaller MSE?

(iv) Find the UMVUE of θ².
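
A minimal Monte Carlo sketch for Question 2, using hypothetical values θ = 2, n = 50 and 100 000 replications (chosen here for illustration only). It evaluates T1n and T2n on simulated Uniform(0, θ) samples and reports their empirical mean squared errors; the closed forms quoted in the comments are the standard MSE expressions for these two estimators.

```python
import numpy as np

rng = np.random.default_rng(1)

theta = 2.0        # illustrative true parameter (hypothetical choice)
n = 50             # illustrative sample size
reps = 100_000     # Monte Carlo replications

x = rng.uniform(0.0, theta, size=(reps, n))
t1 = 2.0 * x.mean(axis=1)     # T_1n = (2/n) * sum_i X_i
t2 = x.max(axis=1)            # T_2n = X_(n) = max_i X_i

# Empirical mean squared errors of the two estimators of theta.
print("MSE(T1n) ~", np.mean((t1 - theta) ** 2))   # standard result: theta^2 / (3n)
print("MSE(T2n) ~", np.mean((t2 - theta) ** 2))   # standard result: 2 theta^2 / ((n+1)(n+2))
```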



3. Let X1, . . . , Xm be independent and identically distributed random variables with the normal
distribution N(0, θ²). Let Y1, . . . , Yn be independent and identically distributed random vari-
ables with the normal distribution N(0, 1/θ²). Suppose that the Xi's and Yj's are independent.
Here θ > 0 is a parameter. Let $T_1 = \sum_{i=1}^{m} X_i^2$, $T_2 = \sum_{j=1}^{n} Y_j^2$, and T = (T1, T2) (see the simulation
sketch following this question).

(i) Show that T is the minimal sufficient statistic for θ .

(ii) Show that each of T1 and T2 is a complete statistic for θ .

(iii) Show that S = T1 T2 is an ancillary statistic.

(iv) Show that T is not a complete statistic.
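
A minimal simulation sketch for part (iii) of Question 3, with hypothetical dimensions m = 5 and n = 7 (not specified by the question). It draws T1 and T2 for several values of θ and prints empirical quantiles of S = T1·T2; if S is ancillary, these quantiles should not change with θ.

```python
import numpy as np

rng = np.random.default_rng(2)

m, n, reps = 5, 7, 200_000   # illustrative dimensions and replication count

def draw_S(theta):
    """Simulate S = T1 * T2, with T1 = sum of X_i^2 for X_i ~ N(0, theta^2)
    and T2 = sum of Y_j^2 for Y_j ~ N(0, 1/theta^2)."""
    x = rng.normal(0.0, theta, size=(reps, m))
    y = rng.normal(0.0, 1.0 / theta, size=(reps, n))
    return (x**2).sum(axis=1) * (y**2).sum(axis=1)

# The empirical quantiles of S should agree across different theta values.
for theta in (0.5, 1.0, 3.0):
    s = draw_S(theta)
    print(theta, np.quantile(s, [0.25, 0.5, 0.75]).round(2))
```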

4. Let X1 , . . . , Xn be independent and identically distributed random variables with the Gamma(2, θ )
distribution whose probability density function is given by

\[
f_\theta(x) = \theta^{2} x e^{-\theta x},
\]

for x > 0, where θ > 0 is a parameter. Assume that n ≥ 3. Define two statistics (see the
simulation sketch following this question)


\[
T_{1n} = \frac{1}{n} \sum_{i=1}^{n} \frac{1}{X_i}, \qquad T_{2n} = \frac{2n}{\sum_{i=1}^{n} X_i}.
\]


Hints: The Gamma(α, β) distribution (α > 0, β > 0) has the density function
\[
f_{\alpha,\beta}(x) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\beta x}, \quad \text{for } x > 0.
\]
The gamma function satisfies $\Gamma(k) = \int_{0}^{\infty} t^{k-1} e^{-t}\, dt = (k-1)!$ for k = 1, 2, . . ..
If Yj (j = 1, . . . , m) are independent Gamma(αj, β) random variables, then $\sum_{j=1}^{m} Y_j$ follows
Gamma($\tilde{\alpha}$, β) with $\tilde{\alpha} = \sum_{j=1}^{m} \alpha_j$.


(i) Find a minimal sufficient and complete statistic for θ .

(ii) Show that both T1n and T2n converge to θ almost surely as n → ∞.

(iii) Find the UMVUE of θ .

(iv) Find the Cramér-Rao lower bound for the unbiased estimators of θ. Does the UMVUE of
θ achieve this lower bound?
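
A minimal simulation sketch for part (ii) of Question 4, with the hypothetical value θ = 1.5 (chosen here only for illustration). It draws Gamma(2, θ) samples of increasing size and evaluates T1n and T2n, both of which should approach θ. Note that numpy's gamma generator is parametrized by shape and scale, so rate θ corresponds to scale 1/θ.

```python
import numpy as np

rng = np.random.default_rng(3)

theta = 1.5                          # illustrative true parameter (hypothetical choice)
for n in (10, 1_000, 100_000):
    # Gamma(2, theta) in the paper's shape/rate parametrization:
    # numpy uses shape and scale, so scale = 1/theta.
    x = rng.gamma(2.0, 1.0 / theta, size=n)
    t1 = np.mean(1.0 / x)            # T_1n = (1/n) * sum_i 1/X_i
    t2 = 2.0 * n / x.sum()           # T_2n = 2n / sum_i X_i
    print(n, round(t1, 3), round(t2, 3))
```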

– END OF PAPER –
