
Statistical Computing

Homework 3
Stephen Downing
14 March 2014
3.1 Question 1
3.1.1 (a)
Since Y has a density g(y), calculating the marginal probability p of an accepted rv (i.e., not conditioned on U \le f(Y)/(c\,g(Y)), but originating from the geometric distribution of trial counts) yields (from [1]):

p = \int \frac{f(y)}{c\,g(y)}\, g(y)\, dy = \frac{1}{c} \int f(y)\, dy = \frac{1}{c}. \qquad (1)

The probability of acceptance is thus 1/c.
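As a quick numerical check of this result, the following sketch estimates the acceptance rate by simulation. The target and proposal here (a Beta(2,2) density f(y) = 6y(1-y) with a Uniform(0,1) proposal, so c = 1.5) are illustrative assumptions, not part of the problem:

```python
import random

def accept_reject_rate(n=200_000, seed=42):
    """Empirically estimate the A-R acceptance probability.

    Assumed example: target f(y) = 6y(1-y) (Beta(2,2)) on (0,1),
    proposal g = Uniform(0,1), majorizing constant c = sup f/g = 1.5.
    The theory above says the rate should be close to 1/c = 2/3.
    """
    rng = random.Random(seed)
    c = 1.5
    accepted = 0
    for _ in range(n):
        y = rng.random()            # Y ~ g
        u = rng.random()            # U ~ U[0,1]
        if u <= 6 * y * (1 - y) / c:
            accepted += 1
    return accepted / n

rate = accept_reject_rate()
# rate should be close to 1/c = 2/3
```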
3.1.2 (b)
The expected number of trials until acceptance is E(N) = c. Additionally, since the number of rejections should be minimized for computational efficiency, c should be the smallest value that satisfies the bounded inequality 0 < f(Y)/(c\,g(Y)) \le 1, which is the minimum c such that c\,g(x) still majorizes f(x): c = \sup_x \{ f(x)/g(x) \}.
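A minimal numerical sketch of finding c = \sup_x f(x)/g(x) by grid search. The Beta(2,2)/Uniform(0,1) pair is an assumed example (the supremum sits at x = 1/2, giving c = 1.5 analytically):

```python
def majorizing_constant(f, g, grid):
    """Approximate c = sup_x f(x)/g(x) over a finite grid.

    A numerical sketch; it assumes the supremum is attained (or well
    approximated) inside the grid and that g > 0 on the grid.
    """
    return max(f(x) / g(x) for x in grid)

# Assumed example: target Beta(2,2), proposal Uniform(0,1).
f = lambda x: 6 * x * (1 - x)
g = lambda x: 1.0
grid = [i / 1000 for i in range(1, 1000)]
c = majorizing_constant(f, g, grid)
# analytic answer: f(0.5)/g(0.5) = 1.5
```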
3.1.3 (c)
Assume the cdf of the rv X, F(x) = P(X \le x), is unknown. Denote the suitable proposal rv to be used in the Acceptance-Rejection (A-R) algorithm Y \sim G (i.e., Y follows the distribution G), along with U \sim U[0,1], and some constant c that satisfies

U \le \frac{1}{c}\, \frac{f(Y)}{g(Y)}.
Let A = \{Y \le y\}, the total area (or density) in which the second uniform deviate can be located, and B = \{U \le f(Y)/(c\,g(Y))\}, the acceptance region. Given that Bayes' Theorem tells us P(A|B) = [P(B|A)P(A)]/P(B), and knowing P(B) = p = 1/c, we can show that the conditional distribution of Y given U \le f(Y)/(c\,g(Y)) actually follows the distribution from which we intended to sample, Y \sim F, and therefore has the same distribution as the accepted sample X. First we simplify the conditional probability (e.g., by [1], [2]).
P\left(U \le \frac{f(Y)}{c\,g(Y)} \,\Big|\, Y \le y\right) = \frac{P\left(U \le \frac{f(Y)}{c\,g(Y)},\; Y \le y\right)}{G(y)} \qquad (2)

= \int_{-\infty}^{y} \frac{P\left(U \le \frac{f(Y)}{c\,g(Y)} \,\big|\, Y = w\right)}{G(y)}\, g(w)\, dw

= \frac{1}{G(y)} \int_{-\infty}^{y} \frac{f(w)}{c\,g(w)}\, g(w)\, dw

= \frac{1}{c\,G(y)} \int_{-\infty}^{y} f(w)\, dw

= \frac{F(y)}{c\,G(y)}.
And then this result is used [1] to show that the conditional distribution of Y, given acceptance, is F:

P\left(Y \le y \,\Big|\, U \le \frac{f(Y)}{c\,g(Y)}\right) = \frac{P\left(U \le \frac{f(Y)}{c\,g(Y)} \,\big|\, Y \le y\right) G(y)}{1/c} \qquad (3)

= \frac{F(y)}{c\,G(y)} \cdot c\,G(y)

= F(y).
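The claim that accepted deviates follow the target cdf F can also be checked empirically. The sketch below again assumes an illustrative Beta(2,2) target (cdf F(y) = 3y^2 - 2y^3) with a Uniform(0,1) proposal, neither of which is part of the problem, and compares the empirical cdf of the accepted sample with F at y = 0.5:

```python
import random

def ar_sample(n, seed=0):
    """Draw n deviates from Beta(2,2) by acceptance-rejection with a
    Uniform(0,1) proposal and c = 1.5 (an assumed example)."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        y, u = rng.random(), rng.random()
        if u <= 6 * y * (1 - y) / 1.5:
            out.append(y)
    return out

xs = ar_sample(50_000)
F = lambda y: 3 * y**2 - 2 * y**3          # Beta(2,2) cdf
ecdf_half = sum(x <= 0.5 for x in xs) / len(xs)
# ecdf_half should be close to F(0.5) = 0.5
```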
3.2 Question 2
3.2.1 (a)
The A-R algorithm involves selecting pairs of rvs (X, U), whereby X is drawn from the proposal distribution (which, scaled by c, majorizes the distribution from which we are simulating the sample) and U determines whether to accept or reject. It can be illustrative to explain the steps of the A-R algorithm in terms of a two-dimensional graph on which the rv tuples represent coordinates [3]. All points that fall in the area where U\,c\,g(x) \le f(x) are accepted, while those points where U\,c\,g(x) > f(x) are rejected. The A-R algorithm for generating 1000 deviates from a gamma distribution, employing a mixture density in the majorizing function h(x) = c\,g(x), is as follows [4]:
(1) While the count of accepted deviates is less than 1000: generate u_1 \sim U[0,1]; set b = (e + \alpha)/e (e is Euler's number); set p = u_1 b
(1.1) If p \le 1, go to (2)
(1.2) If p > 1, go to (3)
(2) Set x = p^{1/\alpha}; generate u_2 \sim U[0,1]
(2.1) If u_2 \le e^{-x}, accept x; increment the gamma deviate counter
(2.2) Otherwise, reject; go to (1)
(3) Set x = -\log((b - p)/\alpha); generate u_2 \sim U[0,1]
(3.1) If u_2 \le x^{\alpha - 1}, accept x; increment the gamma deviate counter
(3.2) Otherwise, reject; go to (1)
(4) Increment the iteration counter; go to (1).
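The steps above can be sketched in Python (standing in for R). This is a sketch of the GS-type scheme from [4] for 0 < \alpha < 1, not the assignment's own code, and the form of the p > 1 branch (x = -\log((b - p)/\alpha)) follows the standard Ahrens-Dieter presentation:

```python
import math
import random

def gamma_ar(alpha, n, seed=1):
    """Generate n Gamma(alpha, 1) deviates for 0 < alpha < 1 by the
    mixture acceptance-rejection scheme sketched in the steps above."""
    assert 0 < alpha < 1
    rng = random.Random(seed)
    b = (math.e + alpha) / math.e
    out = []
    while len(out) < n:
        p = b * rng.random()                 # p = u1 * b
        u2 = rng.random()
        if p <= 1:
            x = p ** (1 / alpha)             # branch (2)
            if u2 <= math.exp(-x):
                out.append(x)
        else:
            x = -math.log((b - p) / alpha)   # branch (3)
            if u2 <= x ** (alpha - 1):
                out.append(x)
    return out
```

For \alpha = 0.8 the sample mean should be near the theoretical mean \alpha = 0.8.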
3.2.2 (b)
According to [4] (pp. 213-214), given that the sampling mixture q(x) dictates that bu \sim U[0,1] is used when bu \le 1 and (e + \alpha)u/e \sim U[0,1] when bu > 1, the expected number of iterations per accepted deviate is E[j] = (e + \alpha)[e\,\Gamma(1 + \alpha)]. Using \alpha = 0.8, the expected number of iterations per acceptance is c = 8.907 and the acceptance ratio is 1/c = 0.112.
3.2.3 (c)
To compare the A-R generated sample with the R gamma distribution function, the deciles are:

Quantile  10%   20%   30%   40%   50%   60%   70%   80%   90%
Qsamp     0.07  0.14  0.21  0.28  0.36  0.43  0.52  0.63  0.75
Qhat      0.04  0.14  0.25  0.36  0.50  0.71  0.99  1.36  2.03
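A sketch of this decile comparison in Python, using the standard library's random.gammavariate as a stand-in for R's gamma generator; the numbers it produces come from its own seed and are not the values tabulated above:

```python
import math
import random
import statistics

def gamma_ar(alpha, n, rng):
    """Gamma(alpha, 1) deviates for 0 < alpha < 1 via the GS-type
    mixture A-R scheme (a sketch, as reconstructed above)."""
    b = (math.e + alpha) / math.e
    out = []
    while len(out) < n:
        p = b * rng.random()
        u2 = rng.random()
        if p <= 1:
            x = p ** (1 / alpha)
            if u2 <= math.exp(-x):
                out.append(x)
        else:
            x = -math.log((b - p) / alpha)
            if u2 <= x ** (alpha - 1):
                out.append(x)
    return out

rng = random.Random(7)
alpha = 0.8
ar = gamma_ar(alpha, 20_000, rng)
ref = [rng.gammavariate(alpha, 1.0) for _ in range(20_000)]

# statistics.quantiles(..., n=10) returns the 9 interior cut points,
# i.e., the 10%..90% deciles of each sample.
q_ar = statistics.quantiles(ar, n=10)
q_ref = statistics.quantiles(ref, n=10)
```

With matching distributions, the two decile vectors should agree up to Monte Carlo error.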
References
[1] Sigman, Karl, "Acceptance-Rejection Method," Columbia University, 2007.
[2] Robert, Christian P. and Casella, George, Introducing Monte Carlo Methods with R, Springer, New York, NY, 2010.
[3] Press, William H., Flannery, Brian P., Teukolsky, Saul A., and Vetterling, William T., Numerical Recipes in C: The Art of Scientific Computing, 2nd Ed., Cambridge University Press, Cambridge, 1992.
[4] Kennedy, Jr., William J. and Gentle, James E., Statistical Computing, Marcel Dekker, Inc., New York, NY, 1980.