Techniques
Content
1 Antithetic Variables
2 Control Variates
3 Conditioning Sampling
5 Importance Sampling
Introduction
Recall that we estimate the unknown quantity θ = E(X) by
generating random numbers X1, . . . , Xn, and use the sample mean
X̄ = (X1 + · · · + Xn)/n
to estimate θ.
Step 1:
X1 = h(U1, . . . , Um)
where U1, . . . , Um i.i.d. ~ U(0,1), and h is a monotone function
of each of its coordinates.
Step 2:
X2 = h(1 − U1, . . . , 1 − Um),
which has the same distribution as X1.
What Does “Antithetic” Mean?
“Antithetic” means:
Opposed to, or
The opposite of, or
Negatively correlated with…
then form the average of these two values, using the average as
a single observation.
Advantage
The estimator (X1 + X2)/2 has smaller variance (at least when h is
a monotone function), since
Var((X1 + X2)/2) = [Var(X1) + Cov(X1, X2)]/2,
where Cov(X1, X2) ≤ 0 when h is monotone in each coordinate.
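A minimal sketch of this variance comparison, in Python with only the standard library (the target E[exp(U)], the sample size, and all names are illustrative choices, not from the slides):

```python
import math
import random
import statistics

random.seed(1)  # reproducibility

n = 10_000  # number of antithetic pairs (illustrative choice)

# Plain Monte Carlo: 2n independent draws of h(U) = exp(U)
plain = [math.exp(random.random()) for _ in range(2 * n)]

# Antithetic: n pairs (h(U) + h(1-U))/2, each pair used as one observation
pairs = []
for _ in range(n):
    u = random.random()
    pairs.append((math.exp(u) + math.exp(1 - u)) / 2)

est_plain = statistics.fmean(plain)
est_anti = statistics.fmean(pairs)

# Both estimators of E[exp(U)] = e - 1 consume 2n uniforms
var_plain = statistics.variance(plain) / (2 * n)
var_anti = statistics.variance(pairs) / n
```

Because exp is monotone, Cov(h(U), h(1 − U)) < 0 here, so var_anti comes out well below var_plain for the same number of uniforms.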
Example 1
An antithetic variable for [−ln(U_i)]^0.9 is [−ln(1 − U_i)]^0.9, so an unbiased combined
estimator is
(1/n) Σ_{i=1}^{n} (1/2){ [−ln(U_i)]^0.9 + [−ln(1 − U_i)]^0.9 }.
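The combined estimator can be checked numerically with a Python sketch (standard library only; the sample size is an arbitrary choice). Since −ln(U) ~ Exp(1), the true value is E[X^0.9] = Γ(1.9) for X ~ Exp(1):

```python
import math
import random
import statistics

random.seed(2)  # reproducibility

n = 20_000
combined = []
for _ in range(n):
    u = random.random()
    # pair average of [-ln(U)]^0.9 and its antithetic companion [-ln(1-U)]^0.9
    combined.append(0.5 * ((-math.log(u)) ** 0.9 + (-math.log(1 - u)) ** 0.9))

theta_hat = statistics.fmean(combined)  # estimates E[(-ln U)^0.9] = gamma(1.9)
```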
Additional Example
Consider the case where we want to estimate E(X^2) where
X ~ Normal(2, 1). How can we use antithetic variables to estimate
it and reduce the variance?
n_sim = 5000;                     % set the number of simulations
out1 = (2 + randn(1, n_sim)).^2;  % generate Normal(2,1) draws and square them
mean(out1)                        % true value: E(X^2) = Var(X) + (E X)^2 = 5
var(out1)
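For this example the antithetic pair can be built directly from one standard normal draw: X1 = 2 + Z and X2 = 2 − Z are both N(2, 1) and negatively correlated after squaring. A Python sketch (standard library only; the sample size is arbitrary):

```python
import random
import statistics

random.seed(3)  # reproducibility

n = 5000
vals = []
for _ in range(n):
    z = random.gauss(0, 1)
    x1 = 2 + z          # X ~ N(2, 1)
    x2 = 2 - z          # antithetic companion, also N(2, 1)
    vals.append((x1 ** 2 + x2 ** 2) / 2)

est = statistics.fmean(vals)   # true value: E[X^2] = Var(X) + (E X)^2 = 5
```

Algebraically each pair average equals 4 + z^2, whose variance is 2, versus 18/2 = 9 for the average of two independent squared draws, so the reduction is substantial here.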
Control Variates
To estimate θ = E(X) using a control variate Y with known mean E(Y), use the estimator
X + c(Y − E(Y)),
where the variance-minimizing coefficient is
c* = −Cov(X, Y)/Var(Y),
and the resulting variance is
Var(X) − [Cov(X, Y)]^2/Var(Y).
The approximation of c*
Several Variables as a Control
With Y = U as the control (so Var(Y) = 1/12), the estimated coefficient is
c* ≈ −12 × 0.14086 = −1.6903
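The numbers above are consistent with the control variate Y = U applied to X = exp(U): Cov(e^U, U) = 1 − (e − 1)/2 ≈ 0.14086 and Var(U) = 1/12, giving c* ≈ −1.6903. Assuming that setup, a Python sketch that re-derives c* from simulated data (standard library only; names are illustrative):

```python
import math
import random
import statistics

random.seed(4)  # reproducibility

n = 20_000
xs, ys = [], []
for _ in range(n):
    u = random.random()
    xs.append(math.exp(u))  # X = exp(U), the assumed quantity of interest
    ys.append(u)            # control variate Y = U with known mean 1/2

# sample estimate of c* = -Cov(X, Y)/Var(Y)
mx, my = statistics.fmean(xs), statistics.fmean(ys)
cov_xy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
c_star = -cov_xy / statistics.variance(ys)

controlled = [x + c_star * (y - 0.5) for x, y in zip(xs, ys)]
est = statistics.fmean(controlled)       # estimates E[exp(U)] = e - 1
var_raw = statistics.variance(xs) / n
var_cv = statistics.variance(controlled) / n
```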
Matlab code:
% Antithetic estimator for E[exp(U^2)], with U ~ Uniform(0,1)
m = 100;                           % sample size (assumed; not stated above)
U = rand(1, m);
Xa = (exp(U.^2) + exp((1-U).^2))/2;
Xabar = sum(Xa)/m;
VarXa = var(Xa)/m
The variance of the antithetic variable estimator (using the same
U) was: Var(Xa) = 2.7120 × 10^(−4).
d) It is clear from part (c) that it is better to use the control
variate method.
Conditioning Sampling
Variance Reduction by Conditioning
Review: Conditional Expectation:
E[X|Y] denotes that function of the random variable Y whose
value at Y = y is E[X|Y = y]. By the tower property, E[E[X|Y]] = E[X], and since
Var(X) = E[Var(X|Y)] + Var(E[X|Y]),
the conditional estimator E[X|Y] never has larger variance than X.
Set V1 = 2U1 − 1 and V2 = 2U2 − 1, with U1, U2 i.i.d. ~ U(0,1), and let
I = 1 if V1^2 + V2^2 ≤ 1, and I = 0 otherwise; then
E[I] = π/4.
(Figure: unit circle inscribed in the square [−1, 1] × [−1, 1].)
Use E[I|V1] rather than I to estimate π/4.
Example 5: Estimate π
Step 1: Generate Vi = 2Ui − 1, i = 1, …, n, where Ui i.i.d. ~ U(0,1).
Step 2: Evaluate each (1 − Vi^2)^(1/2) and take the average of all these
values to estimate π/4, since E[I | V1 = v] = (1 − v^2)^(1/2).
Matlab code:
n=1000;
m=1000;
u1=rand(n,m);
v1=2*u1-1;
v=(1-v1.^2).^0.5;           % E[I|V1] = sqrt(1 - V1^2)
theta=4*sum(v)/n;           % m estimates of pi, one per column
msecv=sum((theta-pi).^2)/m; % MSE of the conditioning estimator
reduction=1-msecv/mse0      % mse0: MSE of the raw simulation (computed below)
Example 5: Estimate π
Matlab program for comparison of the two simulation procedures:
n=1000;
m=1000;
u1=rand(n,m);
v1=2*u1-1;
% ------------raw simulation-------------------------
v2=2*rand(n,m)-1;
s=v1.^2+v2.^2<=1;
theta0=4*sum(s)/(n);
mse0=sum((theta0-pi).^2)/m;
% ----------conditioning sampling-------------------------
v=(1-v1.^2).^0.5;
theta=4*sum(v)/n;
msecv=sum((theta-pi).^2)/m;
reduction=1-msecv/mse0      % reduction in MSE relative to raw simulation
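The same comparison in a Python sketch (standard library only; here each replicate is a single observation rather than a column of n values, so the variances reported are per observation):

```python
import math
import random
import statistics

random.seed(5)  # reproducibility

n = 100_000
raw, cond = [], []
for _ in range(n):
    v1 = 2 * random.random() - 1
    v2 = 2 * random.random() - 1
    raw.append(4.0 if v1 * v1 + v2 * v2 <= 1 else 0.0)  # 4*I
    cond.append(4.0 * math.sqrt(1 - v1 * v1))           # 4*E[I | V1]

pi_raw = statistics.fmean(raw)
pi_cond = statistics.fmean(cond)
var_raw = statistics.variance(raw)    # per-observation variance
var_cond = statistics.variance(cond)
```

Per observation, the conditioned values have variance 16(2/3 − (π/4)^2) ≈ 0.80 versus 16(π/4)(1 − π/4) ≈ 2.70 for the raw indicator, and each needs only one uniform instead of two.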
Example 6:
Raw simulation:
Step 1: generate Y = −ln(U), where U ~ U(0, 1)
Step 2: if Y = y, generate X ~ N(y, 4)
Step 3: set I = 1 if X > 1, and I = 0 otherwise;
then E[I] = θ = P(X > 1)
Example 6:
Can we express the exact value of E(I | Y = y) in terms of y?
Improvement:
If Y = y, (X − y)/2 is a standard normal r.v.
Then,
E[I | Y = y] = P(X > 1 | Y = y) = P((X − y)/2 > (1 − y)/2) = Φ̄((1 − y)/2),
where Φ̄(x) = 1 − Φ(x).
Therefore, the average value of Φ̄((1 − Y)/2) obtained over many runs
is superior to the raw simulation estimator.
Example 6:
Procedure 2:
Step 1: Generate Yi = −ln(Ui), i = 1, …, n, where Ui i.i.d. ~ U(0, 1).
Step 2: Evaluate each Φ̄((1 − Yi)/2) and take the average of all these
values to estimate θ.
Matlab code:
n=1000;
EIy=zeros(1,n);
for i=1:n
y=exprnd(1);
EIy(i)=1-normcdf((1-y)/2);
end
theta=mean(EIy)
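A Python version of Procedure 2 (standard library only: Matlab's normcdf is replaced by the standard identity Φ(z) = erfc(−z/√2)/2, and exprnd(1) by −ln(U)):

```python
import math
import random
import statistics

random.seed(6)  # reproducibility

def norm_cdf(z):
    # standard normal CDF via the complementary error function
    return 0.5 * math.erfc(-z / math.sqrt(2))

n = 10_000
vals = []
for _ in range(n):
    y = -math.log(random.random())          # Y ~ Exp(1)
    vals.append(1 - norm_cdf((1 - y) / 2))  # E[I | Y = y] = P(X > 1 | Y = y)

theta_hat = statistics.fmean(vals)
```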
Example 6:
Can we use antithetic variables to improve the simulation further?
Further Improvement:
Using antithetic variables: from each Ui generate Yi = −ln(Ui) and Yi' = −ln(1 − Ui),
and average Φ̄((1 − Yi)/2) and Φ̄((1 − Yi')/2). Since Φ̄((1 − y)/2) is increasing in y
and y = −ln(u) is monotone in u, the two values are negatively correlated.
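Combining the conditioning estimator with antithetic uniforms, a Python sketch (standard library only; Φ̄(z) = 1 − Φ(z) = erfc(z/√2)/2):

```python
import math
import random
import statistics

random.seed(7)  # reproducibility

def norm_sf(z):
    # upper-tail probability 1 - Phi(z) of the standard normal
    return 0.5 * math.erfc(z / math.sqrt(2))

n = 10_000
vals = []
for _ in range(n):
    u = random.random()
    y1 = -math.log(u)       # Y ~ Exp(1)
    y2 = -math.log(1 - u)   # antithetic companion, same distribution
    # pair average of the two conditional estimates
    vals.append(0.5 * (norm_sf((1 - y1) / 2) + norm_sf((1 - y2) / 2)))

theta_hat = statistics.fmean(vals)
var_pair = statistics.variance(vals)  # per-pair variance of the combined estimator
```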
Example 7:
Raw simulation
Step 1: generate N, say N = n
Step 2: generate the values of X1, …, Xn
Step 3: set I = 1 if X1 + · · · + Xn > c, and I = 0 otherwise;
then E[I] = p
Example 7:
Improvement by conditioning
Introduce a random variable M = min{ m : X1 + · · · + Xm > c }.
Step 1: Generate the Xi in sequence, stopping at M = m as soon as X1 + · · · + Xm > c.
Step 2: Evaluate E[I | M = m] = P(N ≥ m), since X1 + · · · + XN > c exactly when N ≥ M,
and take the average of these values to estimate p.
Code:
n=100;c=325000;I=zeros(1,n);
for i=1:n
s=0;m=0;
while s<c
x=exprnd(1000); % exponential with mean 1000
s=s+x;
m=m+1;
end
p=1-poisscdf(m-1,300); % E[I|M=m] = P(N >= m), N ~ Poisson(300) (rate of 10 per day)
I(i)=p;
end
p_bar=sum(I)/n
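A Python version of the conditioning procedure (standard library only; the toolbox function poisscdf is replaced by a log-space sum using math.lgamma, which avoids overflow at λ = 300):

```python
import math
import random
import statistics

random.seed(8)  # reproducibility

def poisson_sf(m, lam):
    # P(N >= m) for N ~ Poisson(lam); each CDF term computed in log space
    if m <= 0:
        return 1.0
    cdf = sum(math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))
              for k in range(m))
    return 1.0 - cdf

n = 200          # number of simulation runs (illustrative choice)
c = 325_000      # threshold, as in the Matlab code
lam = 300        # Poisson mean, as in the Matlab code

vals = []
for _ in range(n):
    s, m = 0.0, 0
    while s < c:
        s += random.expovariate(1 / 1000)  # exponential with mean 1000
        m += 1
    vals.append(poisson_sf(m, lam))        # E[I | M = m] = P(N >= m)

p_bar = statistics.fmean(vals)
```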