We claim that
\[
\{\tau \le t\} \;=\; \bigcap_{n=1}^{\infty} \; \bigcup_{q \in \mathbb{Q} \cap [0,\, t + 1/n)} \{X_q \in B\}. \tag{1}
\]
On one hand, for each $\omega$, if there exists $q < t + 1/n$ such that $X_q \in B$, then $\tau \le t + 1/n$. If this holds for every $n$, then $\tau \le t$. In the other direction, if $\tau \le t$, then for any $n$ there exists a time $t_n < t + 1/n$ such that $X_{t_n} \in B$. Since $B$ is open, there exists a rational number $q_n < t + 1/n$ sufficiently close to $t_n$ such that $X_{q_n} \in B$. This shows that $\omega$ belongs to the right-hand side.
Notice that the set
\[
A_n \;=\; \bigcup_{q \in \mathbb{Q} \cap [0,\, t + 1/n)} \{X_q \in B\} \;\in\; \mathcal{F}_{t+1/n}
\]
is decreasing in $n$. So
\[
\bigcap_{n=1}^{\infty} A_n \;=\; \bigcap_{n=m}^{\infty} A_n \;\in\; \mathcal{F}_{t+1/m}.
\]
This indicates $\{\tau \le t\} \in \mathcal{F}_{t+1/m}$ for any $m$. Therefore
\[
\{\tau \le t\} \;\in\; \bigcap_{m=1}^{\infty} \mathcal{F}_{t+1/m} \;=\; \mathcal{F}_{t+} \;=\; \mathcal{F}_t
\]
by the right-continuity of the filtration. So $\tau$ is a stopping time.
b. Kudos to Kevin Webster for giving this counterexample.
Let $\Omega = \{\omega_1, \omega_2\}$, $\mathcal{G} = 2^{\Omega}$, and $\mathcal{F}_t = \sigma(Y_u : u \le t)$.
(a) Since $X_t = Y_t - Y_0$, we have
\[
\begin{aligned}
\mathbb{E}[X_t \mid \mathcal{F}_s] &= \mathbb{E}[Y_t - Y_0 \mid \mathcal{F}_s] \\
&= \mathbb{E}[Y_t - Y_s \mid \mathcal{F}_s] + (Y_s - Y_0) \\
&= \mathbb{E}[Y_t - Y_s] + Y_s - Y_0 \\
&= X_s,
\end{aligned}
\]
where the third equality uses that the increment $Y_t - Y_s$ is independent of $\mathcal{F}_s$, and the last uses that it has mean zero. So $(X_t)_t$ is a martingale.
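As a quick numerical sanity check (not part of the original solution), the two facts used above can be verified by Monte Carlo: the Brownian increment $Y_t - Y_s$ has mean zero and is uncorrelated with $Y_s$. The times, sample size, and seed below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
s, t, n = 0.4, 1.0, 500_000

# build Y_s and the increment Y_t - Y_s from independent Gaussian draws
Ys = rng.normal(0.0, np.sqrt(s), n)
inc = rng.normal(0.0, np.sqrt(t - s), n)  # Y_t - Y_s, independent of F_s

print(inc.mean())         # ≈ 0: the increment has mean zero
print(np.mean(Ys * inc))  # ≈ 0: the increment is uncorrelated with Y_s
```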
(b) This basically comes from the fact that
\[
\operatorname{Cov}(Y_t, Y_s) = \min(t, s).
\]
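This covariance identity is easy to check numerically. A minimal Monte Carlo sketch (the times, sample size, and seed are arbitrary choices; $Y$ is built from its independent Gaussian increments):

```python
import numpy as np

rng = np.random.default_rng(42)
s, t, n = 0.3, 0.7, 400_000

# Y_s and Y_t share the path up to time s
Ys = rng.normal(0.0, np.sqrt(s), n)
Yt = Ys + rng.normal(0.0, np.sqrt(t - s), n)

# both have mean zero, so Cov(Y_t, Y_s) = E[Y_t * Y_s]
cov = np.mean(Ys * Yt)
print(cov)  # ≈ min(t, s) = 0.3
```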
(c) Similar to the proof of (a), we can see that $(Y_t)_t$ is also a martingale, adapted to $(\mathcal{F}_t)_t$:
\[
\mathbb{E}[Y_t \mid \mathcal{F}_s] = \mathbb{E}[Y_t - Y_s \mid \mathcal{F}_s] + Y_s = Y_s.
\]
So, by the conditional Jensen inequality, $Y_t^2$ is a submartingale. From Doob's maximal inequality,
\[
\mathbb{P}\Bigl( \sup_{s \le t} |Y_s|^2 > \lambda \Bigr) \;\le\; \frac{\mathbb{E}[Y_t^2]}{\lambda} \;=\; \frac{t}{\lambda}.
\]
So, applying the bound with $t = 1/n^3$ and $\lambda = 1/n$,
\[
\sum_{n=1}^{\infty} \mathbb{P}\Bigl( \sup_{0 < s \le 1/n^3} |Y_s|^2 > \frac{1}{n} \Bigr) \;\le\; \sum_{n=1}^{\infty} \frac{1}{n^2} \;<\; \infty.
\]
By the Borel–Cantelli lemma, almost surely only finitely many of these events occur; sandwiching small $t$ between $1/(n+1)^3$ and $1/n^3$ then gives $|Y_t|^2 \to 0$ as $t \to 0$, a.s. Hence
\[
\mathbb{P}\Bigl( \lim_{t \to 0} Y_t = 0 \Bigr) = 1.
\]
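As a sanity check of the maximal-inequality bound used above, one can estimate $\mathbb{P}(\sup_{s \le t} |Y_s|^2 > \lambda)$ by Monte Carlo and compare it with $t/\lambda$. A rough sketch (the grid, sample size, and seed are arbitrary choices; time discretization slightly underestimates the supremum):

```python
import numpy as np

rng = np.random.default_rng(1)
t_max, lam = 1.0, 2.0
n_paths, n_steps = 10_000, 1_000
dt = t_max / n_steps

# simulate Brownian paths on [0, t_max] and take the maximum of |Y_s|^2
inc = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(inc, axis=1)
sup_sq = np.max(paths**2, axis=1)

p_hat = np.mean(sup_sq > lam)
bound = t_max / lam
print(p_hat, bound)  # the empirical probability should stay below the bound 0.5
```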
Exercise 4.6. From Q.3 we know $\tau$ is a stopping time, so the stopped process $(X_{t \wedge \tau})_{t \ge 0}$ is also a martingale. $B_{t \wedge \tau}$ is bounded above by $A$, so for $\lambda > 0$,
\[
X_{t \wedge \tau} = \exp\bigl(\lambda B_{t \wedge \tau} - \lambda^2 (t \wedge \tau)/2\bigr) \;\le\; \exp(\lambda A),
\]
which is a constant. Using the dominated convergence theorem we conclude
\[
\mathbb{E} X_{\tau} = \lim_{t \to \infty} \mathbb{E} X_{t \wedge \tau} = X_0 = 1.
\]
Or
\[
\begin{aligned}
1 &= \mathbb{E}\exp\bigl(\lambda B_{\tau} - \lambda^2 \tau/2\bigr) \\
&= \mathbb{E}\bigl[e^{\lambda A - \lambda^2 \tau/2} \,\big|\, B_{\tau} = A\bigr]\,\mathbb{P}(B_{\tau} = A)
 + \mathbb{E}\bigl[e^{-\lambda A - \lambda^2 \tau/2} \,\big|\, B_{\tau} = -A\bigr]\,\mathbb{P}(B_{\tau} = -A).
\end{aligned}
\]
Now notice that $B_t$ and $-B_t$ are both Brownian motions, so the symmetry of the boundaries indicates that the distribution of $\tau$ conditioned on $B_\tau = A$ is the same as that conditioned on $B_\tau = -A$:
\[
(\tau \mid B_\tau = A) \;\overset{d}{=}\; (\tau \mid B_\tau = -A).
\]
Also by symmetry, $\mathbb{P}(B_\tau = A) = \mathbb{P}(B_\tau = -A) = 1/2$.
So
\[
1 = \frac{1}{2}\Bigl( e^{\lambda A}\, \mathbb{E}\exp(-\lambda^2 \tau/2) + e^{-\lambda A}\, \mathbb{E}\exp(-\lambda^2 \tau/2) \Bigr).
\]
So for any $\theta \ge 0$, set $\lambda = \sqrt{2\theta}$, i.e. $\theta = \lambda^2/2$. Then
\[
\mathbb{E}[\exp(-\theta\tau)] = \frac{2}{\exp(\sqrt{2\theta}\,A) + \exp(-\sqrt{2\theta}\,A)}.
\]
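This closed form can be spot-checked by simulating the hitting time directly. A rough Monte Carlo sketch (the step size, horizon, seed, and the choices $A = 1$, $\theta = 1/2$ are all arbitrary; discrete monitoring biases $\tau$ slightly upward, so only loose agreement is expected):

```python
import numpy as np

rng = np.random.default_rng(7)
A, theta = 1.0, 0.5
dt, n_paths, n_steps = 2e-3, 2_000, 5_000  # horizon 10 >> E[tau] = A^2 = 1

# simulate Brownian paths until the first exit from (-A, A)
inc = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(inc, axis=1)
hit = np.abs(B) >= A
tau = hit.argmax(axis=1) * dt  # first-exit time; paths exit well before the horizon

est = np.exp(-theta * tau).mean()
exact = 1.0 / np.cosh(np.sqrt(2.0 * theta) * A)  # sech(sqrt(2*theta)*A)
print(est, exact)  # should agree up to discretization and sampling error
```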
From Theorem 4.5 in the book, we know $\tau$ has finite moments of all orders. So from dominated convergence,
\[
\mathbb{E}[\tau^2]
= \lim_{\theta \to 0} \mathbb{E}[\tau^2 \exp(-\theta\tau)]
= \lim_{\theta \to 0} \mathbb{E}\Bigl[ \frac{d^2}{d\theta^2} \exp(-\theta\tau) \Bigr]
= \lim_{\theta \to 0} \frac{d^2}{d\theta^2}\, \mathbb{E}[\exp(-\theta\tau)].
\]
The exchange of derivative and expectation here is valid, due to a proof already shown in class.
In the end one can get the result by directly taking the derivative; I choose to present a method with Taylor expansion here. Expanding the Laplace transform at $\theta = 0$, we have
\[
\begin{aligned}
\exp(\sqrt{2\theta}\,A) + \exp(-\sqrt{2\theta}\,A)
&= \sum_{k=0}^{\infty} \Bigl( \frac{(2\theta)^{k/2} A^k}{k!} + (-1)^k \frac{(2\theta)^{k/2} A^k}{k!} \Bigr) \\
&= 2 \sum_{k=0}^{\infty} \frac{(2A^2)^k \theta^k}{(2k)!} \\
&= 2 \Bigl( 1 + A^2\theta + \frac{A^4}{6}\theta^2 + \cdots \Bigr).
\end{aligned}
\]
So
\[
\frac{2}{\exp(\sqrt{2\theta}\,A) + \exp(-\sqrt{2\theta}\,A)}
= \frac{1}{1 + A^2\theta + \frac{A^4}{6}\theta^2 + \cdots}
= 1 - A^2\theta + \frac{5A^4}{6}\theta^2 + \cdots
\]
and
\[
\mathbb{E}[\tau^2]
= \lim_{\theta \to 0} \frac{d^2}{d\theta^2}\, \mathbb{E}[\exp(-\theta\tau)]
= 2 \cdot \frac{5A^4}{6}
= \frac{5}{3} A^4.
\]
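The expansion above can be reproduced symbolically. A short sketch with `sympy` (the variable names are my own; the expansion of $\operatorname{sech}$ is done in an auxiliary variable $x$ and then $x = \sqrt{2\theta}\,A$ is substituted):

```python
import sympy as sp

theta, A, x = sp.symbols('theta A x', positive=True)

# E[exp(-theta*tau)] = sech(sqrt(2*theta)*A); expand sech(x) at x = 0 first
ser = sp.series(sp.sech(x), x, 0, 6).removeO()        # 1 - x^2/2 + 5*x^4/24
lap = sp.expand(ser.subs(x, sp.sqrt(2 * theta) * A))  # 1 - A^2*theta + 5*A^4*theta^2/6

# E[tau^2] is the second derivative of the Laplace transform at theta = 0,
# i.e. 2! times the coefficient of theta^2
E_tau2 = sp.factorial(2) * lap.coeff(theta, 2)
print(lap)     # equals 1 - A**2*theta + 5*A**4*theta**2/6
print(E_tau2)  # equals 5*A**4/3
```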
The validity of this approach can be seen from the fact that $\mathbb{E}[\exp(-\theta\tau)]$ equals the hyperbolic secant of $\sqrt{2\theta}\,A$, which is complex-analytic on the positive real line $(0, \infty)$. The symmetry is crucial here: without it we would not be able to obtain the closed-form formula for $\mathbb{E}[\exp(-\theta\tau)]$.