Chapter 29
Gaussian processes
Definition 29.1 (Gaussian Process) A Gaussian process $X(t)$, $t \geq 0$, is a stochastic process with the property that for every set of times $0 \leq t_1 \leq t_2 \leq \cdots \leq t_n$, the set of random variables $X(t_1), X(t_2), \ldots, X(t_n)$ is jointly normally distributed.
Remark 29.1 If $X$ is a Gaussian process, then its distribution is determined by its mean function
$$m(t) = \mathbb{E}\,X(t)$$
and its covariance function
$$\rho(s,t) = \mathbb{E}\big[(X(s) - m(s))(X(t) - m(t))\big].$$
Indeed, the joint density of $X(t_1), \ldots, X(t_n)$ is
$$\mathbb{P}\{X(t_1) \in dx_1, \ldots, X(t_n) \in dx_n\}
= \frac{1}{(2\pi)^{n/2}\sqrt{\det \Sigma}}
\exp\Big\{ -\tfrac{1}{2}\,(x - m(t))\,\Sigma^{-1}\,(x - m(t))^T \Big\}\, dx_1 \ldots dx_n,$$
where $\Sigma$ is the covariance matrix
$$\Sigma = \begin{bmatrix}
\rho(t_1, t_1) & \rho(t_1, t_2) & \ldots & \rho(t_1, t_n) \\
\rho(t_2, t_1) & \rho(t_2, t_2) & \ldots & \rho(t_2, t_n) \\
\vdots & \vdots & & \vdots \\
\rho(t_n, t_1) & \rho(t_n, t_2) & \ldots & \rho(t_n, t_n)
\end{bmatrix},$$
$x$ is the row vector $(x_1, x_2, \ldots, x_n)$, $t$ is the row vector $(t_1, t_2, \ldots, t_n)$, and $m(t) = (m(t_1), m(t_2), \ldots, m(t_n))$.
The moment generating function is
$$\mathbb{E} \exp\Big\{ \sum_{k=1}^n u_k X(t_k) \Big\}
= \exp\Big\{ u\, m(t)^T + \tfrac{1}{2}\, u\, \Sigma\, u^T \Big\},$$
where $u = (u_1, u_2, \ldots, u_n)$.
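As a quick numerical illustration (not part of the original text), the m.g.f. formula above can be checked by Monte Carlo for a hand-picked two-dimensional example. The mean vector, covariance matrix, and vector $u$ below are arbitrary choices for the sketch, not taken from any particular process.

```python
import numpy as np

# Monte Carlo check of E exp{sum u_k X_k} = exp{u m^T + (1/2) u Sigma u^T}
# for an arbitrary bivariate normal.  All parameters are illustrative.
rng = np.random.default_rng(4)
m = np.array([0.1, -0.2])            # hand-picked mean vector
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])       # hand-picked covariance matrix
u = np.array([0.4, 0.7])

X = rng.multivariate_normal(m, Sigma, size=1_000_000)
mgf_mc = np.mean(np.exp(X @ u))                       # empirical m.g.f.
mgf_formula = np.exp(u @ m + 0.5 * u @ Sigma @ u)     # closed form
```

The two quantities agree up to Monte Carlo error, confirming that the mean and covariance alone pin down the joint law.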
29.1 An example: Brownian Motion
Brownian motion $W$ is a Gaussian process with $m(t) = 0$ and $\rho(s,t) = s \wedge t$. Indeed, if $0 \leq s \leq t$, then
$$\begin{aligned}
\rho(s,t) = \mathbb{E}[W(s) W(t)]
&= \mathbb{E}\big[ W(s)(W(t) - W(s)) + W^2(s) \big] \\
&= \mathbb{E}\,W(s) \cdot \mathbb{E}[W(t) - W(s)] + \mathbb{E}\,W^2(s) \\
&= \mathbb{E}\,W^2(s) = s = s \wedge t.
\end{aligned}$$
To prove that a process is Gaussian, one must show that $X(t_1), \ldots, X(t_n)$ has either a density or a moment generating function of the appropriate form. We shall use the m.g.f., and shall cheat a bit by considering only two times, which we usually call $s$ and $t$. We will want to show that
$$\mathbb{E} \exp\{u_1 X(s) + u_2 X(t)\}
= \exp\left\{ u_1 m_1 + u_2 m_2 + \tfrac{1}{2} \begin{bmatrix} u_1 & u_2 \end{bmatrix} \begin{bmatrix} \rho_{11} & \rho_{12} \\ \rho_{21} & \rho_{22} \end{bmatrix} \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} \right\}.$$
Theorem 1.69 (Integral w.r.t. a Brownian) Let $W(t)$ be a Brownian motion and $\delta(t)$ a nonrandom function. Then
$$X(t) = \int_0^t \delta(u)\, dW(u)$$
is a Gaussian process with $m(t) = 0$ and
$$\rho(s,t) = \int_0^{s \wedge t} \delta^2(u)\, du.$$
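Before the proof, the covariance formula can be checked by a simple Euler discretization of the stochastic integral (an illustrative sketch; the choice $\delta(u) = u$, the grid size, and the sample size are all arbitrary). With $\delta(u) = u$ the theorem predicts $\rho(s,t) = \int_0^{s \wedge t} u^2\, du = (s \wedge t)^3/3$.

```python
import numpy as np

# Euler approximation of X(t) = \int_0^t delta(u) dW(u) with delta(u) = u
# (arbitrary choice).  Theorem 1.69 predicts rho(s, t) = (s ^ t)^3 / 3.
rng = np.random.default_rng(1)
n_paths, n_steps, T = 100_000, 200, 1.0
dt = T / n_steps
times = np.linspace(0.0, T, n_steps + 1)

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
# Accumulate delta(u) dW(u), evaluating delta at the left endpoint.
X = np.concatenate(
    [np.zeros((n_paths, 1)), np.cumsum(times[:-1] * dW, axis=1)], axis=1
)

s_idx, t_idx = n_steps // 2, n_steps        # s = 0.5, t = 1.0
cov_est = np.mean(X[:, s_idx] * X[:, t_idx])
# predicted: 0.5**3 / 3, about 0.0417
```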
Proof: (Sketch.) We have
$$dX = \delta\, dW.$$
Therefore,
$$\begin{aligned}
d e^{uX(s)} &= u e^{uX(s)} \delta(s)\, dW(s) + \tfrac{1}{2} u^2 e^{uX(s)} \delta^2(s)\, ds, \\
e^{uX(s)} &= e^{uX(0)} + u \underbrace{\int_0^s e^{uX(v)} \delta(v)\, dW(v)}_{\text{Martingale}}
+ \tfrac{1}{2} u^2 \int_0^s e^{uX(v)} \delta^2(v)\, dv, \\
\mathbb{E}\, e^{uX(s)} &= 1 + \tfrac{1}{2} u^2 \int_0^s \delta^2(v)\, \mathbb{E}\, e^{uX(v)}\, dv, \\
\frac{d}{ds}\, \mathbb{E}\, e^{uX(s)} &= \tfrac{1}{2} u^2 \delta^2(s)\, \mathbb{E}\, e^{uX(s)}, \\
\mathbb{E}\, e^{uX(s)} &= e^{uX(0)} \exp\Big\{ \tfrac{1}{2} u^2 \int_0^s \delta^2(v)\, dv \Big\} \qquad (1.1) \\
&= \exp\Big\{ \tfrac{1}{2} u^2 \int_0^s \delta^2(v)\, dv \Big\}.
\end{aligned}$$
This shows that $X(s)$ is normal with mean 0 and variance $\int_0^s \delta^2(v)\, dv$.
Now let $0 \leq s \leq t$ be given. Just as before,
$$d e^{uX(t)} = u e^{uX(t)} \delta(t)\, dW(t) + \tfrac{1}{2} u^2 e^{uX(t)} \delta^2(t)\, dt.$$
Integrate from $s$ to $t$ to get
$$e^{uX(t)} = e^{uX(s)} + u \int_s^t \delta(v)\, e^{uX(v)}\, dW(v)
+ \tfrac{1}{2} u^2 \int_s^t \delta^2(v)\, e^{uX(v)}\, dv.$$
Take conditional expectations $\mathbb{E}[\ldots \mid \mathcal{F}(s)]$ and use the martingale property
$$\mathbb{E}\Big[ \int_s^t \delta(v)\, e^{uX(v)}\, dW(v) \,\Big|\, \mathcal{F}(s) \Big]
= \mathbb{E}\Big[ \int_0^t \delta(v)\, e^{uX(v)}\, dW(v) \,\Big|\, \mathcal{F}(s) \Big]
- \int_0^s \delta(v)\, e^{uX(v)}\, dW(v) = 0$$
to get
$$\begin{aligned}
\mathbb{E}\big[ e^{uX(t)} \mid \mathcal{F}(s) \big]
&= e^{uX(s)} + \tfrac{1}{2} u^2 \int_s^t \delta^2(v)\, \mathbb{E}\big[ e^{uX(v)} \mid \mathcal{F}(s) \big]\, dv, \\
\frac{d}{dt}\, \mathbb{E}\big[ e^{uX(t)} \mid \mathcal{F}(s) \big]
&= \tfrac{1}{2} u^2 \delta^2(t)\, \mathbb{E}\big[ e^{uX(t)} \mid \mathcal{F}(s) \big], \quad t \geq s.
\end{aligned}$$
The solution to this ordinary differential equation with initial time $s$ is
$$\mathbb{E}\big[ e^{uX(t)} \mid \mathcal{F}(s) \big]
= e^{uX(s)} \exp\Big\{ \tfrac{1}{2} u^2 \int_s^t \delta^2(v)\, dv \Big\}, \quad t \geq s. \qquad (1.2)$$
We now compute the m.g.f. for $(X(s), X(t))$, where $0 \leq s \leq t$:
$$\begin{aligned}
\mathbb{E}\big[ e^{u_1 X(s) + u_2 X(t)} \mid \mathcal{F}(s) \big]
&= e^{u_1 X(s)}\, \mathbb{E}\big[ e^{u_2 X(t)} \mid \mathcal{F}(s) \big] \\
&\overset{(1.2)}{=} e^{(u_1 + u_2) X(s)} \exp\Big\{ \tfrac{1}{2} u_2^2 \int_s^t \delta^2(v)\, dv \Big\}, \\
\mathbb{E}\big[ e^{u_1 X(s) + u_2 X(t)} \big]
&= \mathbb{E}\Big[ \mathbb{E}\big[ e^{u_1 X(s) + u_2 X(t)} \mid \mathcal{F}(s) \big] \Big] \\
&= \mathbb{E}\big\{ e^{(u_1 + u_2) X(s)} \big\} \cdot \exp\Big\{ \tfrac{1}{2} u_2^2 \int_s^t \delta^2(v)\, dv \Big\} \\
&\overset{(1.1)}{=} \exp\Big\{ \tfrac{1}{2} (u_1 + u_2)^2 \int_0^s \delta^2(v)\, dv + \tfrac{1}{2} u_2^2 \int_s^t \delta^2(v)\, dv \Big\} \\
&= \exp\Big\{ \tfrac{1}{2} \big(u_1^2 + 2 u_1 u_2\big) \int_0^s \delta^2(v)\, dv + \tfrac{1}{2} u_2^2 \int_0^t \delta^2(v)\, dv \Big\} \\
&= \exp\left\{ \tfrac{1}{2} \begin{bmatrix} u_1 & u_2 \end{bmatrix}
\begin{bmatrix} \int_0^s \delta^2(v)\, dv & \int_0^s \delta^2(v)\, dv \\[2pt] \int_0^s \delta^2(v)\, dv & \int_0^t \delta^2(v)\, dv \end{bmatrix}
\begin{bmatrix} u_1 \\ u_2 \end{bmatrix} \right\}.
\end{aligned}$$
This shows that $(X(s), X(t))$ is jointly normal with $\mathbb{E}\,X(s) = \mathbb{E}\,X(t) = 0$,
$$\mathbb{E}\,X^2(s) = \int_0^s \delta^2(v)\, dv, \quad
\mathbb{E}\,X^2(t) = \int_0^t \delta^2(v)\, dv, \quad
\mathbb{E}[X(s) X(t)] = \int_0^s \delta^2(v)\, dv.$$
Remark 29.2 The hard part of the above argument, and the reason we use moment generating functions, is to prove the normality. The computation of means and variances does not require the use of moment generating functions. Indeed,
$$X(t) = \int_0^t \delta(u)\, dW(u)$$
is a martingale and $X(0) = 0$, so
$$m(t) = \mathbb{E}\,X(t) = 0 \quad \forall t \geq 0.$$
For fixed $s \geq 0$,
$$\mathbb{E}\,X^2(s) = \int_0^s \delta^2(v)\, dv$$
by the Itô isometry. For $0 \leq s \leq t$,
$$\mathbb{E}\big[ X(s)(X(t) - X(s)) \big]
= \mathbb{E}\Big[ \mathbb{E}\big[ X(s)(X(t) - X(s)) \mid \mathcal{F}(s) \big] \Big]
= \mathbb{E}\Big[ X(s) \underbrace{\big( \mathbb{E}[X(t) \mid \mathcal{F}(s)] - X(s) \big)}_{0} \Big] = 0.$$
Therefore,
$$\mathbb{E}[X(s) X(t)] = \mathbb{E}\big[ X(s)(X(t) - X(s)) + X^2(s) \big]
= \mathbb{E}\,X^2(s) = \int_0^s \delta^2(v)\, dv.$$
If $\delta$ were a stochastic process, the Itô isometry says
$$\mathbb{E}\,X^2(s) = \int_0^s \mathbb{E}\,\delta^2(v)\, dv,$$
and the same argument used above shows that for $0 \leq s \leq t$,
$$\mathbb{E}[X(s) X(t)] = \mathbb{E}\,X^2(s) = \int_0^s \mathbb{E}\,\delta^2(v)\, dv.$$
However, when $\delta$ is stochastic, $X$ is not necessarily a Gaussian process, so its distribution is not determined from its mean and covariance functions.
Remark 29.3 When $\delta$ is nonrandom,
$$X(t) = \int_0^t \delta(u)\, dW(u)$$
is also Markov. We proved this before, but note again that the Markov property follows immediately from (1.2). The equation (1.2) says that conditioned on $\mathcal{F}(s)$, the distribution of $X(t)$ depends only on $X(s)$; in fact, $X(t)$ is normal with mean $X(s)$ and variance $\int_s^t \delta^2(v)\, dv$.
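This conditional law can be illustrated numerically (a sketch, again with the arbitrary choice $\delta(u) = u$, $s = 0.5$, $t = 1$): the increment $X(t) - X(s)$ should be uncorrelated with $X(s)$ and have variance $\int_s^t v^2\, dv = (t^3 - s^3)/3$, so that given $\mathcal{F}(s)$ the law of $X(t)$ is $N\big(X(s), \int_s^t \delta^2(v)\, dv\big)$.

```python
import numpy as np

# Check of Remark 29.3 for delta(u) = u (arbitrary choice):
# X(t) - X(s) is uncorrelated with X(s) and has variance (t^3 - s^3)/3.
rng = np.random.default_rng(2)
n_paths, n_steps = 100_000, 200
dt = 1.0 / n_steps
times = np.linspace(0.0, 1.0, n_steps + 1)

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = np.concatenate(
    [np.zeros((n_paths, 1)), np.cumsum(times[:-1] * dW, axis=1)], axis=1
)

s_idx = n_steps // 2                          # s = 0.5, t = 1.0
incr = X[:, -1] - X[:, s_idx]
var_est = np.var(incr)                        # predicted (1 - 0.5**3) / 3
corr = np.corrcoef(X[:, s_idx], incr)[0, 1]   # predicted 0
```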
[Figure 29.1: Range of values of $y$, $z$, $v$ for the integrals in the proof of Theorem 1.70 (panels (a), (b), (c)).]
Theorem 1.70 Let $W(t)$ be a Brownian motion, and let $\delta(t)$ and $h(t)$ be nonrandom functions. Define
$$X(t) = \int_0^t \delta(u)\, dW(u), \qquad Y(t) = \int_0^t h(u)\, X(u)\, du.$$
Then $Y$ is a Gaussian process with mean function $m_Y(t) = 0$ and covariance function
$$\rho_Y(s,t) = \int_0^{s \wedge t} \delta^2(v) \Big( \int_v^s h(y)\, dy \Big) \Big( \int_v^t h(y)\, dy \Big)\, dv. \qquad (1.3)$$
Proof: (Partial) Computation of $\rho_Y(s,t)$: Let $0 \leq s \leq t$ be given. It is shown in a homework problem that $(Y(s), Y(t))$ is a jointly normal pair of random variables. Here we observe that
$$m_Y(t) = \mathbb{E}\,Y(t) = \int_0^t h(u)\, \mathbb{E}\,X(u)\, du = 0,$$
and we verify that (1.3) holds.
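Formula (1.3) can also be checked by simulation (an illustrative sketch; the choices $\delta \equiv 1$, $h \equiv 1$, $s = 0.5$, $t = 1$ are mine). With these choices $X = W$ and $Y(t) = \int_0^t W(u)\, du$, and (1.3) predicts $\rho_Y(s,t) = \int_0^{s \wedge t} (s - v)(t - v)\, dv$, which for $s = 0.5$, $t = 1$ evaluates to $5/48 \approx 0.1042$.

```python
import numpy as np

# Check of Theorem 1.70 with delta = 1, h = 1 (arbitrary choices), so
# X = W and Y(t) = \int_0^t W(u) du.  Formula (1.3) then gives
#   rho_Y(s, t) = \int_0^{s^t} (s - v)(t - v) dv,  e.g. rho_Y(0.5, 1) = 5/48.
rng = np.random.default_rng(3)
n_paths, n_steps = 100_000, 200
dt = 1.0 / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)
# Left-endpoint Riemann sum for Y(t) = \int_0^t W(u) du.
Y = np.concatenate(
    [np.zeros((n_paths, 1)), np.cumsum(W[:, :-1] * dt, axis=1)], axis=1
)

s_idx = n_steps // 2                        # s = 0.5, t = 1.0
cov_est = np.mean(Y[:, s_idx] * Y[:, -1])   # predicted 5/48, about 0.1042
```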