
Lecture 2
Stochastic Processes
2.1 Introduction
Spectral analysis is the study of models of a class called stationary stochastic processes. A stochastic process $\{X(t) : t \in T\}$ is a family of rv's indexed by a variable $t$, where $t$ ranges over an index set $T$, which may be infinite. If the index set is continuous we write $X(t)$; if it is discrete we write $X_t$. These may be real, vector valued, or complex, with suitable added indices.
We will use the Riemann-Stieltjes notation in what follows because mixed continuous-discrete distributions are common for time series, and hence the R-S notation is standard in stochastic theory. Let $g(x)$ and $H(x)$ be real valued functions on $[L, U]$, where $L < U$ and $L, U$ may be $-\infty, \infty$ with suitable limiting processes. Let $P_N$ be a partition of $[L, U]$ into $N$ intervals with $N + 1$ points
\[
L = x_0 < x_1 < \cdots < x_N = U
\]
Define the mesh fineness:
\[
|P_N| = \max\{x_1 - x_0,\ x_2 - x_1,\ \ldots,\ x_N - x_{N-1}\}
\]
Then
\[
\int_L^U g(x)\,dH(x) = \lim_{|P_N| \to 0} \sum_{j=1}^{N} g(x'_j)\,[H(x_j) - H(x_{j-1})]
\]
where $x'_j \in [x_{j-1}, x_j]$. There are 3 cases:
1. If $H(x) = x$ then we have the Riemann integral
\[
\int_L^U g(x)\,dx
\]
2. If $H(x)$ is continuously differentiable on $[L, U]$ with $h(x) = \partial_x H(x)$, then
\[
\int_L^U g(x)\,dH(x) = \int_L^U g(x)\,h(x)\,dx
\]
3. If $H(x)$ undergoes a step change of size $b_i$ at each $a_i$ on $[L, U]$, so that
\[
H(x) = c_i \quad \text{for } L \le x < a_i, \qquad H(x) = c_i + b_i \quad \text{for } a_i \le x \le U,
\]
then
\[
\int_L^U g(x)\,dH(x) = \sum_i b_i\, g(a_i)
\]
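As a sanity check on case 3, the partition sum defining the R-S integral can be compared numerically against $\sum_i b_i\, g(a_i)$. This is a minimal sketch with hypothetical jump locations and sizes, not values from the text:

```python
import numpy as np

def rs_integral(g, H, L, U, N=100_000):
    """Approximate the Riemann-Stieltjes integral of g dH on [L, U]
    by the partition sum  sum_j g(x_j) [H(x_j) - H(x_{j-1})]."""
    x = np.linspace(L, U, N + 1)
    return np.sum(g(x[1:]) * (H(x[1:]) - H(x[:-1])))

# Case 3: a pure step function H with jumps of size b_i at a_i
a = np.array([0.25, 0.5, 0.75])   # hypothetical jump locations
b = np.array([1.0, 2.0, 0.5])     # hypothetical jump sizes
H = lambda x: np.sum(b * (np.asarray(x)[..., None] >= a), axis=-1)
g = lambda x: x**2

approx = rs_integral(g, H, 0.0, 1.0)
exact = np.sum(b * g(a))          # sum_i b_i g(a_i) = 0.84375
print(approx, exact)              # the two agree to partition resolution
```

Evaluating $g$ at the right endpoint of each subinterval is one valid choice of $x'_j \in [x_{j-1}, x_j]$; as the mesh is refined, only the subintervals containing a jump contribute.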
Example 1 For a continuous process, we have $f(x) = \partial_x F(x)$ and hence:
\[
E[X] = \int_{-\infty}^{\infty} x\,dF(x) = \int_{-\infty}^{\infty} x f(x)\,dx
\]
Example 2 For a discrete process where the cdf $F(x)$ undergoes discrete jumps of size $1/N$ at a set of values $\{x_i\}$:
\[
E[X] = \int_{-\infty}^{\infty} x\,dF(x) = \frac{1}{N} \sum_{j=1}^{N} x_j
\]
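In other words, the R-S integral against a cdf with equal jumps collapses to a plain sample average. A one-line check with hypothetical values:

```python
import numpy as np

# cdf jumps of size 1/N at each x_i, so the R-S integral collapses
# to a sample average
x = np.array([1.0, 2.0, 2.0, 5.0])   # hypothetical values
N = len(x)
EX = np.sum(x) / N
print(EX)                            # 2.5, identical to np.mean(x)
```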
Example 3 For a fixed value of $t$, $X_t$ is an rv and hence has a cdf, where
\[
F_t(a) = P[X_t \le a]
\]
with
\[
E[X_t] = \int_{-\infty}^{\infty} x\,dF_t(x) = \mu_t, \qquad \mathrm{var}[X_t] = \int_{-\infty}^{\infty} (x - \mu_t)^2\,dF_t(x) = \sigma_t^2
\]
Note that the statistics become time dependent. We also may need higher order cdf's, like the bivariate cdf for two times:
\[
F_{t_1, t_2}(a_1, a_2) = P[X_{t_1} \le a_1,\ X_{t_2} \le a_2]
\]
and the $N$ dimensional generalization:
\[
F_{t_1, \ldots, t_N}(a_1, \ldots, a_N) = P[X_{t_1} \le a_1,\ \ldots,\ X_{t_N} \le a_N]
\]
The set of cdf's from $F_t$ to $F_{t_1, \ldots, t_N}$ is a complete description of the stochastic process if we know them for all $t$ and $N$. However, the result is unwieldy and the distributions are unknowable in practice.
We can start to narrow this down by considering stationary processes: processes whose statistical properties are independent of time, or physical systems which are in steady state.
If $\{X_t\}$ is a result of a stationary process, then each element must have the same cdf, and $F_t(x) \to F(x)$. Any pair of elements in $\{X_t\}$ must have the same bivariate distribution, etc. In summary, the joint cdf of $\{X_t\}$ for a set of $N$ time points $\{t_i\}$ must be unaltered by time shifts.
There are several cases of stationarity:
Complete stationarity
If the joint cdf of $\{X_{t_1}, \ldots, X_{t_N}\}$ is identical to that of $\{X_{t_1 + k}, \ldots, X_{t_N + k}\}$ for any shift $k$, then the process is completely stationary. All of the statistical structure is unchanged under shifts in the time origin. This is a severe requirement and rarely establishable in practice.
Stationarity of order 1
$E[X_t] = \mu$ for all $t$. No other stationarity is implied.
Stationarity of order 2
$E[X_t] = \mu$ and $E[X_t^2] = \mu_2$, so that the mean and the variance are time independent. $E[X_t X_s]$ is a function of $|t - s|$ only, and hence $\mathrm{cov}[X_t, X_s]$ is a function of $|t - s|$ only.
This class is called weakly stationary or second order stationary, and is the most
important type of stochastic process for our purposes.

For a second order stationary process, we define the autocovariance sequence (acvs) by
\[
S_\tau = \mathrm{cov}[X_t, X_{t+\tau}] = \mathrm{cov}[X_0, X_\tau]
\]
This is a measure of the covariance between members of the process separated by $\tau$ time units; $\tau$ is called the lag. We would expect $S_\tau$ to be largest at $\tau = 0$ and to be symmetric about the origin.
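In practice $S_\tau$ must be estimated from a single realization. Here is a minimal sketch, assuming the common $1/N$-normalized ("biased") sample estimator and a hypothetical MA(1) series $X_t = \epsilon_t + \theta\,\epsilon_{t-1}$ with unit-variance noise; neither choice comes from the text:

```python
import numpy as np

def sample_acvs(x, max_lag):
    """Biased sample acvs:
    S_hat[tau] = (1/N) * sum_t (x_t - xbar)(x_{t+tau} - xbar)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    xc = x - x.mean()
    return np.array([np.dot(xc[:N - tau], xc[tau:]) / N
                     for tau in range(max_lag + 1)])

# Hypothetical MA(1) series, second order stationary with
# S_0 = 1 + theta^2, S_1 = theta, S_tau = 0 for tau >= 2
rng = np.random.default_rng(0)
eps = rng.standard_normal(100_000)
theta = 0.5
x = eps[1:] + theta * eps[:-1]
S = sample_acvs(x, 3)
print(S[:3])    # approximately [1.25, 0.5, 0.0]
```

Note that $\hat S_0$ is just the sample variance, and this estimator automatically satisfies $|\hat S_\tau| \le \hat S_0$.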
2.2 Properties of the acvs
1. $S_0 = \sigma^2$
2. $S_{-\tau} = S_\tau$ (even function)
3. $|S_\tau| \le S_0$ for $\tau > 0$
4. $S_\tau$ is positive semidefinite:
\[
\sum_{j=1}^{N} \sum_{k=1}^{N} S_{t_j - t_k}\, a_j a_k \ge 0 \quad \text{for any real } \{a_1, \ldots, a_N\},
\]
or in matrix form $a^T \Sigma a \ge 0$, where $\Sigma$ is the covariance matrix.
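Property 4 can be illustrated numerically: form the matrix $\Sigma_{jk} = S_{t_j - t_k}$ and confirm that its eigenvalues are nonnegative. The exponentially decaying acvs $S_\tau = \sigma^2 \phi^{|\tau|}$ used below is a hypothetical example (it is the acvs of a first order autoregressive process), not taken from the text:

```python
import numpy as np

sigma2, phi, N = 2.0, 0.6, 8
S = lambda tau: sigma2 * phi ** np.abs(tau)   # hypothetical AR(1)-type acvs

t = np.arange(N)
Sigma = S(t[:, None] - t[None, :])            # Sigma[j, k] = S_{t_j - t_k}
eigvals = np.linalg.eigvalsh(Sigma)
print(eigvals.min())                          # nonnegative: Sigma is PSD

# equivalently, a^T Sigma a >= 0 for an arbitrary real vector a
a = np.array([1.0, -2.0, 0.5, 3.0, -1.0, 0.0, 2.0, -0.5])
print(a @ Sigma @ a)                          # nonnegative
```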
The autocorrelation sequence is the acvs normalized by $S_0$:
\[
\rho_\tau = \frac{S_\tau}{S_0}
\]
and has properties:
1. $\rho_0 = 1$
2. $\rho_{-\tau} = \rho_\tau$ for $\tau > 0$
3. $|\rho_\tau| \le 1$ for $\tau > 0$
4. $\rho_\tau$ is positive semidefinite
Note that a completely stationary process is also second order stationary, but second order stationarity does not imply complete stationarity. However, if the process is Gaussian (i.e., the joint cdfs of the rv's are multivariate normal), then second order stationarity does imply complete stationarity, because a Gaussian distribution is completely specified by its first and second moments.
All of this machinery extends to complex processes. Let $Z_t = X_{t,1} + iX_{t,2}$. This is second order stationary if all of the joint first and second order moments of $X_{t,1}$ and $X_{t,2}$ exist, are finite, and are invariant to shifts in time. This implies that $X_{t,1}$ and $X_{t,2}$ are themselves second order stationary. We have
\[
E[Z_t] = \mu_1 + i\mu_2 = \mu
\]
\[
\mathrm{cov}[Z_{t_1}, Z_{t_2}] = E[(Z_{t_1} - \mu)^* (Z_{t_2} - \mu)] = S_\tau
\]
where $\tau = t_2 - t_1$, and hence $S_{-\tau} = S_\tau^*$ for a complex process.
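The conjugate symmetry $S_{-\tau} = S_\tau^*$ can be illustrated on the sample acvs of a simulated complex series; the $1/N$ sample estimator and the Gaussian draws below are illustrative assumptions, not part of the text:

```python
import numpy as np

def sample_acvs_c(z, tau):
    """Sample acvs of a complex series:
    S_hat[tau] = (1/N) * sum_t conj(z_t - mu)(z_{t+tau} - mu)."""
    zc = np.asarray(z) - np.mean(z)
    N = len(zc)
    if tau >= 0:
        return np.sum(np.conj(zc[:N - tau]) * zc[tau:]) / N
    return np.sum(np.conj(zc[-tau:]) * zc[:N + tau]) / N

rng = np.random.default_rng(1)
z = rng.standard_normal(500) + 1j * rng.standard_normal(500)
for m in (1, 2, 3):
    # S_{-tau} equals the complex conjugate of S_tau
    print(np.allclose(sample_acvs_c(z, -m), np.conj(sample_acvs_c(z, m))))
```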
2.3 Examples of stationary processes
Let $\{X_t\}$ be a sequence of uncorrelated rv's such that
\[
E[X_t] = \mu, \qquad \mathrm{var}[X_t] = \sigma^2, \qquad \mathrm{cov}[X_t, X_{t+\tau}] = 0 \ \text{for } \tau \ne 0 \ \text{(follows from uncorrelatedness)}
\]
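These defining moments can be checked on a long simulated realization. The Gaussian draws below are only an illustrative choice, since the definition requires nothing beyond uncorrelatedness; $\mu$ and $\sigma$ are hypothetical values:

```python
import numpy as np

mu, sigma = 3.0, 2.0
rng = np.random.default_rng(42)
x = mu + sigma * rng.standard_normal(200_000)   # uncorrelated draws

print(x.mean())                  # close to mu
print(x.var())                   # close to sigma^2
xc = x - x.mean()
for tau in (1, 2, 5):
    # sample covariance at lag tau: close to 0 for tau != 0
    print(np.dot(xc[:-tau], xc[tau:]) / len(x))
```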