Equation (6-12) is assumed, as usual, to be one record of a steady-state repetitive sequence. Note that the "flip" of x(n) does not occur as it did in Eq. (5-4) for convolution. We only want to compare the sequence with an exact time-shifted replica. Note also the division by N, because C_A(τ) is by definition a time-averaged value for each τ and convolution is not. As such, it measures the average power commonality of the two sequences as a function of their separation in time. When the shift τ = 0, C_A(τ) = C_A(0) and Eq. (6-12) reduces to Eq. (6-5), which is by definition the average power for (x + ε_x)_n.
Figure 6-4 is an example of the autocorrelation of a sequence in part (a) (no noise) and the identical shifted (τ = 13) sequence in parts (b) and (c). There are three overlaps, and the values of the autocorrelation vs. overlap, which is the sum of partial products (polynomial multiplication), are shown in part (c). The correlation value for τ = 13 is

    C_A(13) = [(1)(0.1875) + (0.9375)(0.125) + (0.875)(0.0625)] / 16 = 0.0225

This value is indicated in part (c), third from the left and also third from the right. This procedure is repeated for each value of τ. At τ = 0, parts (a) and (b) are fully overlapping, and the value shown in part (c) is 0.365. For these two identical sequences, the maximum autocorrelation occurs at τ = 0 and the value 0.365 is the average power in the sequence.
Compare Fig. 6-4 with Fig. 5-4 to see how circular autocorrelation is performed. We can also see that x1(n) and x2(n) have 16 positions and the autocorrelation sequence has 33 (= 16 + 16 + 1) positions, which demonstrates the same smoothing and stretching effect in autocorrelation that we saw in convolution. As we decided in Chapter 5, the extra effort in circular correlation is not usually necessary, and we can work around it.
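A quick numerical check of Eq. (6-12) and the Fig. 6-4 example can be written in a few lines. The sketch below (Python; the function name autocorrelation is illustrative, and the ramp x(n) = 1 − n/16 is inferred from the partial products listed above) reproduces 0.0225 at τ = 13 and about 0.365 at τ = 0.

```python
import numpy as np

# Sketch of Eq. (6-12): autocorrelation of the ramp sequence used in Fig. 6-4.
N = 16
n = np.arange(N)
ramp = 1.0 - n / N               # x(n) = 1 - n/16, the Fig. 6-4 sequence (no noise)

def autocorrelation(x, tau):
    """C_A(tau) = (1/N) * sum over n of x(n) * x(n + tau), zero outside 0..N-1."""
    N = len(x)
    total = 0.0
    for k in range(N):
        if 0 <= k + tau < N:     # linear (non-circular) shift, as in Fig. 6-4
            total += x[k] * x[k + tau]
    return total / N

print(autocorrelation(ramp, 13))   # ~0.0225, the worked value above
print(autocorrelation(ramp, 0))    # ~0.365, the average power in the sequence
```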
Cross-Correlation
Two different waveforms can be completely or partially dependent or completely independent. In each of these cases the two noise-contaminated waveforms are time-shifted with respect to each other in increments of τ.
[Figure 6-4 shows three panels: (a) the sequence x1(n), (b) the shifted replica x2(n + 13), and (c) the autocorrelation z(τ). The Mathcad definitions in the figure are N := 16; x1(n) := x2(n) := 1 − n/16 for 0 ≤ n ≤ N (0 otherwise); and z(τ) := (1/N) Σ_{n=0}^{N−1} x1(n)·x2(τ + n), evaluated for τ = −N .. N.]
Figure 6-4 Example of autocorrelation.
Equation (6-13) is the basic equation for the cross-correlation of two different waves x(n) and y(n):

    C_C(τ) = (1/N) Σ_{n=0}^{N−1} (x + ε_x)_n (y + ε_y)_{n+τ}        (6-13)
We have pointed out one major difference between the correlation and convolution equations. In correlation there is no "flip" of one of the waves, as explained in Chapter 7. This is in agreement with the desire to compare a wave with a time-shifted replica of itself, or a replica of two different waves, one of which is time-shifted with respect to the other. In the case of convolution we derived a useful relationship for the Fourier transform of convolution. In Chapter 7, correlation leads to another useful idea in linear analysis, called the Wiener-Khintchine (see Google, e.g.) principle.

Figure 6-5 (with no noise) is an example of cross-correlation. The two time-domain sequences can have different lengths, different shapes, and different amplitude scale factors. The maximum value of cross-correlation occurs at τ = −3 and −4, which is quite a bit different from Fig. 6-4. At τ = 0 the correlation is 0.096, and at τ = −3 and −4 the correlation is about 0.149, so the correlation in the overlap area increases 10·log(0.149/0.096) = 1.90 dB. Recall that for each value of τ the area of overlap (sum of products as in Fig. 6-4) of the two sequences represents a value of common power. This value is the power that the two different waves deliver in combination.
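A direct evaluation of Eq. (6-13) follows the same pattern as the autocorrelation sketch above. The fragment below is again Python with illustrative names; the exponential shapes only approximate the sequences plotted in Fig. 6-5, so the printed values should not be expected to match the figure exactly.

```python
import numpy as np

# Sketch of Eq. (6-13): cross-correlation of two different sequences.
# The sequence shapes are modeled loosely on Fig. 6-5 and are assumptions,
# not the book's exact Mathcad definitions.
N = 16
n = np.arange(N)
x = 1.0 - np.exp(-0.25 * n)      # rising exponential, roughly Fig. 6-5(a)
y = np.exp(-0.25 * n)            # decaying exponential, roughly Fig. 6-5(b)

def cross_correlation(x, y, tau):
    """C_C(tau) = (1/N) * sum over n of x(n) * y(n + tau), zero outside 0..N-1."""
    N = len(x)
    total = 0.0
    for k in range(N):
        if 0 <= k + tau < N:     # replica shifted by tau, no circular wrap
            total += x[k] * y[k + tau]
    return total / N

# Scan the shift range and report where the common power peaks
values = {tau: cross_correlation(x, y, tau) for tau in range(-N, N + 1)}
peak_tau = max(values, key=values.get)
print(peak_tau, round(values[peak_tau], 3))
```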
The correlation sequences in Figs. 6-4 and 6-5 are τ-domain power
sequences. These power sequences can also have complex (real watts
and imaginary vars) frequency-domain components, just like any other
time-domain sequence. The result is a power spectrum (Chapter 7) of
correlation parameter τ.
AUTOCOVARIANCE

The calculation of autocorrelation can produce an average term, perhaps dc, which may not be useful or desired for statistical analysis reasons and should be eliminated. Autocovariance, Eq. (6-14), accomplishes this.
[Figure 6-5 shows three panels: (a) the sequence x1(n), (b) the sequence x2(n + 15), and (c) the cross-correlation z(τ). The Mathcad definitions in the figure are N := 16; x1(n) := 1 − exp(−0.25·n) for 0 ≤ n ≤ N (0 otherwise); x2(n) := exp(−0.25·n) for 0 < n ≤ N (0 otherwise); and z(τ) := (1/N) Σ_{n=0}^{N−1} x1(n)·x2(τ + n), evaluated for τ = −N .. N.]
Figure 6-5 Example of cross-correlation.
Equation (6-14) removes the time average ⟨x(n)⟩, and the result is a good approximation to the ac value expected. Many repetitions of Eq. (6-14), followed by averaging, can greatly improve the accuracy. This equation also leads to an ac energy or power result as a function of τ. If τ = 0, the result is the average ac signal plus noise power in the x(n) signal.

    C_acv(τ) = (1/N) Σ_{n=0}^{N−1} [(x + ε)_n − ⟨x(n)⟩] · [(x + ε)_{n+τ} − ⟨x(n)⟩]        (6-14)
An example of autocovariance is the same as Fig. 6-4, which has been modified to remove the dc component.
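A minimal sketch of Eq. (6-14), assuming the same ramp sequence used for Fig. 6-4 (the function name autocovariance is illustrative, not from the text): the only change from the autocorrelation sketch above is that the time average of x(n) is subtracted before the products are formed.

```python
import numpy as np

# Sketch of Eq. (6-14): autocovariance = autocorrelation with the mean removed.
N = 16
n = np.arange(N)
x = 1.0 - n / N                   # the Fig. 6-4 ramp sequence (no noise)

def autocovariance(x, tau):
    """C_acv(tau) = (1/N) * sum over n of [x(n) - <x>] * [x(n+tau) - <x>]."""
    N = len(x)
    xm = x - np.mean(x)           # remove the time average (dc term)
    total = 0.0
    for k in range(N):
        if 0 <= k + tau < N:
            total += xm[k] * xm[k + tau]
    return total / N

# At tau = 0 this is the average ac power: the total power (~0.365) minus the dc power
print(autocovariance(x, 0))
```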
Cross-Covariance
The same modification of the cross-correlation of two separate waves, x(n) and y(n), eliminating ⟨x(n)⟩ and ⟨y(n)⟩, produces the cross-covariance

    C_CCV(τ) = (1/N) Σ_{n=0}^{N−1} [(x + ε_x)_n − ⟨x(n)⟩] · [(y + ε_y)_{n+τ} − ⟨y(n)⟩]        (6-15)
The cross-covariance is the ac signal power plus noise power that is common to x(n) and y(n) as a function of shift τ. The result is the relatedness of the two ac signal powers. At any value of τ the result is the total ac power for that τ. Again, the result should be averaged over many repetitions.
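The corresponding sketch of Eq. (6-15) differs from the autocovariance sketch only in that each sequence has its own time average removed; the names and example sequences are again illustrative assumptions.

```python
import numpy as np

# Sketch of Eq. (6-15): cross-covariance of two different sequences,
# each with its own time average (dc term) removed first.
def cross_covariance(x, y, tau):
    N = len(x)
    xm = x - np.mean(x)
    ym = y - np.mean(y)
    total = 0.0
    for k in range(N):
        if 0 <= k + tau < N:
            total += xm[k] * ym[k + tau]
    return total / N

# Example with the Fig. 6-5-style exponentials (assumed shapes)
n = np.arange(16)
x = 1.0 - np.exp(-0.25 * n)
y = np.exp(-0.25 * n)
print(round(cross_covariance(x, y, 0), 4))
```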
Correlation Coefficient

This is an important dimensionless number in statistics, in addition to those just considered. Its value lies between −1.0 and +1.0 and it is a measure of the "relatedness" in some sense (to be decided by the user) between two possibly noise-contaminated sequences x(n) and y(n). The value −1.0 means "negatively" related, +1.0 means "positively" related, |ρ_xy| = 1 means completely related one way or the other, and 0 means that x(n) and y(n) are completely unrelated (independent). The basic equation
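A minimal sketch of the conventional normalized form of this coefficient (the cross-covariance at zero shift divided by the product of the rms deviations of the two sequences) is shown below; this is the standard statistical definition, assumed here for illustration, and is not necessarily the exact form the text goes on to give.

```python
import numpy as np

# Conventional correlation coefficient (normalized cross-covariance at zero shift).
# Standard statistical definition, assumed for illustration; bounded by -1.0 and +1.0.
def correlation_coefficient(x, y):
    xm = x - np.mean(x)
    ym = y - np.mean(y)
    return np.sum(xm * ym) / np.sqrt(np.sum(xm ** 2) * np.sum(ym ** 2))

n = np.arange(16)
print(correlation_coefficient(1.0 - n / 16, n / 16))        # -1.0: perfectly negatively related
print(correlation_coefficient(1.0 - n / 16, 1.0 - n / 16))  # +1.0: perfectly positively related
```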