
Giannakis, G.B. “Cyclostationary Signal Analysis”
Digital Signal Processing Handbook
Ed. Vijay K. Madisetti and Douglas B. Williams
Boca Raton: CRC Press LLC, 1999
© 1999 by CRC Press LLC
17
Cyclostationary Signal Analysis

Georgios B. Giannakis
University of Virginia

17.1 Introduction
17.2 Definitions, Properties, Representations
17.3 Estimation, Time-Frequency Links, Testing
     Estimating Cyclic Statistics • Links with Time-Frequency Representations • Testing for Cyclostationarity
17.4 CS Signals and CS-Inducing Operations
     Amplitude Modulation • Time Index Modulation • Fractional Sampling and Multivariate/Multirate Processing • Periodically Varying Systems
17.5 Application Areas
     CS Signal Extraction • Identification and Modeling
17.6 Concluding Remarks
Acknowledgments
References
17.1 Introduction
Processes encountered in statistical signal processing, communications, and time series analysis
applications are often assumed stationary. The plethora of available algorithms testifies to the need
for processing and spectral analysis of stationary signals (see, e.g., [42]). Due to the varying nature
of physical phenomena and certain man-made operations, however, time-invariance and the related
notion of stationarity are often violated in practice. Hence, study of time-varying systems and
nonstationary processes is well motivated.
Research in nonstationary signals and time-varying systems has led both to the development of adaptive algorithms and to several elegant tools, including short-time (or running) Fourier transforms, time-frequency representations such as the Wigner-Ville (a member of Cohen's class of distributions), Loeve's and Karhunen's expansions (leading to the notion of evolutionary spectra), and time-scale representations based on wavelet expansions (see [37, 45] and references therein). Adaptive algorithms derived from stationary models assume slow variations in the underlying system. On the other hand, time-frequency and time-scale representations promise applicability to general nonstationarities and provide useful visual cues for preprocessing. When it comes to nonstationary signal analysis and estimation in the presence of noise, however, they assume availability of multiple independent realizations.
In fact, it is impossible to perform spectral analysis, detection, and estimation tasks on signals
involving generally unknown nonstationarities, when only a single data record is available. For
instance, consider extracting a deterministic signal s(n) observed in stationary noise v(n), using
regression techniques based on nonstationary data x(n) = s(n) + v(n), n = 0, 1, ..., N−1. Unless s(n) is finitely parameterized by a d_{θ_s} × 1 vector θ_s (with d_{θ_s} < N), the problem is ill-posed because adding a new datum, say x(n_0), adds a new unknown, s(n_0), to be determined. Thus, only structured nonstationarities can be handled when rapid variations are present; and only for classes of finitely parameterized nonstationary processes can reliable statistical descriptors be computed using a single time series. One such class is that of (wide-sense) cyclostationary processes, which are characterized by the periodicity they exhibit in their mean, correlation, or spectral descriptors.
An overview of cyclostationary signal analysis and its applications is the main goal of this section. Periodicity is omnipresent in physical as well as man-made processes, and cyclostationary signals occur in various real-life problems entailing phenomena and operations of a repetitive nature: communications [15], geophysical and atmospheric sciences (hydrology [66], oceanography [14], meteorology [35], and climatology [4]), rotating machinery [43], econometrics [50], and biological systems [48].
In 1961 Gladysev [34] introduced key representations of cyclostationary time series, while in 1969 Hurd's thesis [38] offered an excellent introduction to continuous-time cyclostationary processes. Since 1975 [22], Gardner and co-workers have contributed to the theory of continuous-time cyclostationary signals, and especially their applications to communications engineering. Gardner [15] adopts a "non-probabilistic" viewpoint of cyclostationarity (see [19] for an overview and also [36] and [18] for comments on this approach). Responding to a recent interest in digital periodically varying systems and cyclostationary time series, the exposition here is probabilistic and focuses on discrete-time signals and systems, with emphasis on their second-order statistical characterization and their applications to signal processing and communications.
The material in the remaining sections is organized as follows: Section 17.2 provides definitions,
properties, and representations of cyclostationary processes, along with their relations with stationary
and general classes of nonstationary processes. Testing a time series for cyclostationarity and retrieval
of possibly hidden cycles along with single record estimation of cyclic statistics are the subjects
of Section 17.3. Typical signal classes and operations inducing cyclostationarity are delineated in
Section 17.4 to motivate the key uses and selected applications described in Section 17.5. Finally,
Section 17.6 concludes and presents trade-offs, topics not covered, and future directions.
17.2 Definitions, Properties, Representations
Let x(n) be a discrete-index random process (i.e., a time series) with mean µ_x(n) := E{x(n)}, and covariance c_xx(n; τ) := E{[x(n) − µ_x(n)][x(n + τ) − µ_x(n + τ)]}. For x(n) complex valued, let also c̄_xx(n; τ) := c_{xx*}(n; τ), where * denotes complex conjugation, and n, τ are in the set of integers Z.
DEFINITION 17.1 Process x(n) is (wide-sense) cyclostationary (CS) iff there exists an integer P such that µ_x(n) = µ_x(n + lP), c_xx(n; τ) = c_xx(n + lP; τ), or, c̄_xx(n; τ) = c̄_xx(n + lP; τ), ∀ n, l ∈ Z. The smallest of all such P's is called the period. Being periodic, they all accept Fourier Series expansions over complex harmonic cycles, with the set of cycles defined as A^c_{xx} := {α_k = 2πk/P, k = 0, ..., P − 1}; e.g., c_xx(n; τ) and its Fourier coefficients, called cyclic correlations, are related by:

$$
c_{xx}(n;\tau) = \sum_{k=0}^{P-1} C_{xx}\!\left(\frac{2\pi}{P}k;\tau\right) e^{j\frac{2\pi}{P}kn}
\;\overset{\mathrm{FS}}{\longleftrightarrow}\;
C_{xx}\!\left(\frac{2\pi}{P}k;\tau\right) = \frac{1}{P}\sum_{n=0}^{P-1} c_{xx}(n;\tau)\, e^{-j\frac{2\pi}{P}kn} .
\qquad (17.1)
$$
Strict sense cyclostationarity, or, periodic (non-)stationarity, can also be defined in terms of probability distributions or density functions when these functions vary periodically (in n). But the focus in engineering is on periodically and almost periodically correlated¹ time series, since real data are often zero-mean, correlated, and with unknown distributions. Almost periodicity is very common in discrete time because sampling a continuous-time periodic process will rarely yield a discrete-time periodic signal; e.g., sampling cos(ω_c t + θ) every T_s seconds results in cos(ω_c nT_s + θ), for which an integer period exists only if ω_c T_s = 2π/P. Because 2π/(ω_c T_s) is "almost an integer" period, such signals accept generalized (or limiting) Fourier expansions (see also Eq. (17.2) and [9] for rigorous definitions of almost periodic functions).
DEFINITION 17.2 Process x(n) is (wide-sense) almost cyclostationary (ACS) iff its mean and correlation(s) are almost periodic sequences. For x(n) zero-mean and real, the time-varying and cyclic correlations are defined as the generalized Fourier Series pair:

$$
c_{xx}(n;\tau) = \sum_{\alpha_k \in A^c_{xx}} C_{xx}(\alpha_k;\tau)\, e^{j\alpha_k n}
\;\overset{\mathrm{FS}}{\longleftrightarrow}\;
C_{xx}(\alpha_k;\tau) = \lim_{N\to\infty} \frac{1}{N}\sum_{n=0}^{N-1} c_{xx}(n;\tau)\, e^{-j\alpha_k n} .
\qquad (17.2)
$$

The set of cycles, A^c_{xx}(τ) := {α_k : C_xx(α_k; τ) ≠ 0, −π < α_k ≤ π}, must be countable and the limit is assumed to exist at least in the mean-square sense [9, Thm. 1.15].
Definition 17.2 and Eq. (17.2) for ACS subsume the CS Definition 17.1 and Eq. (17.1); note that the latter require an integer period and a finite set of cycles. In the α-domain, ACS signals exhibit lines, but not necessarily at harmonically related cycles. The following example illustrates the cyclic quantities defined thus far:
EXAMPLE 17.1: Harmonic in multiplicative and additive noise

Let
$$
x(n) = s(n)\cos(\omega_0 n) + v(n) ,
\qquad (17.3)
$$
where s(n), v(n) are assumed real, stationary, and mutually independent. Such signals appear when communicating through flat-fading channels, and with weather radar or sonar returns when, in addition to sensor noise v(n), backscattering, target scintillation, or fluctuating propagation media give rise to random amplitude variations modeled by s(n) [33]. We will consider two cases:

Case 1: µ_s ≠ 0. The mean in (17.3) is µ_x(n) = µ_s cos(ω_0 n) + µ_v, and the cyclic mean:
$$
C_x(\alpha) := \lim_{N\to\infty} \frac{1}{N}\sum_{n=0}^{N-1} \mu_x(n)\, e^{-j\alpha n}
= \frac{\mu_s}{2}\left[\delta(\alpha-\omega_0) + \delta(\alpha+\omega_0)\right] + \mu_v\,\delta(\alpha) ,
\qquad (17.4)
$$
where in (17.4) we used the definition of Kronecker's delta
$$
\lim_{N\to\infty} \frac{1}{N}\sum_{n=0}^{N-1} e^{j\alpha n} = \delta(\alpha) :=
\begin{cases} 1 & \alpha = 0 \\ 0 & \text{else} \end{cases} .
\qquad (17.5)
$$
¹The term cyclostationarity is due to Bennett [3]. Cyclostationary processes in economics and atmospheric sciences are also referred to as seasonal time series [50].
Signal x(n) in (17.3) is thus (first-order) cyclostationary with set of cycles A^c_x = {±ω_0, 0}. If X_N(ω) := Σ_{n=0}^{N−1} x(n) exp(−jωn), then from (17.4) we find C_x(α) = lim_{N→∞} N^{−1} E{X_N(α)}; thus, the cyclic mean can be interpreted as an averaged DFT, and ω_0 can be retrieved by picking the peak of |X_N(ω)| for ω ≠ 0.
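To make the averaged-DFT interpretation concrete, here is a minimal NumPy sketch (the signal parameters, seed, and the helper name cyclic_mean are illustrative choices, not taken from the chapter): it simulates (17.3) with µ_s ≠ 0 and recovers ω_0 by peak-picking |X_N(ω)| away from ω = 0.

```python
import numpy as np

def cyclic_mean(x):
    """Normalized DFT (1/N) * sum_n x(n) e^{-j alpha n} on the FFT grid alpha = 2*pi*k/N."""
    return np.fft.fft(x) / len(x)

# Illustrative parameters (not from the chapter)
rng = np.random.default_rng(0)
N, w0 = 4096, 2 * np.pi * 0.11
n = np.arange(N)
s = 1.0 + 0.5 * rng.standard_normal(N)     # real, stationary, with mu_s = 1 (Case 1)
v = 0.3 * rng.standard_normal(N)           # additive stationary noise
x = s * np.cos(w0 * n) + v                 # Eq. (17.3)

C = cyclic_mean(x)
alpha = 2 * np.pi * np.fft.fftfreq(N)      # cycle grid in [-pi, pi)
mask = np.abs(alpha) > 1e-6                # exclude alpha = 0
print(np.abs(alpha[mask][np.argmax(np.abs(C[mask]))]))   # peak sits near w0 ~ 0.69
```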
Case 2: µ_s = 0. From (17.3) we find the correlation c_xx(n; τ) = c_ss(τ)[cos(2ω_0 n + ω_0 τ) + cos(ω_0 τ)]/2 + c_vv(τ). Because c_xx(n; τ) is periodic in n, x(n) is (second-order) CS with cyclic correlation [c.f. (17.2) and (17.5)]
$$
C_{xx}(\alpha;\tau) = \frac{c_{ss}(\tau)}{4}\left[\delta(\alpha+2\omega_0)\,e^{j\omega_0\tau} + \delta(\alpha-2\omega_0)\,e^{-j\omega_0\tau}\right]
+ \left[\frac{c_{ss}(\tau)}{2}\cos(\omega_0\tau) + c_{vv}(\tau)\right]\delta(\alpha) .
\qquad (17.6)
$$
The set of cycles is A^c_{xx}(τ) = {±2ω_0, 0}, provided that c_ss(τ) ≠ 0 and c_vv(τ) ≠ 0. The set A^c_{xx}(τ) is lag-dependent in the sense that some cycles may disappear while others may appear for different τ's. To illustrate the τ-dependence, let s(n) be an MA process of order q. Clearly, c_ss(τ) = 0 for |τ| > q, and thus A^c_{xx}(τ) = {0} for |τ| > q.
The CS process in (17.3) is just one example of signals involving products and sums of stationary processes such as s(n) with (almost) periodic deterministic sequences d(n), or, CS processes x(n). For such signals, the following properties are useful:

Property 1 Finite sums and products of ACS signals are ACS. If x_i(n) is CS with period P_i, then for λ_i constants, y_1(n) := Σ_{i=1}^{I_1} λ_i x_i(n) and y_2(n) := Π_{i=1}^{I_2} λ_i x_i(n) are also CS. Unless cycle cancellations occur among the x_i(n) components, the period of y_1(n) and y_2(n) equals the least common multiple of the P_i's. Similarly, finite sums and products of stationary processes with deterministic (almost) periodic signals are also ACS processes.
As examples of random-deterministic mixtures, consider
$$
x_1(n) = s(n) + d(n) \quad \text{and} \quad x_2(n) = s(n)\,d(n) ,
\qquad (17.7)
$$
where s(n) is zero-mean, stationary, and d(n) is deterministic (almost) periodic with Fourier Series coefficients D(α). Time-varying correlations are, respectively,
$$
c_{x_1x_1}(n;\tau) = c_{ss}(\tau) + d(n)d(n+\tau) \quad \text{and} \quad
c_{x_2x_2}(n;\tau) = c_{ss}(\tau)\,d(n)d(n+\tau) .
\qquad (17.8)
$$
Both are (almost) periodic in n, with cyclic correlations
$$
C_{x_1x_1}(\alpha;\tau) = c_{ss}(\tau)\,\delta(\alpha) + D_2(\alpha;\tau) \quad \text{and} \quad
C_{x_2x_2}(\alpha;\tau) = c_{ss}(\tau)\,D_2(\alpha;\tau) ,
\qquad (17.9)
$$
where D_2(α; τ) = Σ_β D(β)D(α − β) exp[j(α − β)τ], since the Fourier Series coefficients of the product d(n)d(n + τ) are given by the convolution of each component's coefficients in the α-domain. To reiterate the dependence on τ, notice that if d(n) is a periodic ±1 sequence, then c_{x_2x_2}(n; 0) = c_ss(0)d²(n) = c_ss(0), and hence periodicity disappears at τ = 0.
ACS signals appear often in nature with the underlying periodicity hidden, unknown, or inaccessible. In contrast, CS signals are often man-made and arise as a result of, e.g., oversampling (by a known integer factor P) digital communication signals, or by sampling a spatial waveform with P antennas (see also Section 17.4).
Both CS and ACS definitions could also be given in terms of the Fourier Transforms (τ → ω) of c_xx(n; τ) and C_xx(α; τ), namely the time-varying and the cyclic spectra, which we denote by S_xx(n; ω) and S_xx(α; ω). Suppose c_xx(n; τ) and C_xx(α; τ) are absolutely summable w.r.t. τ for all n in Z and α_k in A^c_{xx}(τ). We can then define and relate time-varying and cyclic spectra as follows:
$$
S_{xx}(n;\omega) := \sum_{\tau=-\infty}^{\infty} c_{xx}(n;\tau)\, e^{-j\omega\tau}
= \sum_{\alpha_k \in A^s_{xx}} S_{xx}(\alpha_k;\omega)\, e^{j\alpha_k n}
\qquad (17.10)
$$
$$
S_{xx}(\alpha_k;\omega) := \sum_{\tau=-\infty}^{\infty} C_{xx}(\alpha_k;\tau)\, e^{-j\omega\tau}
= \lim_{N\to\infty} \frac{1}{N}\sum_{n=0}^{N-1} S_{xx}(n;\omega)\, e^{-j\alpha_k n} .
\qquad (17.11)
$$
Absolute summability w.r.t. τ implies vanishing memory as the lag separation increases, and many real-life signals satisfy these so-called mixing conditions [5, Ch. 2]. Power signals are not absolutely summable, but it is possible to define cyclic spectra equivalently [for real-valued x(n)] as
$$
S_{xx}(\alpha_k;\omega) := \lim_{N\to\infty} \frac{1}{N}\, E\{X_N(\omega)\,X_N(\alpha_k-\omega)\} ,
\qquad X_N(\omega) := \sum_{n=0}^{N-1} x(n)\, e^{-j\omega n} .
\qquad (17.12)
$$
If x(n) is complex ACS, then one also needs S̄_xx(α_k; ω) := lim_{N→∞} N^{−1} E{X*_N(−ω) X_N(α_k − ω)}. Both S_xx and S̄_xx reveal the presence of spectral correlation. This must be contrasted to stationary processes, whose spectral components X_N(ω_1), X_N(ω_2) are known to be asymptotically uncorrelated unless |ω_1 ± ω_2| = 0 (mod 2π) [5, Ch. 4]. Specifically, we have from (17.12) that:
Property 2 If x(n) is ACS or CS, the N-point Fourier transform X_N(ω_1) is correlated with X_N(ω_2) for |ω_1 ± ω_2| = α_k (mod 2π), and α_k ∈ A^s_{xx}.
Before dwelling further on the spectral characterization of ACS processes, it is useful to note the diversity of tools available for processing. Stationary signals are analyzed with time-invariant correlations (lag-domain analysis), or with power spectral densities (frequency-domain analysis). However, CS, ACS, and generally nonstationary signals entail four variables: (n, τ, α, ω) := (time, lag, cycle, frequency). Grouping two variables at a time, four domains of analysis become available and their relationship is summarized in Fig. 17.1. Note that the pairs (n; τ) ↔ (α; τ), or, (n; ω) ↔ (α; ω), have τ or ω fixed and are Fourier Series pairs; whereas (n; τ) ↔ (n; ω), or, (α; τ) ↔ (α; ω), have n or α fixed and are related by Fourier Transforms.

FIGURE 17.1: Four domains for analyzing cyclostationary signals.

Further insight on the links between stationary and cyclostationary processes is gained through the uniform shift (or phase) randomization concept. Let x(n) be CS with period P, and define y(n) := x(n + θ), where θ is uniformly distributed in [0, P) and independent of x(n). With c_yy(n; τ) := E_θ{E_x[x(n + θ)x(n + τ + θ)]}, we find:
$$
c_{yy}(n;\tau) = \frac{1}{P}\sum_{p=0}^{P-1} c_{xx}(p;\tau) := C_{xx}(0;\tau) := c_{yy}(\tau) ,
\qquad (17.13)
$$
where the first equality follows because θ is uniform and the second uses the CS definition in (17.1). Noting that c_yy is not a function of n, we have established (see also [15, 38]):

Property 3 ACS process x(n) can be mapped to a stationary process y(n) using a shift θ, uniformly distributed over its period, and the transformation y(n) := x(n + θ).
Such a mapping is often used with harmonic signals; e.g., x(n) = A exp[j(2πn/P + θ)] + v(n) is, according to Property 2, a CS signal, but it can be stationarized by uniform phase randomization. An alternative trick for stationarizing signals which involve complex harmonics is conjugation. Indeed, c_{xx*}(n; τ) = A² exp(−j2πτ/P) + c_vv(τ) is not a function of n. But why deal with CS or ACS processes if conjugation or phase randomization can render them stationary?

Revisiting Case 2 of Example 17.1 offers a partial answer when the goal is to estimate the frequency ω_0. Phase randomization of x(n) in (17.3) leads to a stationary y(n) with correlation found by substituting α = 0 in (17.6). This leads to c_yy(τ) = (1/2)c_ss(τ)cos(ω_0 τ) + c_vv(τ), and shows that if s(n) has multiple spectral peaks, or if s(n) is broadband, then multiple peaks or smearing of the spectral peak hamper estimation of ω_0 (in fact, it is impossible to estimate ω_0 from the spectrum of y(n) if s(n) is white). In contrast, picking the peak of C_xx(α; τ) in (17.6) yields ω_0, provided that ω_0 ∈ (0, π) so that spectral folding is prevented [33]. Equation (17.13) provides a more general answer. Phase randomization restricts a CS process only to one cycle, namely α = 0. In other words, the cyclic correlation C_xx(α; τ) contains the "stationarized correlation" C_xx(0; τ) and additional information in cycles α ≠ 0.
Since CS and ACS processes form a superset of stationary ones, it is useful to know how a stationary process can be viewed as a CS process. Note that if x(n) is stationary, then c_xx(n; τ) = c_xx(τ), and on using (17.2) and (17.5) we find:
$$
C_{xx}(\alpha;\tau) = c_{xx}(\tau)\left[\lim_{N\to\infty}\frac{1}{N}\sum_{n=0}^{N-1} e^{-j\alpha n}\right] = c_{xx}(\tau)\,\delta(\alpha) .
\qquad (17.14)
$$
Intuitively, (17.14) is justified if we think that stationarity reflects "zero time-variation" in the correlation c_xx(τ). Formally, (17.14) implies:

Property 4 Stationary processes can be viewed as ACS or CS with cyclic correlation C_xx(α; τ) = c_xx(τ)δ(α).
Separation of information bearing ACS signals from stationary ones (e.g., noise) is desired in many
applications and can be achieved based on Property 4 by excluding the cycle α = 0.
Next, it is of interest to view CS signals as special cases of general nonstationary processes with 2-D correlation r_xx(n_1, n_2) := E{x(n_1)x(n_2)} and 2-D spectral density S_xx(ω_1, ω_2) := FT[r_xx(n_1, n_2)], which are assumed to exist.² Two questions arise: What are the implications of periodicity in the (ω_1, ω_2) plane? And how do the cyclic spectra in (17.10) through (17.12) relate to S_xx(ω_1, ω_2)? The answers are summarized in Fig. 17.2, which illustrates that the support of CS processes in the (ω_1, ω_2) plane consists of 2P − 1 parallel lines (with unity slope) intersecting the axes at equidistant points spaced 2π/P apart from each other. More specifically, we have [34]:
²Nonstationary processes with Fourier transformable 2-D correlations are called harmonizable processes.

FIGURE 17.2: Support of the 2-D spectrum S_xx(ω_1, ω_2) for CS processes.
Property 5 A CS process with period P is a special case of a nonstationary (harmonizable) process with 2-D spectral density given by
$$
S_{xx}(\omega_1,\omega_2) = \sum_{k=-(P-1)}^{P-1} S_{xx}\!\left(\frac{2\pi}{P}k;\omega_1\right)\delta_D\!\left(\omega_2-\omega_1+\frac{2\pi}{P}k\right) ,
\qquad (17.15)
$$
where δ_D denotes the Dirac delta.
For stationary processes, only the k = 0 term survives in (17.15) and we obtain S_xx(ω_1, ω_2) = S_xx(0; ω_1)δ_D(ω_2 − ω_1); i.e., the spectral mass is concentrated on the diagonal of Fig. 17.2. The well-structured spectral support of CS processes will be used to test for the presence of cyclostationarity and estimate the period P. Furthermore, the superposition of lines parallel to the diagonal hints towards representing CS processes as a superposition of stationary processes. Next we will examine two such representations introduced by Gladysev [34] (see also [22, 38, 49], and [56]).
We can uniquely write n_0 = nP + i and express x(n_0) = x(nP + i), where the remainder i takes values 0, 1, ..., P − 1. For each i, define the subprocess x_i(n) := x(nP + i). In multirate processing, the P × 1 vector x(n) := [x_0(n) ... x_{P−1}(n)]' constitutes the so-called polyphase decomposition of x(n) [51, Ch. 12]. As shown in Fig. 17.3, each x_i(n) is formed by downsampling an advanced copy of x(n). On the other hand, combining upsampled and delayed x_i(n)'s, we can synthesize the CS process as:
$$
x(n) = \sum_{i=0}^{P-1}\sum_{l} x_i(l)\,\delta(n-i-lP) .
\qquad (17.16)
$$
We maintain that the subprocesses {x_i(n)}_{i=0}^{P−1} are (jointly) stationary, and thus x(n) is vector stationary. Suppose for simplicity that E{x(n)} = 0, and start with E{x_{i_1}(n)x_{i_2}(n + τ)} = E{x(nP + i_1)x(nP + τP + i_2)} := c_xx(i_1 + nP; i_2 − i_1 + τP). Because x(n) is CS, we can drop nP and c_xx becomes independent of n, establishing that x_{i_1}(n), x_{i_2}(n) are (jointly) stationary with correlation:
$$
c_{x_{i_1}x_{i_2}}(\tau) = c_{xx}(i_1;\, i_2-i_1+\tau P) , \qquad i_1, i_2 \in [0, P-1] .
\qquad (17.17)
$$

FIGURE 17.3: Representation 1: (a) analysis, (b) synthesis.
Using (17.17), it can be shown that the auto- and cross-spectra of x_{i_1}(n), x_{i_2}(n) can be expressed in terms of the cyclic spectra of x(n) as [56],
$$
S_{x_{i_1}x_{i_2}}(\omega) = \frac{1}{P}\sum_{k_1=0}^{P-1}\sum_{k_2=0}^{P-1}
S_{xx}\!\left(\frac{2\pi}{P}k_1;\, \frac{\omega-2\pi k_2}{P}\right)
e^{\,j\left[\frac{\omega-2\pi k_2}{P}(i_2-i_1) + \frac{2\pi}{P}k_1 i_1\right]} .
\qquad (17.18)
$$
To invert (17.18), we Fourier transform (17.16) and use (17.12) to obtain [for x(n) real]
$$
S_{xx}\!\left(\frac{2\pi}{P}k;\omega\right) = \sum_{i_1=0}^{P-1}\sum_{i_2=0}^{P-1}
S_{x_{i_1}x_{i_2}}(\omega)\, e^{j\omega(i_2-i_1)}\, e^{-j\frac{2\pi}{P}k i_2} .
\qquad (17.19)
$$
Based on (17.16) through (17.19), we infer that cyclostationary signals with period P can be analyzed as stationary P × 1 multichannel processes and vice versa. In summary, we have:

Representation 1 (Decimated Components) CS process x(n) can be represented as a P-variate stationary multichannel process x(n) with components x_i(n) = x(nP + i), i = 0, 1, ..., P − 1. Cyclic spectra and stationary auto- and cross-spectra are related as in (17.18) and (17.19).
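For a record whose length is a multiple of P, Representation 1 is just an array reshape; the sketch below (the helper name polyphase and the test signal are ours) extracts the decimated components x_i(n) = x(nP + i) and checks that each is stationary with its own variance c_xx(i; 0).

```python
import numpy as np

def polyphase(x, P):
    """Representation 1: decimated components x_i(n) = x(nP + i), returned as rows of a (P, N//P) array."""
    x = np.asarray(x)
    L = (len(x) // P) * P                  # trim so the length is a multiple of P
    return x[:L].reshape(-1, P).T

# Periodically amplitude-modulated white noise (our test signal), period P
rng = np.random.default_rng(1)
P, N = 8, 80000
n = np.arange(N)
x = (1 + 0.8 * np.cos(2 * np.pi * n / P)) * rng.standard_normal(N)
X = polyphase(x, P)                        # a P-variate stationary multichannel series
print(np.round(np.var(X, axis=1), 2))      # ~ (1 + 0.8*cos(2*pi*i/P))**2 = c_xx(i; 0)
```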
An alternative means of decomposing a CS process into stationary components is by splitting the (−π, π] spectral support of X_N(ω) into bands, each of width 2π/P [22]. As shown in Fig. 17.4, this can be accomplished by passing modulated copies of x(n) through an ideal low-pass filter H_0(ω) with spectral support (−π/P, π/P]. The resulting subprocesses x̄_m(n) can be shifted up in frequency and recombined to synthesize the CS process as: x(n) = Σ_{m=0}^{P−1} x̄_m(n) exp(−j2πmn/P). Within each band, frequencies are separated by less than 2π/P and, according to Property 2, there is no correlation between the spectral components X̄_{m,N}(ω_1) and X̄_{m,N}(ω_2); hence, the x̄_m(n) components are stationary with auto- and cross-spectra having nonzero support over −π/P < ω < π/P. They are related with the cyclic spectra as follows:
$$
S_{\bar{x}_{m_1}\bar{x}_{m_2}}(\omega) = S_{xx}\!\left(\frac{2\pi}{P}(m_1-m_2);\, \omega + \frac{2\pi}{P}m_1\right) ,
\qquad |\omega| < \frac{\pi}{P} .
\qquad (17.20)
$$
Equation (17.20) suggests that cyclostationary signal analysis is linked with stationary subband processing.

Representation 2 (Subband Components) CS process x(n) can be represented as a superposition of P stationary narrowband subprocesses according to: x(n) = Σ_{m=0}^{P−1} x̄_m(n) exp(−j2πmn/P). Auto- and cross-spectra of x̄_m(n) can be found from the cyclic spectra of x(n) as in (17.20).

FIGURE 17.4: Representation 2: (a) analysis, (b) synthesis.
Because ideal low-pass filters cannot be designed, the subband decomposition seems less practical.
However, using Representation 1 and exploiting results from uniform DFT filter banks, it is possible
using FIR low-pass filters to obtain stationary subband components (see e.g., [51, Ch. 12]). We will
not pursue this approach further, but Representation 1 will be used next for estimating time-varying
correlations of CS processes based on a single data record.
17.3 Estimation, Time-Frequency Links, Testing
The time-varying and cyclic quantities introduced in (17.1), (17.2), and (17.10) through (17.12),
entail ideal expectations (i.e., ensemble averages) and unless reliable estimators can be devised from
finite (and often noisy) data records, their usefulness in practice is questionable. For stationary
processes with (at least asymptotically) vanishing memory,³ sample correlations and spectral density
estimators converge to their ensembles as the record length N →∞. Constructing reliable (i.e.,
consistent) estimators for nonstationary processes, however, is challenging and generally impossible.
Indeed, capturing time-variations calls for short observation windows, whereas variance reduction
demands long records for sample averages to converge to their ensembles.
Fortunately, ACS and CS signals belong to the class of processes with "well-structured" time-variations that, under suitable mixing conditions, allow consistent single record estimators. The key is to note that although c_xx(n; τ) and S_xx(n; ω) are time-varying, they are expressed in terms of cyclic quantities, C_xx(α_k; τ) and S_xx(α_k; ω), which are time-invariant. Indeed, in (17.2) and (17.10) the time-variation is assigned to the Fourier basis.
³Well-separated samples of such processes are asymptotically independent. Sufficient (so-called mixing) conditions include absolute summability of cumulants and are satisfied by many real-life signals (see [5, 12, Ch. 2]).
17.3.1 Estimating Cyclic Statistics
First we will consider ACS processes with known cycles α_k. Simpler estimators for CS processes and cycle estimation methods will be discussed later in the section. If x(n) has nonzero mean, we estimate the cyclic mean as in Example 17.1 using the normalized DFT: Ĉ_x(α_k) = N^{−1} Σ_{n=0}^{N−1} x(n) exp(−jα_k n). If the set of cycles is finite, we estimate the time-varying mean as: ĉ_x(n) = Σ_{α_k} Ĉ_x(α_k) exp(jα_k n).
Similarly, for zero-mean ACS processes we estimate first cyclic and then time-varying correlations using:
$$
\hat{C}_{xx}(\alpha_k;\tau) = \frac{1}{N}\sum_{n=0}^{N-1} x(n)x(n+\tau)\, e^{-j\alpha_k n} ,
\qquad
\hat{c}_{xx}(n;\tau) = \sum_{\alpha_k \in A^c_{xx}(\tau)} \hat{C}_{xx}(\alpha_k;\tau)\, e^{j\alpha_k n} .
\qquad (17.21)
$$
Note that Ĉ_xx can be computed efficiently using the FFT of the product x(n)x(n + τ).
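A minimal sketch of this FFT route for (17.21) (function and variable names are ours): for a fixed lag τ ≥ 0, the sample cyclic correlation on the grid α_k = 2πk/M is the normalized FFT of the lag product.

```python
import numpy as np

def cyclic_correlation(x, tau):
    """Sample cyclic correlation C_hat(alpha_k; tau) of (17.21), tau >= 0 assumed,
    computed as the normalized FFT of x(n)*x(n+tau)."""
    x = np.asarray(x, dtype=float)
    M = len(x) - tau                           # number of available lag products
    prod = x[:M] * x[tau:]                     # x(n) * x(n + tau), n = 0..M-1
    return np.fft.fft(prod) / M, 2 * np.pi * np.fft.fftfreq(M)

# Harmonic in multiplicative noise (Example 17.1, Case 2): cycles at 0 and +/-2*w0
rng = np.random.default_rng(2)
N, w0 = 8192, 2 * np.pi * 0.07
n = np.arange(N)
x = rng.standard_normal(N) * np.cos(w0 * n) + 0.2 * rng.standard_normal(N)
C, alpha = cyclic_correlation(x, tau=0)
top3 = np.abs(alpha[np.argsort(np.abs(C))[-3:]])
print(np.round(np.sort(top3), 3))              # ~ [0, 2*w0, 2*w0] with 2*w0 ~ 0.88
```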
For cyclic spectral estimation, two options are available: (1) smoothed cyclic periodograms and (2) smoothed cyclic correlograms. The first is motivated by (17.12) and smooths the cyclic periodogram, I_xx(α; ω) := N^{−1} X_N(ω)X_N(α − ω), using a frequency-domain window W(ω). The second follows (17.2) and Fourier transforms Ĉ_xx(α; τ) after smoothing it by a lag-window w(τ) with support τ ∈ [−M, M]. Either one of the resulting estimates:
$$
\hat{S}^{(i)}_{xx}(\alpha;\omega) = \frac{1}{N}\sum_{n=0}^{N-1} W\!\left(\omega - \frac{2\pi}{N}n\right) I_{xx}\!\left(\alpha;\frac{2\pi}{N}n\right) ,
\qquad
\hat{S}^{(ii)}_{xx}(\alpha;\omega) = \sum_{\tau=-M}^{M} w(\tau)\,\hat{C}_{xx}(\alpha;\tau)\, e^{-j\omega\tau} ,
\qquad (17.22)
$$
can be used to obtain time-varying spectral estimates; e.g., using Ŝ^{(i)}_xx(α; ω), we estimate S_xx(n; ω) as:
$$
\hat{S}^{(i)}_{xx}(n;\omega) = \sum_{\alpha_k \in A^s_{xx}} \hat{S}^{(i)}_{xx}(\alpha_k;\omega)\, e^{j\alpha_k n} .
\qquad (17.23)
$$
Estimates (17.21) through (17.23) apply to ACS (and hence CS) processes with a finite number of
known cycles, and rely on the following steps: (1) estimate the time-invariant (or “stationary”)
quantities by dropping limits and expectations from the corresponding cyclic definitions, and (2) use
the cyclic estimates to obtain time-varying estimates relying on the Fourier synthesis Eqs. (17.2)
and (17.10). Selection of the windows in (17.22), variance expressions, consistency, and asymptotic
normality of the estimators in (17.21) through (17.23) under mixing conditions can be found in [11,
12, 24, 39] and references therein.
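As one concrete instance, a sketch of option (ii) in (17.22) for a single cycle α is given below; the Bartlett lag window and all names are our choices, and for real x(n) the negative lags are filled in via C_hat(α; −τ) = e^{−jατ} C_hat(α; τ).

```python
import numpy as np

def cyclic_spectrum_corr(x, alpha, M, n_freq=512):
    """Smoothed cyclic correlogram, option (ii) of (17.22), at one cycle alpha:
    S_hat(alpha; w) = sum_{|tau|<=M} w(tau) * C_hat(alpha; tau) * exp(-j*w*tau),
    with a Bartlett lag window w(tau) = 1 - |tau|/(M+1) (our choice)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    n = np.arange(N)
    # C_hat(alpha; tau) of (17.21) for tau = 0..M; for real x(n),
    # C_hat(alpha; -tau) = exp(-j*alpha*tau) * C_hat(alpha; tau)
    Cpos = np.array([np.sum(x[:N - t] * x[t:] * np.exp(-1j * alpha * n[:N - t])) / N
                     for t in range(M + 1)])
    taus = np.arange(-M, M + 1)
    C = np.concatenate([np.exp(-1j * alpha * np.arange(M, 0, -1)) * Cpos[M:0:-1], Cpos])
    win = 1.0 - np.abs(taus) / (M + 1.0)
    omega = np.linspace(-np.pi, np.pi, n_freq, endpoint=False)
    return omega, np.exp(-1j * np.outer(omega, taus)) @ (win * C)
```

At α = 0 this reduces to an ordinary Blackman-Tukey spectral estimate, while at a cycle α_k ≠ 0 it estimates the cyclic spectrum S_xx(α_k; ω).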
When x(n) is CS with known integer period P, estimation of time-varying correlations and spectra becomes easier. Recall that thanks to Representations 1 and 2, not only c_xx(n; τ) and S_xx(n; ω), but the process x(n) itself can be analyzed into P stationary components. Starting with (17.16), it can be shown that c_xx(i; τ) = c_{x_i x_{i+τ}}(0), where i = 0, 1, ..., P − 1 and the subscript i + τ is understood mod(P). Because the subprocesses x_i(n) and x_{i+τ}(n) are stationary, their cross-covariances can be estimated consistently using sample averaging; hence, the time-varying correlation can be estimated as:
$$
\hat{c}_{xx}(i;\tau) = \hat{c}_{x_i x_{i+\tau}}(0) = \frac{1}{[N/P]}\sum_{n=0}^{[N/P]-1} x(nP+i)\,x(nP+i+\tau) ,
\qquad (17.24)
$$
where the integer part [N/P] denotes the number of samples per subprocess x_i(n), and the last equality follows from the definition of x_i(n) in Representation 1. Similarly, the time-varying periodogram can be estimated using: I_xx(n; ω) = P^{−1} Σ_{k=0}^{P−1} X_P(ω)X_P(2πk/P − ω) exp(−j2πkn/P), and then smoothed to obtain a consistent estimate of S_xx(n; ω).
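A compact sketch of (17.24) (names and the test signal are ours): for a known period P and τ ≥ 0, the estimate is an average of lag products taken one period apart.

```python
import numpy as np

def tv_correlation(x, P, tau):
    """c_hat(i; tau) of Eq. (17.24) for i = 0..P-1, averaging x(nP+i)*x(nP+i+tau) over n (tau >= 0)."""
    x = np.asarray(x, dtype=float)
    K = (len(x) - tau) // P                    # usable periods
    idx = np.arange(K)[:, None] * P + np.arange(P)
    return np.mean(x[idx] * x[idx + tau], axis=0)

# Periodically modulated white noise: c_xx(i; 0) should follow (1 + 0.9*cos(2*pi*i/P))**2
rng = np.random.default_rng(3)
P, N = 6, 60000
t = np.arange(N)
x = (1 + 0.9 * np.cos(2 * np.pi * t / P)) * rng.standard_normal(N)
print(np.round(tv_correlation(x, P, tau=0), 2))
print(np.round((1 + 0.9 * np.cos(2 * np.pi * np.arange(P) / P)) ** 2, 2))
```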
17.3.2 Links with Time-Frequency Representations
Consistency (and hence reliability) of single record estimates is a notable difference between cyclostationary and time-frequency signal analyses. Short-time Fourier transforms, the Wigner-Ville, and derivative representations are valuable exploratory (and especially graphical) tools for analyzing nonstationary signals. They promise applicability on general nonstationarities, but unless slow variations are present and multiple independent data records are available, their usefulness in estimation tasks is rather limited. In contrast, ACS analysis deals with a specific type of structured variation, namely (almost) periodicity, but allows for rapid variations and consistent single record sample estimates. Intuitively speaking, cyclostationarity provides, within a single record, multiple periods that can be viewed as "multiple realizations." Interestingly, for ACS processes there is a close relationship between the normalized asymmetric ambiguity function A(α; τ) [37] and the sample cyclic correlation in (17.21):
$$
N\,\hat{C}_{xx}(\alpha;\tau) = A(\alpha;\tau) := \sum_{n=0}^{N-1} x(n)x(n+\tau)\, e^{-j\alpha n} .
\qquad (17.25)
$$
Similarly, one may associate the Wigner-Ville with the time-varying periodogram I_xx(n; ω) = Σ_{τ=−(N−1)}^{N−1} x(n)x(n + τ) exp(−jωτ). In fact, the aforementioned equivalences and the consistency results of [12] establish that ambiguity and Wigner-Ville processing of ACS signals is reliable even when only a single data record is available. The following example uses a chirp signal to stress this point and shows how some of our sample estimates can be extended to complex processes.
EXAMPLE 17.2: Chirp in multiplicative and additive noise

Consider x(n) = s(n) exp(jω_0 n²) + v(n), where s(n), v(n) are zero mean, stationary, and mutually independent; c_xx(n; τ) is nonperiodic for almost every ω_0, and hence x(n) is not (second-order) ACS. Even when E{s(n)} ≠ 0, E{x(n)} is also nonperiodic, implying that x(n) is not first-order ACS either. However,
$$
\tilde{c}_{xx^*}(n;\tau) := c_{xx^*}(n+\tau;-2\tau) := E\{x(n+\tau)\,x^*(n-\tau)\}
= c_{ss}(2\tau)\exp(j4\omega_0\tau n) + c_{vv^*}(2\tau) ,
\qquad (17.26)
$$
exhibits (almost) periodicity and its cyclic correlation is given by C̃_{xx*}(α; τ) = c_ss(2τ)δ(α − 4ω_0τ) + c_vv*(2τ)δ(α). Assuming c_ss(2τ) ≠ 0, the latter allows evaluation of ω_0 by picking the peak of the sample cyclic correlation magnitude evaluated at, e.g., τ = 1, as follows:
$$
\hat{\omega}_0 = \frac{1}{4}\,\arg\max_{\alpha\neq 0} \big|\hat{\tilde{C}}_{xx^*}(\alpha;1)\big| ,
\qquad
\hat{\tilde{C}}_{xx^*}(\alpha;\tau) = \frac{1}{N}\sum_{n=0}^{N-1} x(n+\tau)\,x^*(n-\tau)\, e^{-j\alpha n} .
\qquad (17.27)
$$
The sample estimate in (17.27) is nothing but the symmetric ambiguity function. Because x(n) is ACS, this estimate can be shown to be consistent. This provides yet one more reason for the success of time-frequency representations with chirp signals. Interestingly, (17.27) shows that exploitation of cyclostationarity allows not only for additive noise tolerance [by avoiding the α = 0 cycle in (17.27)], but also permits parameter estimation of chirps modulated by stationary multiplicative noise s(n).
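The following sketch illustrates (17.26) and (17.27) (the chirp rate, noise levels, and the MA coloring of s(n) are our illustrative choices; s(n) must be correlated at lag 2 for the τ = 1 estimator to work): the symmetric lag product x(n+1)x*(n−1) is a complex exponential in n at frequency 4ω_0, so an FFT peak away from α = 0 recovers ω_0.

```python
import numpy as np

# Chirp in multiplicative and additive noise (Example 17.2); parameters are illustrative
rng = np.random.default_rng(4)
N, w0 = 8192, 0.05                                # chirp rate; need 0 < 4*w0 < pi
n = np.arange(N)
s = np.convolve(rng.standard_normal(N), [1.0, 0.8, 0.6], mode="same")   # colored, so c_ss(2) != 0
v = 0.2 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
x = s * np.exp(1j * w0 * n**2) + v

y = x[2:] * np.conj(x[:-2])                       # x(n+1) x*(n-1): oscillates at 4*w0, cf. (17.26)
M = len(y)
Ct = np.fft.fft(y) / M                            # sample cyclic correlation of (17.27) at tau = 1
alpha = 2 * np.pi * np.fft.fftfreq(M)
mask = np.abs(alpha) > 1e-2                       # avoid the alpha = 0 cycle
w0_hat = alpha[mask][np.argmax(np.abs(Ct[mask]))] / 4.0
print(w0, np.round(w0_hat, 4))                    # w0_hat close to 0.05
```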
17.3.3 Testing for Cyclostationarity
In certain applications involving man-made (e.g., communication) signals, presence of cyclostationarity and knowledge of the cycles is assured by design (e.g., baud rates or oversampling factors). In other cases, however, only a time series {x(n)}_{n=0}^{N−1} is given and two questions arise: How does one detect cyclostationarity, and if x(n) is confirmed to be CS of a certain order, how does one estimate the cycles present? The former is addressed by testing hypotheses of nonzero Ĉ_x(α_k), Ĉ_xx(α_k; τ), or Ŝ_xx(α_k; ω) over a fine cycle-frequency grid obtained by sufficient zero-padding prior to taking the FFT.
Specifically, to test whether x(n) exhibits cyclostationarity in {Ĉ_xx(α; τ_l)}_{l=1}^{L} for at least one lag, we form the 2L × 1 vector ĉ_xx(α) := [Ĉ^R_xx(α; τ_1) ... Ĉ^R_xx(α; τ_L); Ĉ^I_xx(α; τ_1) ... Ĉ^I_xx(α; τ_L)]', where superscript R (I) denotes real (imaginary) part. Similarly, we define the ensemble vector c_xx(α) and the error e_xx(α) := ĉ_xx(α) − c_xx(α). For N large, it is known that √N e_xx(α) is Gaussian with pdf N(0, Σ_c). An estimate Σ̂_c of the asymptotic covariance can be computed from the data [12]. If α is not a cycle for all {τ_l}_{l=1}^{L}, then c_xx(α) ≡ 0, e_xx(α) = ĉ_xx(α) will have zero mean, and D̂_{c_xx}(α) := ĉ'_xx(α) Σ̂_c^{−1}(α) ĉ_xx(α) will be central chi-square. For a given false-alarm rate, we find from χ² tables a threshold Γ and test [10]
$$
H_0 : \hat{D}_{c_{xx}}(\alpha) \geq \Gamma \;\Rightarrow\; \alpha \in A^c_{xx}
\qquad \text{vs.} \qquad
H_1 : \hat{D}_{c_{xx}}(\alpha) < \Gamma \;\Rightarrow\; \alpha \notin A^c_{xx} .
\qquad (17.28)
$$
Alternate 2-D contour plots revealing the presence of spectral correlation rely on (17.15) and more specifically on its normalized version (coherence or correlation coefficient), estimated as [40]
$$
\rho_{xx}(\omega_1,\omega_2) :=
\frac{\Big|\frac{1}{M}\sum_{m=0}^{M-1} X_N\!\big(\omega_1+\tfrac{2\pi m}{M}\big)\, X^*_N\!\big(\omega_2+\tfrac{2\pi m}{M}\big)\Big|^2}
{\frac{1}{M}\sum_{m=0}^{M-1}\big|X_N\!\big(\omega_1+\tfrac{2\pi m}{M}\big)\big|^2 \;\cdot\;
\frac{1}{M}\sum_{m=0}^{M-1}\big|X_N\!\big(\omega_2+\tfrac{2\pi m}{M}\big)\big|^2} .
\qquad (17.29)
$$
Plots of ρ_xx(ω_1, ω_2) with the empirical thresholds discussed in [40] are valuable tools not only for cycle detection and estimation of CS signals, but even for general nonstationary processes exhibiting partial (e.g., "transient" lag- or frequency-dependent) cyclostationarity.
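Below is a sketch of a coherence map in the spirit of (17.29), read along one diagonal offset at a time (cycle α = 2πd/N); here the cross-periodogram is smoothed over M adjacent Fourier frequencies, which is our implementation choice of smoothing grid, and the test signal and all names are ours.

```python
import numpy as np

def moving_sum(a, M):
    """Circular sum of M consecutive entries of a, starting at each index."""
    return np.convolve(np.concatenate([a, a[:M - 1]]), np.ones(M), mode="valid")

def coherence_line(x, d, M=64):
    """Sample coherence between X_N(w_j) and X_N(w_{j+d}) for all j, i.e., the map of (17.29)
    along the diagonal offset d (cycle alpha = 2*pi*d/N), with the cross-periodogram smoothed
    over M adjacent Fourier frequencies (our choice of smoothing grid)."""
    X = np.fft.fft(np.asarray(x, dtype=float))
    Xd = np.roll(X, -d)                                      # X_N(w_{j+d}), circularly indexed
    num = np.abs(moving_sum(X * np.conj(Xd), M)) ** 2
    den = moving_sum(np.abs(X) ** 2, M) * moving_sum(np.abs(Xd) ** 2, M)
    return num / den

# Periodically amplitude-modulated noise (our test signal): its strongest nonzero cycle is 2*pi/P
rng = np.random.default_rng(5)
N, P = 2048, 16
n = np.arange(N)
x = (1 + np.cos(2 * np.pi * n / P)) * rng.standard_normal(N)
for d in (N // P, 3 * N // (2 * P)):                         # a true cycle vs. a non-cycle offset
    print(d, np.round(coherence_line(x, d).max(), 2))        # markedly larger at the true cycle
```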
EXAMPLE 17.3: Cyclostationarity test

Consider x(n) = s_1(n) cos(πn/8) + s_2(n) cos(πn/4) + v(n), with s_1(n), s_2(n), and v(n) zero-mean, Gaussian, and mutually independent. To test for cyclostationarity and retrieve the possible periods present, N = 2,048 samples were generated; s_1(n) and s_2(n) were simulated as AR(1) with variances σ²_{s_1} = σ²_{s_2} = 2, while v(n) was white with variance σ²_v = 0.1. Figure 17.5a shows |Ĉ_xx(α; 0)| peaking at α = ±2(π/8), ±2(π/4), 0, as expected, while Fig. 17.5b depicts ρ_xx(ω_1, ω_2) computed as in (17.29) with M = 64. The parallel lines in Fig. 17.5b are seen at |ω_1 − ω_2| = 0, 2(π/8), and 2(π/4), revealing the periods present. One can easily verify from (17.11) that C_xx(α; 0) = (2π)^{−1} ∫_{−π}^{π} S_xx(α; ω) dω.
It also follows from (17.15) that S_xx(α; ω) = S_xx(ω_1 = ω, ω_2 = ω − α); thus, C_xx(α; 0) = (2π)^{−1} ∫_{−π}^{π} S_xx(ω, ω − α) dω, and for each α, we can view Fig. 17.5a as the (normalized) integral (or projection) of Fig. 17.5b along each parallel line [40]. Although |Ĉ_xx(α; 0)| is simpler to compute using the FFT of x²(n), ρ_xx(ω_1, ω_2) is generally more informative.
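For instance, |Ĉ_xx(α; 0)| for the signal of Example 17.3 can be reproduced (up to the random realization) as the magnitude of the normalized FFT of x²(n); the AR(1) pole below is our assumption, since the chapter does not list it.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 2048
n = np.arange(N)

def ar1(sigma2, a=0.8):
    """AR(1) process with variance sigma2; the pole location a is our assumption."""
    e = np.sqrt(sigma2 * (1 - a**2)) * rng.standard_normal(N)
    s = np.empty(N)
    s[0] = e[0] / np.sqrt(1 - a**2)
    for t in range(1, N):
        s[t] = a * s[t - 1] + e[t]
    return s

x = ar1(2.0) * np.cos(np.pi * n / 8) + ar1(2.0) * np.cos(np.pi * n / 4) \
    + np.sqrt(0.1) * rng.standard_normal(N)

C0 = np.fft.fft(x**2) / N                  # C_hat(alpha; 0) on the grid alpha = 2*pi*k/N
alpha = 2 * np.pi * np.fft.fftfreq(N)
k4, k2 = N // 8, N // 4                    # bins at alpha = pi/4 = 2*(pi/8) and pi/2 = 2*(pi/4)
print(np.round([abs(C0[k4]), abs(C0[k2]), np.median(np.abs(C0))], 2))
# the two cycle bins come out close to their theoretical value 0.5; the background is much smaller
```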
Because cyclostationarity is lag-dependent, as an alternative to ρ_xx(ω_1, ω_2) one can also plot |Ĉ_xx(α; τ)| or |Ŝ_xx(α; ω)| for all τ or ω. Figures 17.6 and 17.7 show perspective and contour plots