
V
Statistical Signal Processing
Georgios B. Giannakis
University of Virginia

12 Overview of Statistical Signal Processing  Charles W. Therrien
Discrete Random Signals • Linear Transformations • Representation of Signals as Random Vectors • Fundamentals of Estimation

13 Signal Detection and Classification  Alfred Hero
Introduction • Signal Detection • Signal Classification • The Linear Multivariate Gaussian Model • Temporal Signals in Gaussian Noise • Spatio-Temporal Signals • Signal Classification

14 Spectrum Estimation and Modeling  Petar M. Djurić and Steven M. Kay
Introduction • Important Notions and Definitions • The Problem of Power Spectrum Estimation • Nonparametric Spectrum Estimation • Parametric Spectrum Estimation • Recent Developments

15 Estimation Theory and Algorithms: From Gauss to Wiener to Kalman  Jerry M. Mendel
Introduction • Least-Squares Estimation • Properties of Estimators • Best Linear Unbiased Estimation • Maximum-Likelihood Estimation • Mean-Squared Estimation of Random Parameters • Maximum A Posteriori Estimation of Random Parameters • The Basic State-Variable Model • State Estimation for the Basic State-Variable Model • Digital Wiener Filtering • Linear Prediction in DSP, and Kalman Filtering • Iterated Least Squares • Extended Kalman Filter

16 Validation, Testing, and Noise Modeling  Jitendra K. Tugnait
Introduction • Gaussianity, Linearity, and Stationarity Tests • Order Selection, Model Validation, and Confidence Intervals • Noise Modeling • Concluding Remarks

17 Cyclostationary Signal Analysis  Georgios B. Giannakis
Introduction • Definitions, Properties, Representations • Estimation, Time-Frequency Links, Testing • CS Signals and CS-Inducing Operations • Application Areas • Concluding Remarks
STATISTICAL SIGNAL PROCESSING deals with random signals, their acquisition, their properties, their transformation by system operators, and their characterization in the time and frequency domains. The goal is to extract pertinent information about the underlying mechanisms that generate them or transform them. The area is grounded in the theories of signals and systems, random variables and stochastic processes, detection and estimation, and mathematical statistics. Random signals are temporal or spatial and can be derived from man-made (e.g., binary communication signals) or natural (e.g., thermal noise in a sensory array) sources. They can be
continuous or discrete in their amplitude or index, but no exact expression describes their evolution.
Signals are often described statistically when the engineer has incomplete knowledge about their
description or origin. In these cases, statistical descriptors are used to characterize one’s degree of
knowledge (or ignorance) about the randomness. Especially interesting are those signals (e.g., sta-
tionary and ergodic) that can be described using deterministic quantities computable from finite data
records. Applications of statistical signal processing algorithms to random signals are omnipresent
in science and engineering in such areas as speech, seismic, imaging, sonar, radar, sensor arrays,
communications, controls, manufacturing, atmospheric sciences, econometrics, and medicine, just
to name a few. This chapter deals with the fundamentals of statistical signal processing, including
some interesting topics that deviate from traditional assumptions. The focus is on discrete index
random signals (i.e., time series) with possibly continuous-valued amplitudes. The reason is twofold:
measurements are often made in discrete fashion (e.g., monthly temperature data); and continuously
recorded signals (e.g., speech data) are often sampled for parsimonious representation and efficient
processing by computers.
The first chapter of the section, written by Charles Therrien, reviews definitions, characterization,
and estimation problems entailing random signals. The important notions outlined are stationar-
ity, independence, ergodicity, and Gaussianity. The basic operations involve correlations, spectral
densities, and linear time-invariant transformations. Stationarity reflects invariance of a signal’s
statistical description with index shifts. Absence (or presence) of relationships among samples of a
signal at different points is conveyed by the notion of (in)dependence, which provides information
about the signal’s dynamical behavior and memory as it evolves in time or space. Ergodicity allows
computation of statistical descriptors from finite data records. In increasing order of computational
complexity, descriptors include the mean (or average) value of the signal, the autocorrelation, and
higher than second-order correlations which reflect relations among two or more signal samples.
Complete statistical characterization of random signals is provided by probability density and dis-
tribution functions. Gaussianity describes probabilistically a particular distribution of signal values
which is characterized completely by its first- and second-order statistics. It is often encountered in
practice because, thanks to the central limit theorem, averaging a sufficient number of random signal
values (an operation often performed by, e.g., narrowband filtering) yields outputs which are (at least
approximately) distributed according to the Gaussian probability law. Frequency-domain statistical
descriptors inherit all the merits of deterministic Fourier transforms and can be computed efficiently
using the fast Fourier transform. The standard tool here is the power spectral density which describes
how average power (or signal variance) is distributed across frequencies; but polyspectral densities
are also important for capturing distributions of higher-order signal moments across frequencies.
Random input signals passing through linear systems yield random outputs. Input-output auto-
and cross-correlations and spectra characterize not only the random signals themselves but also the
transformation induced by the underlying system.
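As a concrete illustration of computing such descriptors from a finite record, the short Python sketch below estimates the mean, autocorrelation, and power spectral density of a simulated stationary signal by time averaging; the moving-average signal model, record length, and all names in it are assumptions made for this example, not constructions taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# A finite record of a stationary, ergodic signal: white noise smoothed by a
# short moving average (averaging pushes values toward Gaussian, per the
# central limit theorem).
N = 4096
x = np.convolve(rng.standard_normal(N), np.ones(8) / 8, mode="same")

# Ergodicity at work: statistical descriptors from time averages of one record.
mean_hat = x.mean()

def sample_autocorr(x, max_lag):
    """Biased sample autocorrelation r[k] = (1/N) sum_n x[n] x[n+k]."""
    x0 = x - x.mean()
    return np.array([x0[: x0.size - k] @ x0[k:]
                     for k in range(max_lag + 1)]) / x0.size

r_hat = sample_autocorr(x, max_lag=32)

# Power spectral density estimate via the fast Fourier transform (periodogram).
freqs = np.fft.rfftfreq(x.size)
psd_hat = np.abs(np.fft.rfft(x)) ** 2 / x.size

print(f"mean ~ {mean_hat:.3f}, variance r[0] ~ {r_hat[0]:.3f}")
```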
Many random signals as well as systems with random inputs and outputs possess finite degrees
of freedom and can thus be modeled using finite parameters. Depending on a priori knowledge,
one estimates parameters from a given data record, treating them either as random or deterministic.
Various approaches become available by adopting different figures of merit (estimation criteria).
Those outlined in this chapter include the maximum likelihood, minimum variance, and least-
squares criteria for deterministic parameters. Random parameters are estimated using the maximum
a posteriori and Bayes criteria. Unbiasedness, consistency, and efficiency are important properties
of estimators which, together with performance bounds and computational complexity, guide the
engineer to select the proper criterion and estimation algorithm.
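A minimal sketch of the least-squares criterion for deterministic parameters follows, assuming a linear observation model; the regressors, noise level, and variable names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear observation model y = H @ theta + v, with theta deterministic
# but unknown (all dimensions and values assumed for the example).
N, p = 200, 2
theta_true = np.array([1.5, -0.7])
H = np.column_stack([np.ones(N), np.arange(N) / N])  # known regressors
y = H @ theta_true + 0.1 * rng.standard_normal(N)

# Least-squares estimate: minimize ||y - H theta||^2 (normal equations).
theta_ls, *_ = np.linalg.lstsq(H, y, rcond=None)

# For zero-mean white Gaussian noise this least-squares estimate is unbiased
# and coincides with the maximum-likelihood estimate.
print(theta_ls)
```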
While estimation algorithms seek values in the continuum of a parameter set, the need arises often
in signal processing to classify parameters or waveforms as one or another of prespecified classes.
Decision making with two classes is sought frequently in practice, including as a special case the
simpler problem of detecting the presence or absence of an information-bearing signal observed
in noise. Such signal detection and classification problems, along with the associated theory and
practice of hypothesis testing, are the subject of the second chapter, written by Alfred Hero. The
resulting strategies are designed to minimize the average number of decision errors. Additional
performance measures include receiver operating characteristics, signal-to-noise ratios, probabilities
of detection (or correct classification), false alarm (or misclassification) rates, and likelihood ratios.
Both temporal and spatio-temporal signals are considered, focusing on linear single- and multi-
variate Gaussian models. Trade-offs include complexity versus optimality, off-line versus real time
processing, and separate versus simultaneous detection and estimation for signal models containing
unknown parameters.
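The sketch below illustrates the simplest version of this detection problem, assuming a known waveform in white Gaussian noise, for which the likelihood ratio test reduces to a matched-filter statistic compared against a threshold; the particular signal, noise level, and threshold are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypotheses (all parameters assumed for illustration):
#   H0: y = noise          H1: y = s + noise,  noise white Gaussian
N = 64
s = np.sin(2 * np.pi * 0.1 * np.arange(N))   # known signal waveform
sigma = 1.0                                   # noise standard deviation

# For this model the likelihood ratio test reduces to comparing the
# matched-filter statistic T(y) = s^T y against a threshold.
threshold = 2.0 * sigma * np.sqrt(s @ s)      # ~2 standard deviations under H0

# Monte Carlo estimates of the false-alarm and detection probabilities.
trials = 10_000
t0 = (sigma * rng.standard_normal((trials, N))) @ s        # H0 statistics
t1 = (s + sigma * rng.standard_normal((trials, N))) @ s    # H1 statistics
print("P_FA ~", (t0 > threshold).mean(), " P_D ~", (t1 > threshold).mean())
```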
Parametric and nonparametric methods are described in the third chapter, written by Petar Djurić
and Steven Kay, for the basic problem of spectral estimation. Estimates of the power spectral density
have been used over the last century and continue to be of interest in numerous applications involving
retrieval of hidden periodicities, signal modeling, and time series analysis problems. Starting with the
periodogram (normalized square magnitude of the data Fourier transform), its modifications with
smoothing windows, and moving on to the more recent minimum variance and multiple window
approaches, the nonparametric methods described here constitute the first step used to characterize
the spectral content of stationary stochastic signals. Factors dictating the designer’s choice include
computational complexity, bias-variance, and resolution trade-offs. For data adequately described
by a parametric model, such as the auto-regressive (AR), moving-average (MA), or ARMA model,
spectral analysis reduces to estimating the model parameters. Such a data reduction step achieved by
modeling offers parsimony and increases resolution and accuracy, provided that the model and its
order (number of parameters) fit well the available time series. Processes containing harmonic tones
(frequencies) have line spectra, and the task of estimating frequencies appears in diverse applications
in science and engineering. The methods presented here include both the traditional periodogram
and modern subspace approaches such as MUSIC and its derivatives.
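As a hedged sketch of both families of methods, the code below computes a periodogram and an AR spectrum fitted by the Yule-Walker (autocorrelation) method on a simulated two-tone signal; the test signal and model order are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Test signal: two sinusoids (line spectrum) in white noise -- assumed example.
n = np.arange(1024)
x = (np.sin(2 * np.pi * 0.12 * n) + 0.5 * np.sin(2 * np.pi * 0.20 * n)
     + 0.5 * rng.standard_normal(n.size))

# Nonparametric: the periodogram, a normalized |FFT|^2 of the data.
f = np.fft.rfftfreq(x.size)
P_per = np.abs(np.fft.rfft(x)) ** 2 / x.size

# Parametric: AR(p) coefficients from the Yule-Walker equations.
p = 8
x0 = x - x.mean()
r = np.array([x0[: x0.size - k] @ x0[k:] for k in range(p + 1)]) / x0.size
R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz
a = np.linalg.solve(R, r[1 : p + 1])     # AR coefficients
sigma2 = r[0] - a @ r[1 : p + 1]         # driving-noise variance

# AR spectrum: sigma^2 / |1 - sum_k a_k exp(-j 2 pi f k)|^2
E = np.exp(-2j * np.pi * np.outer(f, np.arange(1, p + 1)))
P_ar = sigma2 / np.abs(1.0 - E @ a) ** 2

print("periodogram peak near f =", f[np.argmax(P_per)])
```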
Estimation from discrete-time observations is the theme of the next chapter, written by Jerry
Mendel. The unifying viewpoint treats both parameter and waveform (or signal) estimation from
the perspective of minimizing the averaged square error between observations and input-output
or state variable signal models. Starting from the traditional linear least-squares formulation, the
exposition includes weighted and recursive forms, their properties, and optimality conditions for
estimating deterministic parameters as well as their minimum mean-square error and maximum
a posteriori counterparts for estimating random parameters. Waveform estimation, on the other
hand, includes not only input-output signals but also state space vectors in linear and nonlinear
state variable models. Prediction, smoothing, and the celebrated Kalman filtering problems are
outlined in this framework and relationships are highlighted with the Wiener filtering formulation.
Nonlinear least-squares and iterative minimization schemes are discussed for problems where the
desired parameters are nonlinearly related to the data. Nonlinear equations can often be linearized,
and the extended Kalman filter is described briefly for estimating nonlinear state variable models.
Minimizing the mean-square error criterion leads to the basic orthogonality principle which appears
in both parameter and waveform estimation problems. Generally speaking, the mean-square error
criterion possesses rather universal optimality when the underlying models are linear and the random
data involved are Gaussian distributed.
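To ground the state-variable viewpoint, here is a minimal sketch of a Kalman filter for a scalar linear-Gaussian model, with the predict-correct recursion written out; the model coefficients and noise variances are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(4)

# Basic scalar state-variable model (coefficients assumed for illustration):
#   x[k+1] = a x[k] + w[k],  w ~ N(0, q)    (state equation)
#   z[k]   = h x[k] + v[k],  v ~ N(0, r)    (measurement equation)
a, h, q, r = 0.95, 1.0, 0.1, 0.5

# Simulate the model.
K = 200
x = np.zeros(K)
for k in range(1, K):
    x[k] = a * x[k - 1] + np.sqrt(q) * rng.standard_normal()
z = h * x + np.sqrt(r) * rng.standard_normal(K)

# Kalman filter: predict with the state model, correct with each measurement.
x_hat, p_cov = 0.0, 1.0
estimates = []
for k in range(K):
    x_pred, p_pred = a * x_hat, a * a * p_cov + q     # time update
    g = p_pred * h / (h * h * p_pred + r)             # Kalman gain
    x_hat = x_pred + g * (z[k] - h * x_pred)          # measurement update
    p_cov = (1 - g * h) * p_pred
    estimates.append(x_hat)

print("RMSE:", np.sqrt(np.mean((np.array(estimates) - x) ** 2)))
```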
Before assessing the applicability and optimality of estimation algorithms in real-life applications,
models need to be checked for linearity, and the random signals involved need to be tested for Gaussianity
and stationarity. Performance bounds and parameter confidence intervals must also be derived in
order to evaluate the fit of the model. Finally, diagnostic tools for model falsification are needed to
validate that the chosen model represents faithfully the underlying physical system. These important
issues are discussed in the chapter written by Jitendra Tugnait. Stationarity, Gaussianity, and linearity
tests are presented in a hypothesis-testing framework relying upon second- and higher-order statistics
of the data. Tests are also described for estimating the number of parameters (or degrees of freedom)
necessary for parsimonious modeling. Model validation is accomplished by checking for whiteness
and independence of the error processes formed by subtracting model data from measured data. Tests
may declare signal or noise data as non-Gaussian and/or nonstationary. The non-Gaussian models
outlined here include the generalized Gaussian, Middleton’s class, and the stable noise distribution
models.
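A small sketch of the residual whiteness check described above: it compares the residuals' sample autocorrelations against the approximate ±1.96/√N band expected of white noise (a common rule of thumb); the residual sequences are fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def whiteness_check(e, max_lag=20):
    """Flag residual autocorrelations exceeding white-noise 95% bounds."""
    e0 = e - e.mean()
    r0 = e0 @ e0 / e.size
    rho = np.array([e0[: e.size - k] @ e0[k:] / e.size / r0
                    for k in range(1, max_lag + 1)])
    bound = 1.96 / np.sqrt(e.size)   # approximate 95% band if e is white
    return rho, np.abs(rho) > bound

# Residuals of a well-fitted model should behave like white noise...
e_good = rng.standard_normal(1000)
# ...while an underfitted model leaves serial correlation in the residuals.
e_bad = np.convolve(rng.standard_normal(1000), [1.0, 0.8], mode="same")

for name, e in [("good model", e_good), ("underfitted model", e_bad)]:
    rho, flags = whiteness_check(e)
    print(name, "- lags exceeding band:", int(flags.sum()))
```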
As for nonstationary signals and time-varying systems, detection and estimation tasks become
more challenging and solutions are not possible in the most general case. However, structured
nonstationarities such as those entailing periodic and almost periodic variations in their statistical
descriptors are tractable. The resulting random signals are called (almost) cyclostationary and their
analysis is the theme of the final chapter in this section, which I have written. The exposition starts
with motivation and background material including links between cyclostationary signals and mul-
tivariate stationary processes, time-frequency representations, and multirate operators. Examples
of cyclostationary signals and cyclostationarity-inducing operations are also described along with
applications to signal processing and communication problems with emphasis on signal separation
and channel equalization.
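To illustrate a cyclostationarity-inducing operation in the spirit of this chapter, the sketch below modulates stationary noise by a periodic amplitude, which makes the variance periodic, and then recovers that periodic descriptor by synchronized averaging over cycles; all parameters are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(6)

# Stationary noise modulated by a periodic amplitude: a classic
# cyclostationarity-inducing operation (variance becomes periodic, period P).
P = 16
n = np.arange(P * 500)
amplitude = 1.0 + 0.8 * np.cos(2 * np.pi * n / P)
x = amplitude * rng.standard_normal(n.size)

# The variance E[x[n]^2] is periodic with period P; estimate it by
# synchronized (cycle) averaging across the available periods.
var_hat = (x ** 2).reshape(-1, P).mean(axis=0)
print(np.round(var_hat, 2))               # tracks amplitude^2 over one period
print(np.round(amplitude[:P] ** 2, 2))    # the true periodic variance
```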
Modern theoretical directions in the field point toward non-Gaussian, nonstationary, and nonlinear
signal models. Advanced statistical signal processing tools (algorithms, software, and hardware)
are of interest in current applications such as manufacturing, biomedicine, multimedia services, and
wireless communications. Scientists and engineers will continue to search for and exploit determinism
in signals that they create or encounter but find it convenient to model as random.