HANDBOOK OF STOCHASTIC ANALYSIS AND APPLICATIONS

EDITED BY

D. KANNAN
University of Georgia
Athens, Georgia

V. LAKSHMIKANTHAM
Florida Institute of Technology
Melbourne, Florida

MARCEL DEKKER, INC.    NEW YORK • BASEL
ISBN: 0-8247-0660-9

This book is printed on acid-free paper.

Headquarters
Marcel Dekker, Inc.
270 Madison Avenue, New York, NY 10016
tel: 212-696-9000; fax: 212-685-4540

Eastern Hemisphere Distribution
Marcel Dekker AG
Hutgasse 4, Postfach 812, CH-4001 Basel, Switzerland
tel: 41-61-261-8482; fax: 41-61-261-8896

World Wide Web

The publisher offers discounts on this book when ordered in bulk quantities. For more information, write to Special Sales/Professional Marketing at the headquarters address above.

Copyright © 2002 by Marcel Dekker, Inc. All Rights Reserved.

Neither this book nor any part may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, microfilming, and recording, or by any information storage and retrieval system, without permission in writing from the publisher.

Current printing (last digit): 10 9 8 7 6 5 4 3 2 1

PRINTED IN THE UNITED STATES OF AMERICA
Preface

Various phenomena arising in physics, biology, finance, and other fields of study are intrinsically affected by random noise (white or colored). One thus models such a phenomenon by an appropriate stochastic process or stochastic equation. The analysis of the resulting process or equation falls within the realm of so-called stochastic analysis. The practical value of stochastic analysis is therefore undeniable. In this handbook we present an overview of the analysis of some basic stochastic processes and stochastic equations, along with some selected applications. The handbook is already voluminous even with this limited choice of topics, and we therefore hope that the reader will forgive us for the omissions.
This handbook on stochastic analysis and applications contains 12 chapters. The first six chapters may be considered the theoretical half (though they contain several illustrative applications) and the remaining six chapters the applied half. Markov processes and semimartingales are the two predominant processes at the foundation of stochastic analysis, and the first two chapters present a clear exposition of these two basic processes; they include material on Ito's stochastic calculus. To these we add Chapter 3, which presents the important white noise theory of Hida. Stochastic differential equations (SDEs) are extensively used to model various phenomena that are subject to random perturbations; Chapter 4 details this topic. As in the case of deterministic equations, one needs numerical methods to analyze SDEs. The numerical analysis of SDEs is a fast-developing area that is not yet as rich in theory as its deterministic counterpart, and Chapter 5 presents an up-to-date account of it. One can say without reservation that the study of large deviations is currently the most active area of research in probability, finding applications in a vast number of fields; Chapter 6 gives a thorough survey of this topic.

The rest of the handbook is devoted to applications. Stochastic control methods are needed, or alluded to, in several of these applications. We start the applied chapters with methods of control theory and the stabilization of control in Chapter 7. Game-theoretic methods applied to economics helped at least one researcher earn a Nobel Prize in economics; Chapter 8 presents a survey of stochastic game theory. We follow this with Chapter 9 on stochastic manufacturing systems, where hierarchical control methods are used. Chapter 10 presents stochastic approximation algorithms with several applications. Chapter 11 applies stochastic methods to optimization problems (as opposed to treating stochastic optimization methods). The final chapter is on stochastic optimization methods applied to (stochastic) financial mathematics.

The introductory section of each chapter provides details on the topics covered and the relevance of that chapter, so we refrain from summarizing the chapters in detail here. Nevertheless, we mention below a few simple facts just to introduce them.
Markov chains and processes are, informally, randomized dynamical systems. These processes are used as models in a wide range of applications, and the theory of Markov processes is well developed. The handbook opens with an expository survey of some of the main topics in Markov process theory and applications, written by Professor Rabi Bhattacharya, who has published numerous research articles in this area and has also co-authored a popular first-year graduate-level textbook on stochastic processes.
It would hardly be an exaggeration to say that semimartingale theory is central to any stochastic analysis; semimartingales form the most general class of integrators known in stochastic calculus. Chapter 2 presents an extensive survey of the theory of this important class of processes. Professor Jia-An Yan, the author of Chapter 2, has co-authored an excellent book on this subject. Both Chapter 1 and Chapter 2 include several aspects of stochastic calculus that form a basis for understanding the remaining chapters.

Professor H. H. Kuo has researched extensively the white noise calculus of Hida and has also written a substantial monograph on this subject. He authors Chapter 3.
Chapter 4 completes a cycle of stochastic calculus by presenting a well-rounded survey of the theory of stochastic differential equations (SDEs). It is written by Professor Bo Zhang, who specializes in the stability analysis of stochastic equations. This chapter reviews the theory of SDEs, which is fundamental to a vast number of applications in a variety of fields of study, and so forms a basis for what follows in the rest of the handbook (except for the chapter on large deviations).

The longest chapter in the handbook (Chapter 5) is on the numerical analysis of stochastic differential equations. The importance of the numerical analysis of deterministic systems is well known. Compared to the deterministic case, the study of numerical methods for stochastic equations is still at a developing stage (and a fast one at that). This chapter is important because of its multidisciplinary character, the wide range of potential applications of stochastic differential equations, and the limitations of analytical methods for SDEs caused by their high complexity and partial intractability. Professor Henri Schurz, who wrote this chapter, has co-authored a textbook on the numerical analysis of SDEs and developed an accompanying program diskette. He presents an extensive list of references on this subject here.
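To fix ideas, here is a minimal sketch (not taken from Chapter 5) of the simplest scheme of this kind, the Euler-Maruyama method, applied to a scalar linear SDE; the drift, volatility, and step count below are illustrative assumptions only.

    import numpy as np

    def euler_maruyama(mu, sigma, x0, T, n_steps, rng):
        """Simulate dX_t = mu*X_t dt + sigma*X_t dW_t on [0, T] with the Euler-Maruyama scheme."""
        dt = T / n_steps
        x = np.empty(n_steps + 1)
        x[0] = x0
        for k in range(n_steps):
            dW = rng.normal(0.0, np.sqrt(dt))   # Brownian increment over one step
            x[k + 1] = x[k] + mu * x[k] * dt + sigma * x[k] * dW
        return x

    rng = np.random.default_rng(0)
    path = euler_maruyama(mu=0.05, sigma=0.2, x0=1.0, T=1.0, n_steps=1000, rng=rng)
    print(path[-1])   # one sample of X_T

Averaging many such paths (and refining the step size) approximates functionals of the solution; the strong and weak convergence orders of this and related schemes are what Chapter 5 analyzes.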
One may say without much hesitation that large deviation theory is currently the most active subject of research in probability. Professors Dembo and Zeitouni have not only done extensive research in this area but have also co-authored a popular monograph on the topic. Chapter 6 is an up-to-date survey of this theory, which has found applications in many areas, including statistical physics, queueing systems, information theory, risk-sensitive control, stochastic algorithms, and communication networks. The chapter includes applications to hypothesis testing in statistics and to the Gibbs conditioning principle in statistical mechanics.
The remaining half of the handbook is on applications; regrettably, many important applications could not be included because of space constraints. Control theory and the stabilization of controls is the subject matter of Chapter 7, written by Professor Pavel Pakshin. The dynamic programming and maximum principle methods are detailed in the chapter, and the separation principle is used to solve the standard linear-quadratic Gaussian (LQG) control problem. Chapters 9 and 12 make extensive use of control-theoretic methods in applications to stochastic manufacturing systems and asset pricing, respectively.
Chapter 8, written by Professor K. M. Ramachandran, discusses stochastic game theory. Recently, three prominent researchers in game theory won the Nobel Prize in economics, which vouches for the importance of game theory, both deterministic and stochastic. The chapter covers both two-person zero-sum games and N-person non-cooperative games. Emphasis is placed on solution methods, old and new. Applications to defense, finance, economics, institutional investor speculation, and other areas are presented.
Stochastic control theory has enriched the analysis of manufacturing systems. Professor Qing Zhang, who wrote Chapter 9, has also co-authored the first authoritative monograph on stochastic manufacturing systems, and the chapter includes the theory and applications developed since the appearance of that monograph. Manufacturing systems are usually large and complex, and they are subject to various discrete events such as the purchase of new equipment and machine failures and repairs. Because of the large size of these systems and the presence of these events, obtaining exact optimal feedback policies to run them is nearly impossible, both theoretically and computationally; even approximate solutions can be computed only for small problems. Therefore, these systems are managed in a hierarchical fashion. The reduction in complexity is achieved by decomposing the problem into problems for smaller subsystems with a proper coordinating mechanism, by aggregating products and subsequently disaggregating them, and by replacing random processes with their averages. This chapter adopts the latter method.
Professor George Yin reviews stochastic approximation and its applications in Chapter 10. He presents various forms of stochastic approximation algorithms, projection and truncation procedures, algorithms with soft constraints, and global stochastic approximation algorithms, among other methods. The utility of stochastic approximation methods is demonstrated with applications to adaptive filtering, system identification, stopping-time rules for least squares algorithms, adaptive step-size tracking algorithms, approximation of threshold control policies, GI/G/1 queues, distributed algorithms for supervised learning, and more. George Yin has co-authored a book on this topic, and the chapter includes recent results.
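As a toy illustration of the basic recursion treated there (a sketch only, not drawn from the chapter), the classical Robbins-Monro scheme locates a root of an unknown regression function from noisy observations; the target function, noise level, and step sizes below are assumptions made for the example.

    import numpy as np

    def robbins_monro(noisy_obs, theta0, n_iter, rng):
        """Robbins-Monro recursion theta_{k+1} = theta_k - a_k * Y_k, where Y_k is a
        noisy observation of f(theta_k) and a_k are decreasing step sizes."""
        theta = theta0
        for k in range(1, n_iter + 1):
            a_k = 1.0 / k                    # sum a_k = infinity, sum a_k^2 < infinity
            theta -= a_k * noisy_obs(theta, rng)
        return theta

    # Example: find the root of f(theta) = theta - 2 observed with additive noise.
    rng = np.random.default_rng(1)
    root = robbins_monro(lambda th, r: (th - 2.0) + r.normal(0.0, 0.1),
                         theta0=0.0, n_iter=5000, rng=rng)
    print(root)   # close to 2

Projection, truncation, and averaging variants of this recursion, and their convergence and rate analysis, are the subject of the chapter.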
Chapter 11, written by Professor Ron Shonkwiler with Franklin Mendivil and M. C. Spruill, is on stochastic methods for global optimization. Until stochastic methods came along, there were no good general methods for global optimization. Stochastic methods are simple to implement, versatile, and robust, and they parallelize effectively. These methods often mimic some natural process, such as temperature-based annealing or biological recombination. The theory behind them is built on Markov chain and renewal theory, which provides a framework for illuminating their strengths and weaknesses. Detailed descriptions of the basic algorithms are provided, along with comparisons and contrasts.
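A minimal sketch of one such method, simulated annealing, in generic form (not the specific variant analyzed in the chapter); the cost function, neighborhood move, and cooling schedule below are assumptions chosen only to make the example run.

    import math
    import random

    def simulated_annealing(cost, neighbor, x0, n_iter, temp):
        """Generic simulated-annealing loop: always accept improvements, and accept
        uphill moves with probability exp(-increase / temperature)."""
        x, fx = x0, cost(x0)
        best, fbest = x, fx
        for k in range(1, n_iter + 1):
            y = neighbor(x)
            fy = cost(y)
            if fy <= fx or random.random() < math.exp(-(fy - fx) / temp(k)):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        return best, fbest

    # Toy usage: minimize a multimodal function of one variable.
    random.seed(0)
    best, val = simulated_annealing(
        cost=lambda x: x * x + 10.0 * math.sin(3.0 * x),
        neighbor=lambda x: x + random.uniform(-0.5, 0.5),
        x0=5.0, n_iter=20000, temp=lambda k: 1.0 / math.log(k + 1.0))
    print(best, val)

Markov chain and renewal arguments of the kind surveyed in the chapter are what justify, and limit, the convergence of such schemes.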
Professor Thaleia Zariphopoulou, an active researcher in this area, wrote the final chapter (Chapter 12), which is on stochastic control methods in asset pricing. Most valuation models lead to stochastic optimization problems. This chapter presents an exposition of the stochastic optimization methods used in financial mathematics, along with a quick summary of results on the Hamilton-Jacobi-Bellman (HJB) equation. In addition to optimization models of expected utility in complete markets as well as markets with frictions, the chapter provides models of derivative pricing.
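For orientation only (this schematic one-dimensional form is an assumption for the illustration, not a formula quoted from Chapter 12): for a controlled diffusion dX_t = b(X_t, u_t) dt + σ(X_t, u_t) dW_t with running utility U and discount rate β > 0, the value function V formally satisfies the stationary HJB equation

\[
\beta V(x) \;=\; \sup_{u}\Bigl\{\, b(x,u)\,V'(x) \;+\; \tfrac{1}{2}\,\sigma^2(x,u)\,V''(x) \;+\; U(x,u) \Bigr\},
\]

and verification theorems identify a maximizer u = u*(x) as an optimal feedback control under suitable regularity conditions.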
Acknowledgments

The editors express their deep sense of gratitude to all the authors who contributed to the Handbook of Stochastic Analysis and Applications; obviously, this handbook would not have been possible without their help. Mrs. Sharon Southwick provided local computer help to D. Kannan and developed the uniform code used to compile all the articles into one file; the editors are very thankful for all her help. The editors are also thankful to the editorial staff of Marcel Dekker, Inc., in particular to Maria Allegra and Brian Black, for their patience and cooperation during the long process of bringing out the handbook.

D. Kannan
V. Lakshmikantham
Contents

Preface  iii
Contributors  xvii
1  Markov Processes and Their Applications  1
   Rabi Bhattacharya
1.1 Introduction  1
1.2 Markov Chains  4
1.2.1 Simple Random Walk  5
1.2.2 Birth-Death Chains and the Ehrenfest Model  6
1.2.3 Galton-Watson Branching Process  7
1.2.4 Markov Chains in Continuous Time  8
1.2.5 References  11
1.3 Discrete Parameter Markov Processes on General State Spaces  11
1.3.1 Ergodicity of Harris Recurrent Processes  11
1.3.2 Iteration of I.I.D. Random Maps  14
1.3.3 Ergodicity of Non-Harris Processes  17
1.3.4 References  19
1.4 Continuous Time Markov Processes on General State Spaces  20
1.4.1 Processes with Independent Increments  20
1.4.2 Jump Processes  21
1.4.3 References  22
1.5 Markov Processes and Semigroup Theory  22
1.5.1 The Hille-Yosida Theorem  23
1.5.2 Semigroups and One-Dimensional Diffusions  25
1.5.3 References  31
1.6 Stochastic Differential Equations  31
1.6.1 Stochastic Integrals, SDE, Ito's Lemma  32
1.6.2 Cameron-Martin-Girsanov Theorem and the Martingale Problem  35
1.6.3 Probabilistic Representation of Solutions to Elliptic and Parabolic Partial Differential Equations  38
1.6.4 References  39
Bibliography  41
2  Semimartingale Theory and Stochastic Calculus  47
   Jia-An Yan
2.1 General Theory of Stochastic Processes and Martingale Theory  48
2.1.1 Classical Theory of Martingales  48
2.1.2 General Theory of Stochastic Processes  52
2.1.3 Modern Martingale Theory  60
2.2 Stochastic Integrals  68
2.2.1 Stochastic Integrals w.r.t. Local Martingales  68
2.2.2 Stochastic Integrals w.r.t. Semimartingales  72
2.2.3 Convergence Theorems for Stochastic Integrals  75
2.2.4 Ito's Formula and Doleans Exponential Formula  78
2.2.5 Local Times of Semimartingales  81
2.2.6 Fisk-Stratonovich Integrals  82
2.2.7 Stochastic Differential Equations  84
2.3 Stochastic Calculus on Semimartingales  87
2.3.1 Stochastic Integration w.r.t. Random Measures  87
2.3.2 Characteristics of a Semimartingale  90
2.3.3 Processes with Independent Increments and Levy Processes  91
2.3.4 Absolutely Continuous Changes of Probability  94
2.3.5 Martingale Representation Theorems  99
Bibliography  103
3  White Noise Theory  107
   Hui-Hsiung Kuo
3.1 Introduction  107
3.1.1 What is white noise?  107
3.1.2 White noise as the derivative of a Brownian motion  107
3.1.3 The use of white noise - a simple example  108
3.1.4 White noise as a generalized stochastic process  109
3.1.5 White noise as an infinite dimensional generalized function  110
3.2 White noise as a distribution theory  111
3.2.1 Finite dimensional Schwartz distribution theory  111
3.2.2 White noise space  112
3.2.3 Hida's original idea  112
3.2.4 Spaces of test and generalized functions  114
3.2.5 Examples of test and generalized functions  115
3.3 General spaces of test and generalized functions  117
3.3.1 Abstract white noise space  117
3.3.2 Wick tensors  118
3.3.3 Hida-Kubo-Takenaka space  119
3.3.4 Kondratiev-Streit space  120
3.3.5 Cochran-Kuo-Sengupta space  121
3.4 Continuous versions and analytic extensions  123
3.4.1 Continuous versions  123
3.4.2 Analytic extensions  125
3.4.3 Integrable functions  126
3.4.4 Generalized functions induced by measures  128
3.4.5 Generalized Radon-Nikodym derivative  129
3.5 Characterization theorems  131
3.5.1 The S-transform  131
3.5.2 Characterization of generalized functions  132
3.5.3 Convergence of generalized functions  136
3.5.4 Characterization of test functions  137
3.5.5 Intrinsic topology for the space of test functions  139
3.6 Continuous operators and adjoints  140
3.6.1 Differential operators  140
3.6.2 Translation and scaling operators  143
3.6.3 Multiplication and Wick product  144
3.6.4 Fourier-Gauss transform  146
3.6.5 Extensions to CKS-spaces  148
3.7 Comments on other topics and applications  150
Bibliography  155
4  SDEs and Their Applications  159
   Bo Zhang
4.1 SDEs with respect to Brownian motion  160
4.1.1 Ito type SDEs  160
4.1.2 Properties of solutions  163
4.1.3 Equations depending on a parameter  165
4.1.4 Stratonovich Stochastic Differential Equations  167
4.1.5 Stochastic Differential Equations on Manifolds  168
4.2 Applications  169
4.2.1 Diffusions  169
4.2.2 Boundary value problem  173
4.2.3 Optimal stopping  176
4.2.4 Stochastic control  180
4.2.5 Backward SDE and applications  185
4.3 Some generalizations of SDEs  191
4.3.1 SDEs of the jump type  191
4.3.2 SDE with respect to semimartingale  198
4.3.3 SDE driven by nonlinear integrator  204
4.4 Stochastic Functional Differential Equations  212
4.4.1 Existence and Uniqueness of Solution  212
4.4.2 Markov property  215
4.4.3 Regularity of the trajectory field  217
4.5 Stochastic Differential Equations in Abstract Spaces  219
4.5.1 Stochastic evolution equations  219
4.5.2 Dissipative stochastic systems  222
4.6 Anticipating Stochastic Differential Equation  224
4.6.1 Volterra equations with anticipating kernel  224
4.6.2 SDEs with anticipating drift and initial condition  227
Bibliography  229
5  Numerical Analysis of SDEs Without Tears  237
   H. Schurz
5.1 Introduction  237
5.2 The Standard Setting For (O)SDEs  238
5.3 Stochastic Taylor Expansions  241
5.3.1 The Ito Formula (Ito's Lemma)  241
5.3.2 The main idea of stochastic Ito-Taylor expansions  241
5.3.3 Hierarchical sets, coefficient functions, multiple integrals  243
5.3.4 A more compact formulation  243
5.3.5 The example of Geometric Brownian Motion  244
5.3.6 Key relations between multiple integrals  245
5.4 A Toolbox of Numerical Methods  246
5.4.1 The explicit and fully drift-implicit Euler method  246
5.4.2 The family of stochastic Theta methods  247
5.4.3 Trapezoidal and midpoint methods  248
5.4.4 Rosenbrock methods (RTMs)  248
5.4.5 Balanced implicit methods (BIMs)  249
5.4.6 Predictor-corrector methods (PCMs)  249
5.4.7 Explicit Runge-Kutta methods (RKMs)  250
5.4.8 Newton's method  251
5.4.9 The explicit and implicit Mil'shtein methods  252
5.4.10 Gaines's representation of Mil'shtein method  253
5.4.11 Generalized Theta-Platen methods  254
5.4.12 Talay-Tubaro extrapolation technique and linear PDEs  254
5.4.13 Denk-Hersch method for highly oscillating systems  255
5.4.14 Stochastic Adams-type methods  257
5.4.15 The two step Mil'shtein method of Horvath-Bokor  258
5.4.16 Higher order Taylor methods  258
5.4.17 Splitting methods of Petersen-Schurz  258
5.4.18 The ODE method with commutative noise  260
5.4.19 Random local linearization methods (LLM)  262
5.4.20 Simultaneous time and chance discretizations  264
5.4.21 Stochastic waveform relaxation methods  264
5.4.22 Comments on numerical analysis of SPDEs  264
5.4.23 General concluding comment on numerical methods  265
5.5 On the Main Principles of Numerics  265
5.5.1 ID-invariance  265
5.5.2 Numerical pth mean consistency  266
5.5.3 Numerical pth mean stability  266
5.5.4 Numerical pth mean contractivity  267
5.5.5 Numerical pth mean convergence  267
5.5.6 The main principle: combining all concepts from 5.1-5.5  268
5.5.7 On fundamental crossrelations  273
5.6 Results on Convergence Analysis  276
5.6.1 Continuous time convergence concepts  276
5.6.2 On key relations between convergence concepts  278
5.6.3 Fundamental theorems of mean square convergence  278
5.6.4 Strong mean square convergence theorem  280
5.6.5 The Clark-Cameron mean square order bound in R^1  280
5.6.6 Exact mean square order bounds of Cambanis and Hu  282
5.6.7 A theorem on double L^2-convergence with adaptive Δ  284
5.6.8 The fundamental theorem of weak convergence  285
5.6.9 Approximation of some functionals  286
5.6.10 The pathwise error process for explicit Euler methods  289
5.6.11 Almost sure convergence  289
5.7 Numerical Stability, Stationarity, Boundedness, and Invariance  291
5.7.1 Stability of linear systems with multiplicative noise  291
5.7.2 Stationarity of linear systems with additive noise  294
5.7.3 Asymptotically exact methods for linear systems  296
5.7.4 Almost sure nonnegativity of numerical methods  297
5.7.5 Numerical invariance of intervals [0, M]  299
5.7.6 Preservation of boundaries for Brownian Bridges  301
5.7.7 Nonlinear stability of implicit Euler methods  302
5.7.8 Linear and nonlinear A-stability  303
5.7.9 Stability exponents of explicit-implicit methods  304
5.7.10 Hofmann-Platen's M-stability concept in C^1  306
5.7.11 Asymptotic stability with probability one  308
5.8 Numerical Contractivity  309
5.8.1 Contractivity of SDEs with monotone coefficients  309
5.8.2 Contractivity of implicit Euler methods  310
5.8.3 pth mean B- and BN-stability  310
5.8.4 Contractivity exponents of explicit-implicit methods  311
5.8.5 General V-asymptotics of discrete time iterations  312
5.8.6 An example for discrete time V-asymptotics  314
5.8.7 Asymptotic Contractivity with probability one  317
5.9 On Practical Implementation  317
5.9.1 Implementation issues: some challenging examples  317
5.9.2 Generation of pseudorandom numbers  321
5.9.3 Substitutions of randomness under weak convergence  323
5.9.4 Are quasi random numbers useful for (O)SDEs?  324
5.9.5 Variable step size algorithms  325
5.9.6 Variance reduction techniques  326
5.9.7 How to estimate pth mean errors  328
5.9.8 On software and programmed packages  329
5.9.9 Comments on applications of numerics for (O)SDEs  329
5.10 Comments, Outlook, Further Developments  330
5.10.1 Recent and further developments  330
5.10.2 General comments  330
5.10.3 Acknowledgements  331
5.10.4 New trends - 10 challenging problem areas  331
Bibliography  333
6  Large Deviations and Applications  361
   Amir Dembo and Ofer Zeitouni
6.1 Introduction  361
6.2 The Large Deviation Principle  363
6.3 Large Deviation Principles for Finite Dimensional Spaces  365
6.3.1 The Method of Types  366
6.3.2 Cramer's Theorem in R^d  368
6.3.3 The Gartner-Ellis Theorem  369
6.3.4 Inequalities for Bounded Martingale Differences  371
6.3.5 Moderate Deviations and Exact Asymptotics  372
6.4 General Properties  373
6.4.1 Existence of an LDP and Related Properties  374
6.4.2 Contraction Principles and Exponential Approximation  376
6.4.3 Varadhan's Lemma and its Converse  380
6.4.4 Convexity Considerations  382
6.4.5 Large Deviations for Projective Limits  385
6.5 Sample Path LDPs  388
6.5.1 Sample Path Large Deviations for Random Walk and for Brownian Motion  388
6.5.2 The Freidlin-Wentzell Theory  390
6.5.3 Application: The Problem of Diffusion Exit from a Domain  392
6.6 LDPs for Empirical Measures  396
6.6.1 Cramer's Theorem in Polish Spaces  396
6.6.2 Sanov's Theorem  399
6.6.3 LDP for Empirical Measures of Markov Chains  401
6.6.4 Mixing Conditions and LDP  404
6.6.5 Application: The Gibbs Conditioning Principle  406
6.6.6 Application: The Hypothesis Testing Problem  410
Bibliography  413
7  Stability and Stabilizing Control of Stochastic Systems  417
   P. V. Pakshin
7.1 Stochastic mathematical models of systems  419
7.1.1 Models of differential systems corrupted by noise  419
7.1.2 Models of differential systems with random jumps  422
7.1.3 Differential generator  423
7.2 Stochastic control problem  425
7.2.1 Preliminaries  425
7.2.2 Stochastic dynamic programming  426
7.2.3 Stochastic maximum principle  430
7.2.4 Separation principle  432
7.3 Definition of stochastic stability and stochastic Lyapunov function  438
7.3.1 Classic stability concept  438
7.3.2 Weak Lyapunov stability  438
7.3.3 Strong Lyapunov stability  439
7.3.4 Mean square and p-stability  439
7.3.5 Recurrence and positivity  440
7.3.6 Stochastic Lyapunov function  441
7.4 General stability and stabilization theorems  442
7.4.1 Stability in probability theorems  442
7.4.2 Recurrence and positivity theorems  442
7.4.3 pth mean stability theorems and their inversion  443
7.4.4 Stability in the first order approximation  446
7.4.5 Stabilization problem and fundamental theorem  447
7.5 Instability  448
7.5.1 Classic stochastic instability concept  448
7.5.2 Nonpositivity and nonrecurrence  450
7.6 Stability criteria and testable conditions  451
7.6.1 General stability tests for linear systems  451
7.6.2 Some particular stability criteria for linear systems  452
7.6.3 Stability of the pth moments of linear systems  454
7.6.4 Absolute stochastic stability  455
7.6.5 Robust stability  456
7.7 Stabilizing control of linear systems  458
7.7.1 General linear systems  458
7.7.2 Linear systems with parametric noise  459
7.7.3 Robust stabilizing control  464
Bibliography  467
8  Stochastic Differential Games and Applications  473
   K. M. Ramachandran
8.1 Introduction  473
8.2 Two person zero-sum differential games  475
8.2.1 Two person zero-sum games: martingale methods  475
8.2.2 Two person zero-sum games and viscosity solutions  484
8.2.3 Stochastic differential games with multiple modes  487
8.3 N-Person stochastic differential games  490
8.3.1 Discounted payoff on the infinite horizon  491
8.3.2 Ergodic payoff  492
8.4 Weak convergence methods in differential games  498
8.4.1 Weak convergence preliminaries  498
8.4.2 Weak convergence in N-person stochastic differential games  500
8.4.3 Partially observed stochastic differential games and weak convergence  510
8.5 Applications  518
8.5.1 Stochastic equity investment model with institutional investor speculation  519
8.6 Conclusion  523
Bibliography  525
9  Stochastic Manufacturing Systems: A Hierarchical Control Approach  533
   Q. Zhang
9.1 Introduction  533
9.2 Single Machine System  535
9.3 Flowshops  538
9.4 Jobshops  541
9.5 Production-Capacity Expansion Models  542
9.6 Production-Marketing Models  548
9.7 Risk-Sensitive Control  550
9.8 Optimal Control  553
9.9 Hierarchical Control  555
9.10 Risk-Sensitive Control  557
9.11 Constant Product Demand  560
9.12 Constant Machine Capacity  566
9.13 Marketing-Production with a Jump Demand  568
9.14 Concluding Remarks  571
Bibliography  573
10  Stochastic Approximation: Theory and Applications  577
    G. Yin
10.1 Introduction  577
10.1.1 Historical Development  578
10.1.2 Basic Issues  579
10.1.3 Outline of the Chapter  579
10.2 Algorithms and Variants  579
10.2.1 Basic Algorithm  579
10.2.2 More General Algorithms  581
10.2.3 Projection and Truncation Algorithms  582
10.2.4 Global Stochastic Approximation  584
10.2.5 Continuous-time Stochastic Approximation Algorithms  585
10.2.6 Stochastic Approximation in Function Spaces  585
10.3 Convergence  585
10.3.1 ODE Methods  586
10.3.2 Weak Convergence Method  588
10.4 Rates of Convergence  590
10.4.1 Scaling Factor a  590
10.4.2 Tightness of the Scaled Estimation Error  591
10.4.3 Local Analysis  592
10.4.4 Random Directions  594
10.4.5 Stopping Rules  594
10.5 Large Deviations  594
10.5.1 Motivation  595
10.5.2 Large Deviations for Stochastic Approximation  595
10.6 Asymptotic Efficiency  596
10.6.1 Iterate Averaging  597
10.6.2 Smoothed Algorithms  598
10.6.3 Some Numerical Data  600
10.7 Applications  601
10.7.1 Adaptive Filtering  602
10.7.2 Adaptive Beam Forming  602
10.7.3 System Identification and Adaptive Control  603
10.7.4 Adaptive Step-size Tracking Algorithms  605
10.7.5 Approximation of Threshold Control Policies  606
10.7.6 GI/G/1 Queue  607
10.7.7 Distributed Algorithms for Supervised Learning  608
10.7.8 A Heat Exchanger  610
10.7.9 Evolutionary Algorithms  612
10.7.10 Digital Diffusion Machines  613
10.8 Further Remarks  614
10.8.1 Convergence  614
10.8.2 Rate of Convergence  615
10.8.3 Law of Iterated Logarithms  615
10.8.4 Robustness  616
10.8.5 Parallel Stochastic Approximation  616
10.8.6 Open Questions  617
10.8.7 Conclusion  617
Bibliography  619
11  Optimization by Stochastic Methods  625
    Franklin Mendivil, R. Shonkwiler, and M. C. Spruill
11.1 Nature of the problem  625
11.1.1 Introduction  625
11.1.2 No Free Lunch  626
11.1.3 The Permanent Problem  628
11.2 A Brief Survey of Some Methods for Global Optimization  629
11.2.1 Covering Methods  630
11.2.2 Branch and bound  631
11.2.3 Iterative Improvement  632
11.2.4 Trajectory/tunneling Methods  633
11.2.5 Tabu search  634
11.2.6 Random Search  634
11.2.7 Multistart  635
11.3 Markov Chain and Renewal Theory Considerations  635
11.3.1 IIP parallel search  638
11.3.2 Restarted Improvement Algorithms  639
11.3.3 Renewal Techniques in Restarting  642
11.4 Simulated Annealing  644
11.4.1 Introduction  644
11.4.2 Simulated annealing applied to the permanent problem  646
11.4.3 Convergence Properties of Simulated Annealing and Related Algorithms  647
11.5 Restarted Algorithms  653
11.5.1 Introduction  653
11.5.2 The Permanent Problem using restarted simulated annealing  654
11.5.3 Restarted Simulated Annealing  655
11.5.4 Numerical comparisons  656
11.6 Evolutionary Computations  658
11.6.1 Introduction  658
11.6.2 A GA for the permanent problem  660
11.6.3 Some specific Algorithms  661
11.6.4 GA principles, schemata, multi-armed bandit, implicit parallelism  662
11.6.5 A genetic algorithm for constrained optimization problems  667
11.6.6 Markov Chain Analysis Particular to Genetic Algorithms  670
Bibliography  673
12  Stochastic Control Methods in Asset Pricing  679
    Thaleia Zariphopoulou
12.1 Introduction  679
12.2 The Hamilton-Jacobi-Bellman (HJB) equation  680
12.3 Models of Optimal Investment and Consumption I  684
12.3.1 Merton models with intermediate consumption  687
12.3.2 Merton models with non-linear stock dynamics  689
12.3.3 Merton models with trading constraints  691
12.3.4 Merton models with non-homogeneous investment opportunities  693
12.3.5 Models of Optimal Portfolio Management with General Utilities  699
12.3.6 Optimal goal problems  703
12.3.7 Alternative models of expected utility  705
12.4 Models of optimal investment and consumption II  707
12.4.1 Optimal investment/consumption models with transaction costs  707
12.4.2 Optimal investment/consumption models with stochastic labor income  719
12.5 Expected utility methods in derivative pricing  723
12.5.1 The Black and Scholes valuation formula  725
12.5.2 Super-replicating strategies  727
12.5.3 The utility maximization theory  729
12.5.4 Imperfect hedging strategies  738
12.5.5 Other models of derivative pricing with transaction costs  742
Bibliography  745

Index  754
Contributors

Rabi Bhattacharya  Indiana University, Bloomington, Indiana
Amir Dembo  Stanford University, Stanford, California
Hui-Hsiung Kuo  Louisiana State University, Baton Rouge, Louisiana
Franklin Mendivil  Georgia Institute of Technology, Atlanta, Georgia
P. V. Pakshin  Nizhny Novgorod State Technical University at Arzamas, Arzamas, Russia
K. M. Ramachandran  University of South Florida, Tampa, Florida
R. Shonkwiler  Georgia Institute of Technology, Atlanta, Georgia
M. C. Spruill  Georgia Institute of Technology, Atlanta, Georgia
H. Schurz  University of Minnesota, Minneapolis, Minnesota
Jia-An Yan  Chinese Academy of Sciences, Beijing, China
G. Yin  Wayne State University, Detroit, Michigan
Thaleia Zariphopoulou  The University of Texas at Austin, Austin, Texas
Ofer Zeitouni  Technion, Haifa, Israel
Bo Zhang  People's University of China, Beijing, China
Q. Zhang  University of Georgia, Athens, Georgia
Chapter 1

Markov Processes and Their Applications

RABI BHATTACHARYA
Department of Mathematics
Indiana University
Bloomington, Indiana
1.1 Introduction

For the most part in this chapter we will confine ourselves to time-homogeneous Markov processes. In discrete time, such a Markov process on a (measurable) state space (S, 𝒮) is defined by a (one-step) transition probability p(x, dy), x ∈ S, where (i) for each x ∈ S, p(x, dy) is a probability measure on (S, 𝒮), and (ii) for each B ∈ 𝒮, x → p(x, B) is a measurable function on (S, 𝒮) into ([0, 1], ℬ([0, 1])). Here ℬ(X) denotes the Borel σ-field on a topological space X.
Let Ω_0 = S^∞ be the space of all sequences x = (x_0, x_1, ..., x_n, ...) in S, Ω_0 being endowed with the product σ-field ℱ_0 = 𝒮^⊗∞ generated by the class of all finite-dimensional measurable cylinders of the form A = (B_0 × B_1 × ⋯ × B_n) × S^∞ = {x ∈ S^∞ : x_j ∈ B_j, 0 ≤ j ≤ n}, with B_j ∈ 𝒮 for j = 0, 1, ..., n and n arbitrary. For any given probability measure μ on (S, 𝒮) one can construct a unique probability measure P_μ on (Ω_0, ℱ_0) by assigning to cylinder sets A of the above form the probability

\[
P_\mu(A) = \int_{B_0}\int_{B_1}\cdots\int_{B_{n-2}}\int_{B_{n-1}}
  p(x_{n-1}, B_n)\, p(x_{n-2}, dx_{n-1}) \cdots p(x_0, dx_1)\, \mu(dx_0),
\tag{1.1.1}
\]

evaluated by iterated integration. In the case S is a Polish space, i.e., S is homeomorphic to a complete separable metric space, and 𝒮 = ℬ(S), such a construction of P_μ is provided by Kolmogorov's Existence Theorem (see Billingsley [1], pp. 486-490). For general state spaces (S, 𝒮) this construction is due to Tulcea [2] (see also Neveu [3], pp. 161-166).
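As a concrete illustration of how a transition probability determines the law of a path (a sketch only; the two-state kernel below is an assumed toy example, not one used later in the chapter), one can generate a trajectory by drawing X_{k+1} from p(X_k, ·) at each step:

    import numpy as np

    def sample_path(p, mu, n_steps, rng):
        """Draw (X_0, ..., X_n) on a finite state space: X_0 ~ mu, then X_{k+1} ~ p(X_k, .)."""
        states = np.arange(len(mu))
        x = [rng.choice(states, p=mu)]
        for _ in range(n_steps):
            x.append(rng.choice(states, p=p[x[-1]]))
        return x

    # Assumed two-state kernel: p(0, .) = (0.9, 0.1), p(1, .) = (0.4, 0.6).
    p = np.array([[0.9, 0.1], [0.4, 0.6]])
    rng = np.random.default_rng(0)
    print(sample_path(p, mu=np.array([1.0, 0.0]), n_steps=10, rng=rng))

Averaging indicator functions of sampled cylinders over many such trajectories recovers, in the limit, the probabilities P_μ(A) defined by (1.1.1).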
The coordinate process {X_n : n = 0, 1, ...} defined on (S^∞, 𝒮^⊗∞) by X_n(x) = x_n (x = (x_0, x_1, ..., x_n, ...)) is a Markov process with transition probability p(x, dy) and initial distribution μ. In other words, the conditional distribution of the process X_n^+ := (X_n, X_{n+1}, ...) on (S^∞, 𝒮^⊗∞) given ℱ_n := σ{X_j : 0 ≤ j ≤ n}, namely the σ-field of past and present events up to time n, is P_{X_n}, where P_y is written for P_μ with μ = δ_y, i.e., μ({y}) = 1.

Often one needs a larger probability space than this canonical model (S^∞, 𝒮^⊗∞, P_μ), e.g., to accommodate a family of random variables independent of the process {X_n : n = 0, 1, 2, ...}. Hence we will consider a general probability space (Ω, ℱ, P) on which is defined a sequence (X_0, X_1, ..., X_n, ...) whose distribution is the P_μ given by (1.1.1).
Sometimes a Markov process, or its transition probability p(x, dy), may admit an invariant probability π(dy), i.e.,

\[
\int_S p(x, B)\, \pi(dx) = \pi(B) \qquad \forall\, B \in \mathcal{S}.
\tag{1.1.2}
\]

In this case, if one takes μ = π as the initial distribution, then the Markov process {X_n : n = 0, 1, 2, ...} is stationary in the sense that X_n^+ = (X_n, X_{n+1}, ...) has the same distribution as (X_0, X_1, ...), namely P_π, for every n ≥ 0. In particular, X_n has distribution π for all n ≥ 0. We will often be concerned with the derivation of criteria for the existence of a unique invariant probability π, and then {X_n : n = 0, 1, 2, ...} is ergodic in the sense of ergodic theory, i.e., the σ-field ℱ_I of shift-invariant events is trivial; that is, P(B) = 0 or 1 for B ∈ ℱ_I.
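For a finite state space, the defining relation (1.1.2) is just the left-eigenvector equation πP = π together with Σ_i π_i = 1. A minimal numerical sketch (reusing the assumed two-state kernel from the sample-path example above) finds π by power iteration:

    import numpy as np

    # Invariant probability of a finite-state chain: solve pi P = pi, sum(pi) = 1.
    p = np.array([[0.9, 0.1], [0.4, 0.6]])   # assumed toy kernel
    pi = np.array([0.5, 0.5])                # any initial distribution
    for _ in range(1000):
        pi = pi @ p                          # one step of mu -> mu P
    print(pi)                                # approximately (0.8, 0.2)
    print(pi @ p - pi)                       # near zero: pi is invariant

The convergence of μP^n to π seen here reflects the ergodicity discussed above; criteria guaranteeing such behavior on general state spaces are the subject of Section 1.3.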
To describe an important strengthening of the Markov property described above, let {ℱ_n : n = 0, 1, 2, ...} be an increasing sequence of sub-σ-fields of ℱ such that X_n is ℱ_n-measurable for every n, and the conditional distribution of X_n^+, given ℱ_n, is P_{X_n} (n ≥ 0). Such a family {ℱ_n : n = 0, 1, 2, ...} is called a filtration. For example, one may take ℱ_n = σ{X_j : 0 ≤ j ≤ n} (n ≥ 0), or ℱ_n may be the σ-field generated by {X_j : 0 ≤ j ≤ n} and a family of random variables independent of {X_j : j ≥ 0}. A random variable τ : Ω → {0, 1, 2, ...} ∪ {∞} is said to be an {ℱ_n}-stopping time if {τ ≤ n} ∈ ℱ_n for every n. Define the pre-τ σ-field ℱ_τ by ℱ_τ := {A ∈ ℱ : A ∩ {τ ≤ n} ∈ ℱ_n ∀n}. It is not difficult to check that if τ is a stopping time then the conditional distribution of X_τ^+ := (X_τ, X_{τ+1}, ...) given ℱ_τ is P_{X_τ} on the set {τ < ∞}. This property is called the strong Markov property, and it is extremely useful in deriving various distributions and expectations of random variables related to the Markov process.
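A standard illustration of how the strong Markov property is used (an added example, assuming the simple symmetric random walk on the integers with X_0 = 0, a process not treated until Section 1.2.1): let τ_a := inf{n ≥ 0 : X_n = a} for a level a > 0, an {ℱ_n}-stopping time. Conditioning on ℱ_{τ_a} and using the symmetry of the post-τ_a walk gives the reflection principle,

\[
P\Bigl(\max_{0 \le k \le n} X_k \ge a\Bigr) = P(X_n \ge a) + P(X_n > a),
\]

from which the distribution of the first passage time follows, since P(τ_a ≤ n) = P(max_{0 ≤ k ≤ n} X_k ≥ a).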
We now turn to the case of continuous parameter Markov processes. Suppose one is given a family of transition probabilities p(t; x, dy) (t > 0, x ∈ S) on a state space (S, 𝒮), satisfying (i) p(t; x, dy) is a probability measure on (S, 𝒮) for all t > 0, x ∈ S, (ii) x → p(t; x, B) is measurable on (S, 𝒮) for all t > 0, B ∈ 𝒮, and (iii) the following Chapman-Kolmogorov equation holds:

\[
p(s + t; x, B) = \int_S p(s; z, B)\, p(t; x, dz)
\qquad (t > 0,\ s > 0,\ x \in S,\ B \in \mathcal{S}).
\tag{1.1.3}
\]

Given any initial distribution μ, one can then construct a probability measure P_μ on (Ω_0 = S^{[0,∞)}, ℱ_0 = 𝒮^{⊗[0,∞)}) as follows. Note that S^{[0,∞)} is the set of all functions on [0, ∞) into S, and 𝒮^{⊗[0,∞)} is the product σ-field on S^{[0,∞)} generated by the coordinate projections X_t(ω) = ω(t), ω ∈ Ω_0. Define P_μ on measurable cylinders of the form A = {ω ∈ Ω_0 : ω(t_i) ∈ B_i, i = 0, 1, ..., n}, B_i ∈ 𝒮 (0 ≤ i ≤ n), t_0 = 0 < t_1 < t_2 < ⋯ < t_n, by

\[
P_\mu(A) = \int_{B_0}\int_{B_1}\cdots\int_{B_{n-2}}\int_{B_{n-1}}
  p(t_n - t_{n-1}; x_{n-1}, B_n)\, p(t_{n-1} - t_{n-2}; x_{n-2}, dx_{n-1})
  \cdots p(t_1; x_0, dx_1)\, \mu(dx_0),
\tag{1.1.4}
\]

obtained by iterated integration.
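A familiar concrete case (stated here only for orientation; diffusions are taken up in Sections 1.5 and 1.6): for Brownian motion on the real line the transition probability is the Gaussian kernel, and the Chapman-Kolmogorov equation (1.1.3) is simply the fact that convolving centered normal laws adds their variances:

\[
p(t; x, dy) = \frac{1}{\sqrt{2\pi t}}\, e^{-(y-x)^2/(2t)}\, dy,
\qquad
\int_{\mathbb{R}} p(s; z, B)\, p(t; x, dz) = p(s + t; x, B).
\]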
In the case of a metric space (S, ρ) it is generally advantageous to define such a Markov process to have some path regularity. The following results are due to Dynkin [4], pp. 91-97, and Snell [5]. First, on a metric space S define the transition probability p(t; x, dy), or a Markov process with this transition probability, to have the Feller property if x → p(t; x, dy) is weakly continuous, i.e., for every t > 0

\[
(T_t f)(x) := \int_S f(y)\, p(t; x, dy)
\tag{1.1.5}
\]

is a continuous function of x for every bounded continuous f.
Theorem 1.1.1 (Dynkin-Snell) Let (S, ρ) be a metric space, and p(t; x, dy) a transition probability on (S, ℬ(S)).

(a) If

\[
\lim_{t \downarrow 0} \frac{1}{t} \sup_{x \in S} p\bigl(t; x, S \setminus B_\varepsilon(x)\bigr) = 0
\quad \forall\, \varepsilon > 0
\qquad \bigl(B_\varepsilon(x) := \{y : \rho(y, x) < \varepsilon\}\bigr),
\tag{1.1.6}
\]

then, for every initial distribution μ, there exists a probability space (Ω, ℱ, P_μ) on which is defined a Markov process {X_t : t ≥ 0} with continuous sample paths and transition probability p(t; x, dy), so that (1.1.4) holds for A = {X_{t_i} ∈ B_i for i = 0, 1, ..., n}, 0 < t_1 < t_2 < ⋯ < t_n, B_i ∈ 𝒮 = ℬ(S) (i = 0, 1, ..., n).

(b) If, instead of (1.1.6), one has

\[
\lim_{t \downarrow 0} \sup_{x \in S} p\bigl(t; x, S \setminus B_\varepsilon(x)\bigr) = 0
\quad \forall\, \varepsilon > 0,
\tag{1.1.7}
\]

then one may find a probability space (Ω, ℱ, P_μ) on which is defined a Markov process {X_t : t ≥ 0} which has right-continuous sample paths with left limits, having the transition probability p(t; x, dy) and initial distribution μ.
For right-continuous Markov processes with the Feller property, the strong Markov property holds. To be precise, let {ℱ_t : t ≥ 0} be a family of increasing sub-σ-fields of ℱ such that X_t is ℱ_t-measurable, and the conditional distribution of X_t^+ := {X_{t+s} : s ≥ 0} given ℱ_t is P_{X_t} (t ≥ 0). Let τ : Ω → [0, ∞] be an {ℱ_t}-stopping time, i.e., {τ ≤ t} ∈ ℱ_t for every t ≥ 0, and define the pre-τ σ-field ℱ_τ := {A ∈ ℱ : A ∩ {τ ≤ t} ∈ ℱ_t ∀ t ≥ 0}. Then the strong Markov property requires that the conditional distribution of X_τ^+ := {X_{τ+s} : s ≥ 0} given ℱ_τ is P_{X_τ}, on the set {τ < ∞}.
It may be noted that, unlike the discrete parameter case, the transition probability p(t; x, dy) needed to construct a continuous parameter Markov process must be given for all t at least in a small time interval (0, δ], δ > 0. One may then construct p(t; x, dy) for all t > 0 by the Chapman-Kolmogorov equation (1.1.3). Thus, except in special cases such as processes with independent increments, continuous parameter transition probabilities and the corresponding Markov processes are constructed from infinitesimal characteristics. For jump Markov chains these characteristics are the infinitesimal transition rates q_{ij} := lim_{t↓0} (1/t) p(t; i, j) (i ≠ j). More generally, one specifies the infinitesimal generator

\[
(Af)(x) := \lim_{t \downarrow 0} \frac{(T_t f)(x) - f(x)}{t}
\tag{1.1.8}
\]

for a suitable class of functions f. In the case of diffusion on ℝ^k, A is a second order elliptic operator of the form

\[
(Af)(x) = \frac{1}{2} \sum_{r, r' = 1}^{k} a_{r r'}(x)
  \frac{\partial^2 f(x)}{\partial x_r\, \partial x_{r'}}
  + \sum_{r = 1}^{k} b_r(x) \frac{\partial f(x)}{\partial x_r},
\tag{1.1.9}
\]

where b(x) is the so-called drift velocity, and a(x) the diffusion matrix, of the process {X_t : t ≥ 0}.
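For instance (a standard special case added for illustration, with k = 1): Brownian motion on ℝ with constant drift b and variance parameter σ² > 0 has

\[
p(t; x, dy) = \frac{1}{\sqrt{2\pi \sigma^2 t}}
  \exp\!\Bigl(-\frac{(y - x - bt)^2}{2\sigma^2 t}\Bigr) dy,
\qquad
(Af)(x) = \frac{\sigma^2}{2} f''(x) + b\, f'(x),
\]

the generator being obtained by applying (1.1.8) to smooth f with bounded derivatives and expanding f(y) about x.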
Finally, for a continuous parameter Markov process {X_t : t ≥ 0} an invariant (initial) distribution π, if it exists, satisfies

\[
\int_S p(t; x, B)\, \pi(dx) = \pi(B) \qquad \forall\, t > 0,\ B \in \mathcal{S}.
\]

Under such an initial distribution π, the process {X_t : t ≥ 0} is stationary, i.e., the distribution of X_t^+ := {X_{t+s} : s ≥ 0} is the same as that of {X_s : s ≥ 0}, namely P_π, for all t ≥ 0.
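A simple illustration (an added example; it is not discussed further in this introduction): the Ornstein-Uhlenbeck process on ℝ, with generator (Af)(x) = (σ²/2) f''(x) - γ x f'(x) for a relaxation rate γ > 0, has the Gaussian invariant distribution

\[
\pi(dx) = \sqrt{\frac{\gamma}{\pi \sigma^2}}\; e^{-\gamma x^2 / \sigma^2}\, dx
        = N\!\Bigl(0, \tfrac{\sigma^2}{2\gamma}\Bigr)(dx),
\]

as one checks by verifying that ∫ (Af)(x) π(dx) = 0 for all smooth, rapidly decreasing f.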