
Practical Handbook on
IMAGE PROCESSING for
SCIENTIFIC and
TECHNICAL APPLICATIONS
SECOND EDITION
CRC PRESS
Boca Raton London New York Washington, D.C.
Practical Handbook on
IMAGE PROCESSING for
SCIENTIFIC and
TECHNICAL APPLICATIONS
Bernd Jähne
University of Heidelberg
SECOND EDITION
This book contains information obtained from authentic and highly regarded sources. Reprinted material is quoted with
permission, and sources are indicated. A wide variety of references are listed. Reasonable efforts have been made to publish
reliable data and information, but the author and the publisher cannot assume responsibility for the validity of all materials
or for the consequences of their use.
Neither this book nor any part may be reproduced or transmitted in any form or by any means, electronic or mechanical,
including photocopying, microfilming, and recording, or by any information storage or retrieval system, without prior
permission in writing from the publisher.
The consent of CRC Press LLC does not extend to copying for general distribution, for promotion, for creating new works,
or for resale. Specific permission must be obtained in writing from CRC Press LLC for such copying.
Direct all inquiries to CRC Press LLC, 2000 N.W. Corporate Blvd., Boca Raton, Florida 33431.
Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for
identification and explanation, without intent to infringe.
© 2004 by CRC Press LLC
No claim to original U.S. Government works
International Standard Book Number 0-8493-1900-5


Library of Congress Card Number 2004043570
Printed in the United States of America 1 2 3 4 5 6 7 8 9 0
Printed on acid-free paper
Library of Congress Cataloging-in-Publication Data
Jähne, Bernd, 1953-
Practical handbook on image processing for scientific and technical applications / Bernd Jähne.— 2nd ed.
p. cm.
Includes bibliographical references and index.
ISBN 0-8493-1900-5 (alk. paper)
1. Image processing—Digital techniques—Handbooks, manuals, etc. I. Title.
TA1637.J347 2004
621.36'7—dc22 2004043570
Visit the CRC Press Web site at www.crcpress.com
Preface
What This Handbook Is About
Digital image processing is a fascinating subject in several aspects. Human beings
perceive most of the information about their environment through their visual sense.
While for a long time images could only be captured by photography, we are now at
the edge of another technological revolution that allows image data to be captured,
manipulated, and evaluated electronically with computers.
With breathtaking pace, computers are becoming more powerful and at the same
time less expensive. Thus, the hardware required for digital image processing is readily
available. In this way, image processing is becoming a common tool to analyze multidi-
mensional scientific data in all areas of natural science. For more and more scientists,
digital image processing will be the key to studying complex scientific problems they
could not have dreamed of tackling only a few years ago. A door is opening for new
interdisciplinary cooperation merging computer science with corresponding research
areas. Thus, there is a need for many students, engineers, and researchers in natural
and technical disciplines to learn more about digital image processing.

Since original image processing literature is spread over many disciplines, it is hard
to gather this information. Furthermore, it is important to realize that image process-
ing has matured in many areas from ad hoc, empirical approaches to a sound science
based on well-established principles in mathematics and physical sciences.
This handbook tries to close this gap by providing the reader with a sound basic
knowledge of image processing, an up-to-date overview of advanced concepts, and a
critically evaluated collection of the best algorithms, demonstrated with real-world
applications. Furthermore, the handbook is augmented with usually hard-to-find prac-
tical tips that will help to avoid common errors and save valuable research time. The
wealth of well-organized knowledge collected in this handbook will inspire the reader
to discover the power of image processing and to apply it adequately and successfully
to his or her research area. However, the reader will not be overwhelmed by a mere
collection of all available methods and techniques. Only a carefully and critically eval-
uated selection of techniques that have been proven to solve real-world problems is
presented.
Many concepts and mathematical tools, which find widespread application in nat-
ural sciences, are also applied to digital image processing. Such analogies are pointed
out because they provide an easy access to many complex problems in digital image
processing for readers with a general background in natural sciences. The author —
himself educated in physics and computer science — merges basic research in digital
image processing with key applications in various disciplines.
This handbook covers all aspects of image processing from image formation to im-
age analysis. Volumetric images and image sequences are treated as a natural extension
of image processing techniques from two to higher dimensions.
Prerequisites
It is assumed that the reader is familiar with elementary matrix algebra as well as the
Fourier transform. Wherever possible, mathematical topics are described intuitively,

making use of the fact that image processing is an ideal subject to illustrate even
complex mathematical relations. Appendix B outlines linear algebra and the Fourier
transform to the extent required to understand this handbook. This appendix also serves
as a convenient reference to these mathematical topics.
How to Use This Handbook
This handbook is organized by the tasks required to acquire images and to analyze
them. Thus, the reader is guided in an intuitive way and step by step through the chain
of tasks. The structure of most chapters is as follows:
1. A summary page highlighting the major topics discussed in the chapter.
2. Description of the tasks from the perspective of the application, specifying and
detailing what functions the specific image processing task performs.
3. Outline of concepts and theoretical background to the extent that is required to
fully understand the task.
4. Collection of carefully evaluated procedures including illustration of the theoretical
performance with test images, annotated algorithms, and demonstration with real-
world applications.
5. Ready-to-use reference data, practical tips, references to advanced topics, emerg-
ing new developments, and additional reference material. This reference material
is divided into small units, consecutively numbered within each chapter with boxed
numbers, e.g., 3.1. A reference item is referred to by this number in the following
style: 3.1 and 3.3.
Exceptions from this organization are only the two introductory Chapters 1 and 2. The
individual chapters are written as much as possible in an internally consistent way.
Another key to the usage of the handbook is the detailed indices and the glossary.
The glossary is unique in the sense that it covers not only image processing in a narrow
sense but all important associated topics: optics, photonics, some important general
terms in computer science, photogrammetry, mathematical terms of relevance, and
terms from important applications of image processing. The glossary contains a brief
definition of terms used in image processing with cross-references to find further infor-
mation in the main text of the handbook. Thus, you can take the glossary as a starting
point for a search on a specific item. All terms contained in the indices are emphasized
by typesetting in italic style.
Acknowledgments
Many of the examples shown in this handbook are taken from my research at Scripps
Institution of Oceanography (University of California, San Diego) and at the Institute
for Environmental Physics and the Interdisciplinary Center for Scientific Computing
(University of Heidelberg). I gratefully acknowledge financial support for this research
from the US National Science Foundation (OCE91-15944-02, OCE92-17002, and OCE94-
09182), the US Office of Naval Research (N00014-93-J-0093, N00014-94-1-0050), and
the German Science Foundation, especially through the interdisciplinary research unit
FOR240 “Image Sequence Analysis to Study Dynamical Processes”. I cordially thank
my colleague F. Hamprecht. He contributed the last chapter about classification (Chapter 17) to this handbook.
I would also like to express my sincere thanks to the staff of CRC Press for their constant
interest in this handbook and their professional advice. I am most grateful for the
invaluable help of my friends at AEON Verlag & Studio in proofreading, maintaining
the databases, and in designing most of the drawings.
I am also grateful to the many individuals, organizations, and companies that pro-
vided valuable material for this handbook:
• Many of my colleagues — too many to be named individually here — who worked
together with me during the past seven years within the research unit “Image Se-
quence Analysis to Study Dynamical Processes” at Heidelberg University
• Dr. M. Bock, DKFZ Heidelberg
• Dr. J. Klinke, PORD, Scripps Institution of Oceanography, University of California,
San Diego
• Prof. H.-G. Maas, Institute of Photogrammetry and Remote Sensing, University of
Dresden
• Prof. J. Ohser, FH Darmstadt

• Dr. T. Scheuermann, Fraunhofer Institute for Chemical Technology, Pfinztal, Ger-
many
• Prof. Trümper, Max Planck Institute for Extraterrestrial Physics, Munich
• ELTEC Elektronik GmbH, Mainz, Germany
• Dr. Klee, Hoechst AG, Frankfurt, Germany
• Optische Werke G. Rodenstock, Precision Optics Division, D-80469 Munich
• Prof. J. Weickert, University of Saarbrücken, Germany
• Zeiss Jena GmbH, Jena, Germany
• Dr. G. Zinser, Heidelberg Engineering, Heidelberg, Germany
The detailed description of imaging sensors in Chapter 5 is based on an extensive
camera test program. I am grateful to the manufacturers and distributors who provided
cameras at no cost: Adimec, Allied Vision, Basler Vision Technologies, IDS, PCO, Pulnix,
and Stemmer Imaging (Dalsa, Jai).
Most examples contained in this handbook have been processed using heurisko®,
a versatile and powerful image processing package. heurisko® has been developed by
AEON Verlag & Studio, Hanau, Germany, in cooperation with the author.
In a rapidly progressing field such as digital image processing, a major work like
this handbook is never finished or completed. Therefore, any comments on further
improvements or additions to the handbook are very welcome. I am also grateful for
hints on errors, omissions, or typing errors, which despite all the care taken may have
slipped my attention.
Heidelberg, Germany, January 2004 Bernd Jähne
Contents

1 Introduction 1
1.1 Highlights 1
1.2 From Drawings to Electronic Images 2
1.3 Geometric Measurements: Gauging and Counting 3
1.3.1 Size Distribution of Pigment Particles 4
1.3.2 Gas Bubble Size Distributions 4
1.3.3 In Situ Microscopy of Cells in Bioreactors 6
1.4 Radiometric Measurements: Revealing the Invisible 8
1.4.1 Fluorescence Measurements of Concentration Fields 8
1.4.2 Thermography for Botany 11
1.4.3 Imaging of Short Ocean Wind Waves 12
1.4.4 SAR Imaging for Planetology and Earth Sciences 15
1.4.5 X-Ray Astronomy with ROSAT 19
1.4.6 Spectroscopic Imaging for Atmospheric Sciences 19
1.5 Depth Measurements: Exploring 3-D Space 21
1.5.1 Optical Surface Profiling 21
1.5.2 3-D Retina Imaging 24
1.5.3 Distribution of Chromosomes in Cell Nuclei 25
1.5.4 X-Ray and Magnetic Resonance 3-D Imaging 25
1.6 Velocity Measurements: Exploring Dynamic Processes 27
1.6.1 Particle Tracking Velocimetry 27
1.6.2 3-D Flow Tomography 28
1.6.3 Motor Proteins 30
2 Tasks and Tools 33
2.1 Highlights 33
2.2 Basic Concepts 34
2.2.1 Goals for Applications of Image Processing 34
2.2.2 Measuring versus Recognizing 36
2.2.3 Signals and Uncertainty 38
2.2.4 Representation and Algorithms 39

2.2.5 Models 41
2.2.6 Hierarchy of Image Processing Tasks 42
2.3 Tools 45
2.3.1 Overview 45
2.3.2 Camera and Frame Grabber 45
2.3.3 Computer 46
2.3.4 Software and Algorithms 50
I From Objects to Images
3 Quantitative Visualization 55
3.1 Highlights 55
3.2 Task 56
3.3 Concepts 58
3.3.1 Electromagnetic Waves 58
3.3.2 Particle Radiation 63
3.3.3 Acoustic Waves 64
3.3.4 Radiometric Terms 64
3.3.5 Photometric Terms 67
3.3.6 Surface-Related Interactions of Radiation with Matter 70
3.3.7 Volume-Related Interactions of Radiation with Matter 76
3.4 Procedures 82
3.4.1 Introduction 82
3.4.2 Types of Illumination 82
3.4.3 Illumination Techniques for Geometric Measurements 84
3.4.4 Illumination Techniques for Depth Measurements 86
3.4.5 Illumination Techniques for Surface Slope Measurements 88
3.4.6 Color and Multi-Spectral Imaging 96
3.4.7 Human Color Vision 100

3.4.8 Thermal Imaging 103
3.4.9 Imaging of Chemical Species and Material Properties 106
3.5 Advanced Reference Material 108
3.5.1 Classification of Radiation 108
3.5.2 Radiation Sources 110
3.5.3 Human Vision 113
3.5.4 Selected Optical Properties 114
3.5.5 Further References 116
4 Image Formation 119
4.1 Highlights 119
4.2 Task 120
4.3 Concepts 122
4.3.1 Coordinate Systems 122
4.3.2 Geometrical Optics 125
4.3.3 Wave Optics 137
4.3.4 Radiometry of Imaging 140
4.3.5 Linear System Theory 143
4.4 Procedures 147
4.4.1 Geometry of Imaging 147
4.4.2 Stereo Imaging 154
4.4.3 Confocal Laser Scanning Microscopy 159
4.4.4 Tomography 161
4.5 Advanced Reference Material 163
4.5.1 Data of Optical Systems for CCD Imaging 163
4.5.2 Optical Design 166
4.5.3 Further References 166
5 Imaging Sensors 169
5.1 Highlights 169

5.2 Task 169
5.3 Concepts 170
5.3.1 Overview 170
5.3.2 Detector Performance 171
5.3.3 Quantum Detectors 176
5.3.4 Thermal Detectors 176
5.3.5 Imaging Detectors 177
5.3.6 Television Video Standards 180
5.3.7 CCD Sensor Architectures 181
5.4 Procedures 185
5.4.1 Measuring Performance Parameters of Imaging Sensors 185
5.4.2 Sensor and Camera Selection 189
5.4.3 Spectral Sensitivity 191
5.4.4 Artifacts and Operation Errors 192
5.5 Advanced Reference Material 197
5.5.1 Basic Properties of Imaging Sensors 197
5.5.2 Standard Video Signals; Timing and Signal Forms 199
5.5.3 Color Video Signals 201
5.5.4 Cameras and Connectors 204
5.5.5 Further References 205
6 Digitalization and Quantization 207
6.1 Highlights 207
6.2 Task 207
6.3 Concepts 208
6.3.1 Digital Images 208
6.3.2 The Sampling Theorem 213
6.3.3 Sampling Theorem in xt Space 217
6.3.4 Reconstruction from Sampling 218
6.3.5 Sampling and Subpixel Accurate Gauging 220
6.3.6 Quantization 221

6.4 Procedures 226
6.4.1 The Transfer Function of an Image Acquisition System 226
6.4.2 Quality Control of Quantization 228
6.5 Advanced Reference Material 230
6.5.1 Evolution of Image Acquisition Hardware 230
6.5.2 Analog Video Input 232
6.5.3 Digital Video Input 234
6.5.4 Real-Time Image Processing 236
6.5.5 Further References 238
II Handling and Enhancing Images
7 Pixels 241
7.1 Highlights 241
7.2 Task 242
7.3 Concepts 243
7.3.1 Random Variables and Probability Density Functions 243
7.3.2 Functions of Random Variables 246
7.3.3 Multiple Random Variables and Error Propagation 247
7.3.4 Homogeneous Point Operations 251
7.3.5 Inhomogeneous Point Operations 252
7.3.6 Point Operations with Multichannel Images 253
7.4 Procedures 255
7.4.1 Gray Value Evaluation and Interactive Manipulation 255
7.4.2 Correction of Inhomogeneous Illumination 259
7.4.3 Radiometric Calibration 262
7.4.4 Noise Variance Equalization 263
7.4.5 Histogram Equalization 264
7.4.6 Noise Reduction by Image Averaging 265
7.4.7 Windowing 266

7.5 Advanced Reference Material 267
8 Geometry 269
8.1 Highlights 269
8.2 Task 270
8.3 Concepts 271
8.3.1 Geometric Transformations 271
8.3.2 Interpolation 274
8.4 Procedures 285
8.4.1 Scaling 286
8.4.2 Translation 288
8.4.3 Rotation 288
8.4.4 Affine and Perspective Transforms 290
8.5 Advanced Reference Material 291
9 Restoration and Reconstruction 293
9.1 Highlights 293
9.2 Task 294
9.3 Concepts 294
9.3.1 Types of Image Distortions 294
9.3.2 Defocusing and Lens Aberrations 296
9.3.3 Velocity Smearing 297
9.3.4 Inverse Filtering 297
9.3.5 Model-based Restoration 299
9.3.6 Radon Transform and Fourier Slice Theorem 300
9.4 Procedures 302
9.4.1 Reconstruction of Depth Maps from Focus Series 302
9.4.2 3-D Reconstruction by Inverse Filtering 304
9.4.3 Filtered Backprojection 308
9.5 Advanced Reference Material 311
III From Images to Features
10 Neighborhoods 315

10.1 Highlights 315
10.2 Task 316
10.3 Concepts 317
10.3.1 Masks 317
10.3.2 Operators 319
10.3.3 Convolution 319
10.3.4 Point Spread Function 321
10.3.5 Transfer Function 323
10.3.6 General Properties of Convolution Operators 325
10.3.7 Error Propagation with Filtering 329
10.3.8 Recursive Convolution 331
10.3.9 Rank-Value Filters 337
10.3.10 Strategies for Adaptive Filtering 337
10.4 Procedures 340
10.4.1 Filter Design Criteria 340
10.4.2 Filter Design by Windowing 341
10.4.3 Recursive Filters for Image Processing 344
10.4.4 Design by Filter Cascading 345
10.4.5 Efficient Computation of Neighborhood Operations 347
10.4.6 Filtering at Image Borders 350
10.4.7 Test Patterns 352
10.5 Advanced Reference Material 353
11 Regions 355
11.1 Highlights 355
11.2 Task 356
11.3 Concepts 358
11.3.1 General Properties of Averaging Filters 358
11.3.2 Weighted Averaging 362

11.3.3 Controlled Averaging 362
11.3.4 Steerable Averaging 365
11.3.5 Averaging in Multichannel Images 366
11.4 Procedures 368
11.4.1 Box Filters 368
11.4.2 Binomial Filters 374
11.4.3 Cascaded Multistep Filters 377
11.4.4 Cascaded Multigrid Filters 380
11.4.5 Recursive Smoothing 380
11.4.6 Inhomogeneous and Anisotropic Diffusion 381
11.4.7 Steerable Directional Smoothing 384
11.5 Advanced Reference Material 387
12 Edges and Lines 391
12.1 Highlights 391
12.2 Task 391
12.3 Concepts 392
12.3.1 Edge Models 392
12.3.2 Principal Methods for Edge Detection 394
12.3.3 General Properties 397
12.3.4 Edges in Multichannel Images 399
12.3.5 Regularized Edge Detection 401
12.4 Procedures 403
12.4.1 First-Order Derivation 403
12.4.2 Second-Order Derivation 410
12.4.3 Regularized Edge Detectors 413
12.4.4 LoG and DoG Filter 415
12.4.5 Optimized Regularized Edge Detectors 416
12.5 Advanced Reference Material 417

13 Orientation and Velocity 419
13.1 Highlights 419
13.2 Task 420
13.3 Concepts 421
13.3.1 Simple Neighborhoods 421
13.3.2 Classification of Local Structures 425
13.3.3 First-Order Tensor Representation 428
13.4 Procedures 430
13.4.1 Set of Directional Quadrature Filters 430
13.4.2 2-D Tensor Method 433
13.4.3 Motion Analysis in Space-Time Images 439
13.5 Advanced Reference Material 442
14 Scale and Texture 443
14.1 Highlights 443
14.2 Task 444
14.3 Concepts 446
14.3.1 What Is Texture? 446
14.3.2 The Wave Number Domain 450
14.3.3 Hierarchy of Scales 451
14.3.4 Gaussian Pyramid 454
14.3.5 Laplacian Pyramid 457
14.3.6 Directio-Pyramidal Decomposition 459
14.3.7 Phase and Local Wave Number 460
14.4 Procedures 465
14.4.1 Texture Energy 465
14.4.2 Phase Determination 467
14.4.3 Local Wave Number 469
14.5 Advanced Reference Material 472
IV From Features to Objects
15 Segmentation 475

15.1 Highlights 475
15.2 Task 475
15.3 Concepts 476
15.3.1 Pixel-Based Segmentation 476
15.3.2 Region-Based Segmentation 477
15.3.3 Edge-Based Segmentation 478
15.3.4 Model-Based Segmentation 478
15.4 Procedures 480
15.4.1 Global Thresholding 480
15.4.2 Pyramid Linking 481
15.4.3 Orientation-Based Fast Hough Transformation 484
15.5 Advanced Reference Material 485
16 Size and Shape 487
16.1 Highlights 487
16.2 Task 487
16.3 Concepts 488
16.3.1 Morphological Operators 488
16.3.2 Run-Length Code 493
16.3.3 Chain Code 494
16.3.4 Fourier Descriptors 496
16.3.5 Moments 499
16.4 Procedures 501
16.4.1 Object Shape Manipulation 501
16.4.2 Extraction of Object Boundaries 503
16.4.3 Basic Shape Parameters 504
16.4.4 Scale and Rotation Invariant Shape Parameters 506
16.5 Advanced Reference Material 507
17 Classification 509

17.1 Highlights 509
17.2 Task 509
17.3 Concepts 510
17.3.1 Statistical Decision Theory 510
17.3.2 Model Optimization and Validation 511
17.4 Procedures 513
17.4.1 Linear Discriminant Analysis (LDA) 513
17.4.2 Quadratic Discriminant Analysis (QDA) 516
17.4.3 k-Nearest Neighbors (k-NN) 517
17.4.4 Cross-Validation 518
17.5 Advanced Reference Material 519
V Appendices
A Notation 523
A.1 General 523
A.2 Image Operators 524
A.3 Alphabetical List of Symbols and Constants 525
B Mathematical Toolbox 529
B.1 Matrix Algebra 529
B.1.1 Vectors and Matrices 529
B.1.2 Operations with Vectors and Matrices 529
B.1.3 Types of Matrices 530
B.2 Least-Squares Solution of Linear Equation Systems 530
B.3 Fourier Transform 532
B.3.1 Definition 532
B.3.2 Properties of the Fourier Transform 533
B.3.3 Important Fourier Transform Pairs 534
B.4 Discrete Fourier Transform (DFT) 534
B.4.1 Definition 534
B.4.2 Important Properties 535
B.4.3 Important Transform Pairs 535

B.5 Suggested Further Readings 536
C Glossary 537
Bibliography 569
D Color Plates 585
1 Introduction
1.1 Highlights
Electronic imaging and digital image processing constitute — after the invention
of photography — the second revolution in the use of images in science and engi-
neering (Section 1.2). Because of its inherently interdisciplinary nature, image
processing has become a major integrating factor stimulating communication
throughout engineering and natural sciences.
For technical and scientific applications, a wide range of quantities can be imaged
and become accessible for spatial measurements. Examples show how appro-
priate optical setups combined with image processing techniques provide novel
measuring techniques including:
Geometric measurements (Section 1.3):
• size distribution of pigment particles and bubbles (Sections 1.3.1 and 1.3.2)
• counting and gauging of cells in bioreactors (Section 1.3.3)
Radiometric measurements (Section 1.4)
• spatio-temporal concentration fields of chemical species (Section 1.4.1)
• surface temperature of plant leaves and tumors (Section 1.4.2)
• slope measurements of short ocean wind waves (Section 1.4.3)
• radar imaging in Earth Sciences (Section 1.4.4)
• X-ray satellite astronomy (Section 1.4.5)
• spectroscopic imaging in atmospheric sciences (Section 1.4.6)
Three-dimensional measurements from volumetric images (Section 1.5)
• surface topography measurements of press forms and the human retina (Sec-
tions 1.5.1 and 1.5.2)
• 3-D microscopy of cell nuclei (Section 1.5.3)

• X-ray and magnetic resonance 3-D imaging (Section 1.5.4)
Velocity measurements from image sequences (Section 1.6)
• particle tracking velocimetry for 2-D flow measurements (Section 1.6.1)
• flow tomography for 3-D flow measurements (Section 1.6.2)
• study of motor proteins (Section 1.6.3)
Figure 1.1: From the beginning of science, researchers tried to capture their observations by
drawings. a With this famous sketch, Leonardo da Vinci [1452–1519] described turbulent flow of
water. b In 1613, Galileo Galilei — at the same time as others — discovered the sun spots. His
careful observations of the motion of the spots over an extended period led him to the conclusion
that the sun is rotating around its axis.
1.2 From Drawings to Electronic Images
From the beginning, scientists tried to record their observations in pictures. In the
early times, they could do this only in the form of drawings. Especially remarkable
examples are from Leonardo da Vinci (Fig. 1.1a). He was the primary empiricist of
visual observation for his time. Saper vedere (knowing how to see) became the great
theme of his many-sided scientific studies. Leonardo da Vinci gave absolute precedence
to the illustration over the written word. He didn’t use the drawing to illustrate the
text; rather, he used the text to explain the picture.
Even now, illustrations are widely used to explain complex scientific phenomena
and they still play an important role in descriptive sciences. However, any visual ob-
servation is limited to the capabilities of the human eye.
The invention of photography triggered the first revolution in the use of images
for science. The daguerreotype process invented by the French painter Jacques Daguerre
in 1839 became the first commercially utilized photographic process. Now, it was
possible to record images in an objective way. Photography tremendously extended

the possibilities of visual observation. Using flash light for illumination, phenomena
could be captured that were too fast to be recognized by the eye. It was soon observed
that photographic plates are also sensitive to non-visible radiation such as ultraviolet
light and electrons. Photography played an important role in the discovery of X-rays
by Wilhelm Conrad Röntgen in 1895 and led to its widespread application in medicine
and engineering.
However, the cumbersome manual evaluation of photographic plates restricted the
quantitative analysis of images to a few special areas. In astronomy, e. g., the position
and brightness of stars are measured from photographic plates. Photogrammetrists
generate maps from aerial photographs. Beyond such special applications, images
have mostly been used in science for qualitative observations and documentation of
experimental results.

Figure 1.2: Sciences related to image processing.
Now we are experiencing the second revolution of scientific imaging. Images can be
converted to electronic form, i. e., digital images, that are analyzed quantitatively using
computers to perform exact measurements and to visualize complex new phenomena.
This second revolution is more than a mere improvement in techniques. Tech-
niques analogous to the most powerful human sense, the visual system, are used to
get an insight into complex scientific phenomena. The successful application of image
processing techniques requires an interdisciplinary approach. All kinds of radiation
and interactions of radiation with matter can be used to make certain properties of
objects visible. Techniques from computer science and mathematics are required to
perform quantitative analyses of the images. Therefore, image processing merges a
surprisingly wide range of sciences (Fig. 1.2): mathematics, computer science, electri-
cal and mechanical engineering, physics, photogrammetry and geodesy, and biological
sciences. Since virtually no natural science or engineering field is excluded from applications
of image processing, it has become a major integrating factor. In this way, it
triggers new connections between research areas. Therefore, image processing serves

as an integrating factor and helps to reverse the increasing specialization of science.
This section introduces typical scientific and technical applications of image process-
ing. The idea is to make the reader aware of the enormous possibilities of modern vi-
sualization techniques and digital image processing. We will show how new insight is
gained into scientific or technical problems by using advanced visualization techniques
and digital image processing techniques to extract and quantify relevant parameters.
We briefly outline the scientific issues, show the optical setup, give sample images,
describe the image processing techniques, and show results obtained by them.
1.3 Geometric Measurements: Gauging and Counting
Counting of particles and measuring their size distribution is a ubiquitous image
processing task. We will illustrate this type of technique with three examples that also
demonstrate that a detailed knowledge of the image formation process is required to
make accurate measurements.
Figure 1.3: Electron microscopy image of color pigment particles. The crystalline particles tend
to cluster; b and c are contour plots from the areas marked in a. Images courtesy of Dr. Klee,
Hoechst AG, Frankfurt.
1.3.1 Size Distribution of Pigment Particles
The quality and properties of paint are largely influenced by the size distribution of
the coloring pigment particles. Therefore, a careful control of the size distribution
is required. The particles are too small to be imaged with standard light microscopy.
Two transmission electron microscopy images are shown in Fig. 1.3. The images clearly
demonstrate the difficulty in counting and gauging these particles. While they separate
quite well from the background, they tend to form clusters (demonstrated especially
by Fig. 1.3b). Two clusters with overlapping particles are shown in Fig. 1.3b and c as
contour plots. Processing of these images therefore requires several steps (a minimal code sketch of steps 3 and 4 follows the list):

1. Identify non-separable clusters
2. Identify overlapping particles and separate them
3. Count the remaining particles and determine their size
4. Compute the size distribution from several images
5. Estimate the influence of the clusters and non-separable particles on size distribu-
tion
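The handbook's own examples are processed with the heurisko package (see the Preface). Purely as an illustration, the following Python/NumPy sketch shows how steps 3 and 4 could look for well-separated particles; the threshold, the minimum area, and the helper names are assumptions made for this sketch, and the cluster handling of steps 1 and 2 is omitted.

    # Hedged sketch of steps 3 and 4 only: count well-separated particles and pool
    # their sizes from several images. Not the algorithm used in the handbook.
    import numpy as np
    from scipy import ndimage

    def particle_radii(image, threshold, min_area=10):
        """Equivalent radius (in pixels) of every connected particle."""
        mask = image > threshold                  # assumes bright particles; invert otherwise
        labels, n = ndimage.label(mask)           # connected-component labeling
        areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
        areas = areas[areas >= min_area]          # discard spurious specks
        return np.sqrt(areas / np.pi)             # area -> equivalent circular radius

    def size_distribution(images, threshold, bins=32):
        """Pool radii from several images (step 4) into one histogram."""
        radii = np.concatenate([particle_radii(img, threshold) for img in images])
        return np.histogram(radii, bins=bins)

    # Tiny synthetic usage example; a real application would load EM images.
    rng = np.random.default_rng(0)
    img = rng.normal(0.1, 0.05, (256, 256))
    img[100:110, 100:110] += 1.0                  # one bright square "particle"
    print(particle_radii(img, threshold=0.5))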
1.3.2 Gas Bubble Size Distributions
Bubbles are submerged into the ocean by breaking waves and play an important role in
various small-scale air-sea interaction processes. They form an additional surface for
the exchange of climate-relevant trace gases between the atmosphere and the ocean,
are a main source for marine aerosols and acoustic noise, and are significant for the
dynamics of wave breaking. Bubbles also play an important role in chemical engineer-
ing. They form the surface for gas-liquid reactions in bubble columns where a gas is
bubbling through a column through which a liquid is also flowing.
Figure 1.4a shows an underwater photograph of the light-blocking technique used
to visualize air bubbles. The principle is illustrated in Fig. 1.4b. The bubbles are
observed as more or less blurred black ellipses (Fig. 1.5a).
Although this sounds like a very simple application, only careful consideration of
details led to a successful application and measurement of bubble size distributions.
It is obvious that with this technique the sampling volume is not well defined. This
problem could be resolved by measuring the degree of blurring, which is proportional
to the distance of the bubble from the focal plane [69, 70]. Such a type of technique is
called depth from focus, and only one image is required to measure the distance from
the focal plane. This approach has the additional benefit that the depth of the sampling
volume is proportional to the radius of the bubbles. In this way, larger bubbles that
occur much less frequently can be measured with significantly better counting statistics.
Figure 1.5b shows a sample of a size distribution measured in the large wind/wave flume
of Delft Hydraulics in The Netherlands.
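The bookkeeping behind this radius-proportional sampling volume can be written down in a few lines. The sketch below is an illustration only, not the algorithm of [69, 70]; the blur acceptance limit and the proportionality constant between bubble radius and sampling depth are hypothetical parameters.

    # Hedged sketch: turn accepted bubble radii into a size spectrum, normalizing
    # each radius bin by a sampling volume that grows linearly with the radius.
    import numpy as np

    def bubble_density(radii_mm, blur_mm, image_area_mm2, n_images,
                       k_depth=2.0, blur_limit=0.5):
        """Bubble density per radius interval and per volume."""
        radii_mm = np.asarray(radii_mm, dtype=float)
        blur_mm = np.asarray(blur_mm, dtype=float)
        accepted = radii_mm[blur_mm <= blur_limit]               # keep sufficiently sharp bubbles
        edges = np.logspace(np.log10(0.05), np.log10(2.0), 20)   # radius bins [mm]
        counts, edges = np.histogram(accepted, bins=edges)
        centers = 0.5 * (edges[:-1] + edges[1:])
        # Sampling depth is proportional to the radius, so large (rare) bubbles are
        # collected from a larger volume and gain counting statistics.
        volume_mm3 = image_area_mm2 * k_depth * centers * n_images
        return centers, counts / (volume_mm3 * np.diff(edges))

Calling the function with the measured radii and blur values of all detected bubbles yields a radius axis and densities of the kind plotted in Fig. 1.5b.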
Figure 1.4: a Underwater photograph of the optical setup for the measurements of bubble size
distributions. The light source (on the left) and the optical receiver with the CCD camera (on the
right) are mounted 5 to 20 cm below the water’s surface. The measuring volume is located in
the middle of the free optical path between source and receiver and has a cross-section of about
6 × 8 mm². b Visualization principle: bubbles scatter light away from the receiver for almost the
entire cross section and thus appear as black circles (Fig. 1.5).
Figure 1.5: a Four example images of bubbles that can be observed as more or less blurred black
ellipses; b size distribution of bubbles measured with the instrument shown in Fig. 1.4 at wind
speeds of 14 ms⁻¹ (black circles) and 11 ms⁻¹ (gray circles) and 5 cm below the mean water surface
in the wind/wave flume of Delft Hydraulics, The Netherlands. The results are compared with
earlier measurements in the Marseille wind/wave flume using a laser light scattering technique.
[Figure 1.5b plot: bubble density [µm⁻¹ m⁻³] versus radius [mm]; Marseille at 13 and 14 m/s, Delft at 11 and 14 m/s.]
[Figure 1.6b diagram labels: laser and laser control, beam expander with pinhole, micrometer screw, filter, intensified camera, monitor and image processing, reactor bottom, objective, beam splitter, culture medium, focal plane.]
Figure 1.6: a Experimental bioreactor at the University of Hannover used to test the new in
situ microscopy technique for cell counting. The laser can be seen in the lower right of the
image, the intensified camera at the lower left, below the bioreactor. b Sketch
of the optical setup for the in situ microscopy. The illumination with a pulsed nitrogen laser is
applied via a beam splitter through the objective of the microscope. c Example cell images, made
visible by excitation of the metabolism-related NADH/NADPH fluorescence with a nitrogen laser;
d improved by adaptive filtering. From Scholz [125].
1.3.3 In Situ Microscopy of Cells in Bioreactors
A technique similar to that used for the measurement of bubble size distributions can also be
used to measure the concentration of cells in bioreactors in situ. Conventional off-line
measuring techniques require taking samples to measure cell concentrations by stan-
dard counting techniques in the microscope. Flow cytometry has the disadvantage that
samples must be pumped out of the bioreactor, which makes the sterility of the whole
setup a much more difficult task. A new in-situ microscopy technique uses a standard
flange on the bioreactor to insert the microscope so that contamination is excluded.
Figure 1.6a, b shows the setup.
This technique that is suitable for an in situ control of fermentation processes has

been developed in cooperation between the ABB Research Center in Heidelberg, the
Institute for Technical Chemistry at the University of Hannover, and the Institute for
Environmental Physics at Heidelberg University [141].

[Figure 1.7 plots: a, b cell concentration [10⁶/ml] versus time [h] for the off-line and the depth-from-focus method; c mean cell size [µm²] versus time [h]; d mean intensity [gray value] versus time [h].]
Figure 1.7: Fermentation processes in a bioreactor as they can be observed with in situ mi-
croscopy. a Fed-batch culture (continuous supply of nutrients after the culture reaches the sta-
tionary phase). b Batch culture fermentation (no additional nutrition during the fermentation).

c Decrease of the mean cell size as observed in the fermentation shown in a. d Temporal change
of the mean intensity of the NADH/NADPH fluorescence as observed during the fermentation
shown in b. From Scholz [125].
The cells are made visible by stimulation of the
NADH/NADPH fluorescence using a nitrogen laser. In this way, only living cells are
measured and can easily be distinguished from dead ones and other particles. Unfortu-
nately, the NADH/NADPH fluorescence is very weak. Thus, an intensified CCD camera
is required and the cell images show a high noise level (Fig. 1.6c).
With this high noise level it is impossible to analyze the blurring of the cells for
a precise concentration determination. Therefore, first an adaptive smoothing of the
images is applied that significantly reduces the noise level but doesn’t change the steep-
ness of the edges (Fig. 1.6d). This image material is now suitable for determining the
degree of blurring, and thus the distance of the cell from the focal plane, using a
multigrid algorithm on a Laplacian pyramid (Section 14.3.5).
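Neither the adaptive smoothing nor the multigrid blur estimation of Scholz [125] is reproduced here. As a rough illustration of the pyramid idea referred to in Section 14.3.5, the following sketch builds a simple Laplacian pyramid and uses the ratio of fine- to coarse-band energy of a cell patch as a focus measure; the levels and the smoothing width are assumptions for this sketch.

    # Minimal Laplacian-pyramid sketch; an illustration only, not the algorithm
    # of Scholz [125].
    import numpy as np
    from scipy import ndimage

    def laplacian_pyramid(image, levels=3, sigma=1.0):
        """Band-pass levels plus a coarse residual; each level halves the resolution."""
        pyramid, current = [], image.astype(float)
        for _ in range(levels):
            smoothed = ndimage.gaussian_filter(current, sigma)
            pyramid.append(current - smoothed)    # band-pass (Laplacian) level
            current = smoothed[::2, ::2]          # downsample for the next level
        pyramid.append(current)                   # low-pass residual
        return pyramid

    def focus_measure(cell_patch):
        """Ratio of fine- to mid-scale band energy; it drops as the cell defocuses."""
        bands = laplacian_pyramid(cell_patch, levels=3)
        fine = np.mean(bands[0] ** 2)
        mid = np.mean(bands[2] ** 2) + 1e-12      # avoid division by zero
        return fine / mid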
The cell concentrations determined by the in situ microscopy compare well with
off-line cell counting techniques (Fig. 1.7a and b). The technique delivers further in-
formation about the cells. The mean cell size can be determined and shows a decrease
during the course of the fermentation process (Fig. 1.7c). One also gets a direct insight
into the metabolism. After an initial exponential increase of the cell concentration
during the first six hours of the fermentation (Fig. 1.7a), the growth stagnates because all
the glucose has been used up. At this point, the NADH/NADPH fluorescence decreases
suddenly (Fig. 1.7d), as it is an indicator of the metabolic activity. After a while, the
cells become adapted to the new environment and start burning up the alcohol. Growth
starts again and the fluorescence intensity comes back to the original high level.
[Figure 1.8 axes: time t [s]; depth z [mm]; the water surface is marked.]
Figure 1.8: A 5 second time series of vertical concentration profiles of gases dissolved in water right at the
interface as measured in the circular wind wave flume at the Institute for Environmental Physics
at Heidelberg University. Traces of HCl gas were injected into the air space of the facility. The
absorption of HCl at the water surface induces the transport of a fluorescent pH indicator across
the aqueous boundary layer giving rise to lower concentration levels of the fluorescent dye to-
wards the water surface. The wind speed was 2.4 m/s. The lower image shows the same sequence
of vertical profiles but in a coordinate system moving with the water surface.
1.4 Radiometric Measurements: Revealing the Invisible
The radiation received by an imaging sensor at the image plane from an object reveals
some of its features. While visual observation by the human eye is limited to the
portion of the electromagnetic spectrum called light, imaging sensors are now available
for almost any type of radiation. This opens up countless possibilities for measuring
object features of interest.

1.4.1 Fluorescence Measurements of Concentration Fields
Here we describe two imaging techniques where concentration fields of chemical species
are measured remotely by using fluorescence. The first example concerns the exchange
of gases between the atmosphere and the ocean, the second the metabolism of en-
dolithic cells, which live within the skeleton of corals.
The exchange of climate-relevant gases such as CO₂, methane, or fluorocarbons
between the atmosphere and the ocean is controlled by a thin layer at the water
surface, the so-called aqueous viscous boundary layer. This layer is only 30 to 300 µm
thick and dominated by molecular diffusion. Its thickness is determined by the degree
of turbulence at the ocean interface.
Figure 1.9: Comparison of the mean concentration profiles computed from sequences of profiles
as shown in Fig. 1.8. The measured profile is compared with various theoretical predictions. From [105].
A new visualization technique was developed to measure the concentration of gases
dissolved in this layer with high spatial and temporal resolution. The technique uses a
chemically reactive gas (such as HCl or NH₃) and laser-induced fluorescence (LIF) [67].
Although the technique is quite complex in detail, its result is simple. The measured
fluorescence intensity is proportional to the concentration of a gas dissolved in water.
Figure 1.8 shows time series of vertical concentration profiles. The thin either bright
(NH₃) or dark (HCl) layer indicates the water surface undulated by waves. Because of
the total reflection at the water surface, the concentration profiles are seen twice. First
directly below the water surface and second as a distorted mirror image apparently

above the water surface. Image processing techniques are used to search the extreme
values in order to detect the water surface. Then, the concentration profiles can directly
be drawn as a function of the distance to the water surface (Fig. 1.8). Now, it can clearly
be seen that the boundary layer thickness shows considerable fluctuations and that
part of it is detached and transported down into the water bulk. For the first time,
a technique is available that gives direct insight into the mechanisms of air-sea gas
transfer. Mean profiles obtained by averaging (Fig. 1.9) can directly be compared with
profiles computed with various models.
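The two processing steps mentioned here, detecting the surface as an intensity extremum and re-drawing the profiles relative to it, can be sketched as follows. The array layout (rows are depth, columns are time) and the choice between a bright and a dark surface layer are assumptions for this illustration, not details taken from [67].

    # Hedged sketch: per-column extreme-value search for the water surface, then
    # resampling each column so that depth is counted from the detected surface.
    import numpy as np

    def detect_surface(profile_image, bright=True):
        """Row index of the surface in every column (time step)."""
        return (np.argmax if bright else np.argmin)(profile_image, axis=0)

    def surface_following(profile_image, surface_rows, depth_pixels=100):
        """Profiles re-indexed as a function of distance from the water surface."""
        n_rows, n_cols = profile_image.shape
        out = np.full((depth_pixels, n_cols), np.nan)
        for col, row0 in enumerate(surface_rows):
            stop = min(n_rows, row0 + depth_pixels)
            out[:stop - row0, col] = profile_image[row0:stop, col]
        return out

    # Averaging the surface-following profiles over time gives mean profiles of
    # the kind compared with the models in Fig. 1.9:
    # mean_profile = np.nanmean(surface_following(img, detect_surface(img)), axis=1)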
The optical measurement of oxygen is based on the ability of certain dyes or lu-
minophores to change their optical properties corresponding to a change of concen-
tration of the analyte, oxygen. These indicators can be incorporated in polymers and
easily spread on transparent support foils allowing the 2D measurement of oxygen and
the "look through" (Fig. 1.10c and d).
With a special measuring system, called modu-
lar luminescence lifetime imaging system (MOLLI), it is possible to use the "delayed"
luminescence for the oxygen measurement and white light illumination for structural
images. The use of the decaying luminescence (light emitted when the excitation light
source is switched off) is possible with a special CCD camera with a fast electronic
shutter and additional modulation input (sensicam sensimod, PCO AG), if the lumines-
cence decay times are in the range of µs or larger.
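The text does not spell out how MOLLI converts the gated exposures into oxygen values. Purely as an illustration, the sketch below uses a common two-gate lifetime estimate (two exposures of equal gate width taken at different delays after the excitation is switched off) together with a Stern-Volmer style calibration; the constants tau0 and k_sv are hypothetical and would come from a calibration of the optode.

    # Illustrative two-gate lifetime estimate and oxygen conversion; not MOLLI's
    # actual evaluation scheme.
    import numpy as np

    def lifetime_from_gates(img1, img2, t1, t2):
        """Per-pixel decay time from two gated images of equal gate width, t1 < t2."""
        ratio = np.clip(img1 / np.maximum(img2, 1e-12), 1.0 + 1e-6, None)
        return (t2 - t1) / np.log(ratio)          # mono-exponential decay assumed

    def oxygen_from_lifetime(tau, tau0=60e-6, k_sv=0.02):
        """Stern-Volmer style conversion tau0/tau = 1 + k_sv * [O2] (hypothetical constants)."""
        return (tau0 / tau - 1.0) / k_sv          # oxygen in the calibration's units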
In the presented application, the light-dependent metabolism of endolithic cells,
which live within the skeleton of massive corals, was investigated. These cells usually
see only minimal amounts of light, since most of the light is absorbed in the surface
layer of the coral by the coral symbionts. Therefore, the oxygen production within the
skeleton was measured in relation to various illumination intensities. One result is
shown in Fig. 1.10d: the oxygen production by the ring of endolithic cells can clearly
be seen. The coral was sampled in the lagoon of Heron Island (Capricorn Islands,
Great Barrier Reef, Australia), and the experiments were made at the Heron Island
Research Station in a cooperation between the Max Planck Institute for Marine
Microbiology, Microsensor Research Group, Bremen, Germany, and Dr. Peter Ralph,
Department of Environmental Sciences, University of Technology, Sydney, Australia.
[Figure 1.9 plot: normalized concentration versus normalized depth z/z*; mean fluorescein profiles at 3.7 m/s (runs 1 and 2) compared with surface renewal and small eddy models for n = 1/2 and n = 2/3.]
Figure 1.10: Optical measurement of oxygen in corals: a Sampling site of the coral, the lagoon
of Heron Island, Capricorn Islands, Great Barrier Reef, Australia. b Set-up with the coral in the

glass container placed on top of the LED frame while the measuring camera was looking at the
cut coral surface from below. c Close up of the set-up with a cut coral porite placed on top of a
transparent planar oxygen optode, which was fixed at the bottom of a glass container, filled with
lagoon sea water. The blue light is the excitation light, coming from blue LEDs arranged in a
frame below the glass, the orange-red light corresponds to the excited luminescence of the optode;
The fixed white tube (upper right part) served for flushing the water surface with a constant air
stream to aerate the water. d Measured 2D oxygen distribution (view to cut coral surface) given
in % air saturation. The oxygen image is blended into a grayscale image from the coral structure.
The high oxygen values were generated by endolithic cells, which live within the coral skeleton.
The production was triggered by a weak illumination through the oxygen sensor, which simulated
the normal light situation of these cells at daylight in the reef. Images courtesy of Dr. Holst, PCO
AG, Kelheim, Germany. (See also Plate 1.)
Figure 1.11: Application of thermography in botanical research. a Infrared image of a ricinus
leaf. The patchy leaf surface indicates that evaporation and, therefore, opening of the stomata is
not equally distributed over the surface of the leaf. b The fast growth of a plant tumor destroys
the outer waxen cuticle which saves the plant from uncontrolled evaporation and loss of water. In
the temperature image, therefore, the tumor appears significantly cooler (darker) than the trunk
of the plant. It has about the same low temperature as the wet blurred sponge in the background
of the image. The images are from unpublished data of a cooperation between U. Schurr and M.
Stitt from the Institute of Botany of the University of Heidelberg and the author.

1.4.2 Thermography for Botany
This section shows two interesting interdisciplinary applications of thermography in
botanical research. With thermography, the temperature of objects at environmental
temperatures can be measured by taking images with infrared radiation in the 3-5 µm
or 8-14 µm wavelength range.
1.4.2a Patchiness of Photosynthesis. Figure 1.11a shows the surface temperature
of a ricinus leaf as measured by an infrared camera. These cameras are very sensitive.
The one used here has a temperature resolution of 0.03 °C. The temperature of the
leaf is not constant but surprisingly shows significant patchiness. This patchiness is
caused by the fact that the evaporation is not equally distributed over the surface of the
leaf. Evaporation of water is controlled by the stomata. Therefore, the thermographic
findings indicate that the width of the stomata is not regular. As photosynthesis is
controlled by the same mechanism via the CO₂ exchange rate, photosynthesis must
have a patchy distribution over the surface of the leaf. This technique is the first direct
proof of the patchiness of evaporation. So far it could only be concluded indirectly by
investigating the chlorophyll fluorescence.
1.4.2b Uncontrolled Evaporation at Tumor Surfaces. Another interesting phenom-
enon yielded by infrared thermography can be seen in Fig. 1.11b. The tumor at the trunk
of the ricinus plant is significantly cooler than the trunk itself. The tumor has about
the same low temperature as a wet sponge, which appears blurred in the background
of the image. From this observation it can be concluded that the evaporation at the
tumor surface is unhindered, with a rate similar to that of a wet sponge. As a significant
negative effect of the tumor, the plant loses substantial amounts of water through
the tumor surface. Normally, a plant is protected from uncontrolled evaporation by a
waxen layer (cuticle) and the evaporation mainly occurs at the stomata of the leaves.
[Figure 1.12 buoy components (numbered in the drawing): Young anemometer, GPS antenna, RF antenna, wind vane, camera tube, floatation, battery boxes, LED light box, computer box.]
Figure 1.12: Instrumentation to take image sequences of the slope of short wind waves at the
ocean interface. Left: The wave-riding buoy with a submerged light source and a camera at the
top of the buoy which observes an image sector of about 15 × 20 cm². Right: Buoy deployed

from the research vessel New Horizon. Almost all parts of the buoy including the light source are
submerged. From Klinke and Jähne [79].
1.4.3 Imaging of Short Ocean Wind Waves
Optical measuring techniques often must be used in hostile environments. One of the
most hostile environments is the ocean, especially for measurements close to the ocean
surface. Nevertheless, it is possible to use imaging optical techniques there.
Recently, short ocean wind waves (“ripples”) have become a focus of interest for
scientific research. The reason is their importance for modern remote sensing and
significant influence on small-scale exchange processes across the ocean/atmosphere
interface. Short ocean wind waves are essentially the scatterers that are seen by modern
remote sensing techniques from satellites and airplanes using microwaves. To understand
the role of these waves in the electromagnetic backscatter in the microwave
range, it is necessary to know the shape and spatio-temporal properties of these small-
scale features at the ocean surface.
Measurements of this kind have only recently become available but were limited to
measurements in simulation facilities, so-called wind-wave flumes. Figure 1.12 shows
a sketch of a new type of wave-riding buoy that was specifically designed to measure