
SPIE Press Book

Image Acquisition and Preprocessing for Machine Vision Systems
Author(s): P. K. Sinha

Book Description

Machine vision comprises three integrated processes: acquisition, preprocessing, and image analysis. While many resources discuss application-specific image analysis, there has been no unified account of image acquisition hardware and preprocessing--until now.

Image Acquisition and Preprocessing for Machine Vision Systems is a comprehensive reference text detailing every aspect of acquisition and preprocessing, from scene illumination to image-forming optics, and from CCD and CMOS image capture to the transformation of the captured image.

This book bridges the gaps between hardware and software on one hand and theory and applications on the other. With its detailed coverage of imaging hardware and derivations of preprocessing kernels, it is an invaluable design reference for students, researchers, application and product engineers, and systems integrators.

Distinctive features of the book include:
• Detailed theories of CCD and CMOS image sensors, image formation, scene illumination, and camera calibration.
• Unique combination of operational details of imaging hardware (front-end electronics) and analytical theories of low-level image-processing functions.
• Coverage of image-acquisition modules and preprocessing functions within a unified framework.
• Derivation of 2D image-processing functions from first principles by extending 1D signal-processing concepts.


Book Details

Date Published: 25 January 2012
Pages: 750
ISBN: 9780819482020
Volume: PM197

Table of Contents

1. INTRODUCTION
1.1 Introduction
1.2 Preprocessing
1.3 Analysis and Measurement
1.4 Overview of Text
References

2. HUMAN VISION
2.1 Sources of Light
2.2 The Human Eye
2.3 Stimulus Measurement
2.4 Brightness Thresholds
      2.4.1 Absolute threshold
      2.4.2 Differential threshold
      2.4.3 Adaptation
2.5 Contrast
2.6 Visual Acuity
2.7 Flicker
2.8 Spatio-Temporal Effects
References

3. IMAGE-FORMING OPTICS
3.1 Optical Glass
3.2 Geometrical Optics
3.3 Lens Equations
      3.3.1 Simple thin lens
      3.3.2 Compound thin lens
      3.3.3 Thick lens
            3.3.3.1 Ball lens
            3.3.3.2 Cylindrical lens
            3.3.3.3 Condenser lens
            3.3.3.4 Fresnel lens
            3.3.3.5 Micro lenses
            3.3.3.6 Extension tube
3.4 Aperture Stop, f-Number, and Speed
3.5 Focusing and Depth of Field
3.6 Resolving Power
3.7 Aberration
      3.7.1 Monochromatic aberrations
            3.7.1.1 Spherical aberrations
            3.7.1.2 Astigmatism
            3.7.1.3 Coma
            3.7.1.4 Field curvature
            3.7.1.5 Distortion
      3.7.2 Chromatic aberrations
3.8 Optical Coatings
3.9 Optical Filters
      3.9.1 Absorption filters
      3.9.2 Interference (bandpass) filters
3.10 Plastic Lens
References

4. SCENE ILLUMINATION
4.1 Radiant Sources
4.2 Types of Illuminators
4.3 Optical Properties of Targets
      4.3.1 Reflective materials
      4.3.2 Transmissive materials
      4.3.3 Absorptive materials
4.4 Lighting Methods
      4.4.1 Front lighting
      4.4.2 Backlighting
      4.4.3 Specular illumination
      4.4.4 Beamsplitter and split mirror
      4.4.5 Retroreflector
      4.4.6 Structured lighting
4.5 Polarization of Light
4.6 Fiber Optic Lighting
      4.6.1 Light gathering
      4.6.2 Transmission characteristics
References

5. IMAGE SENSORS
5.1 Photogeneration
      5.1.1 Critical wavelength
      5.1.2 Absorption coefficient
5.2 Photoconductor
5.3 Photodiode
5.4 CMOS Image Sensor
5.5 Metal-Oxide Gate
5.6 Charge-Coupled Devices
      5.6.1 Charge-transfer efficiency
      5.6.2 Blooming
      5.6.3 Illumination geometry
5.7 Line-Scan Imaging
5.8 Area-Scan Imaging
5.9 Related Topics
      5.9.1 Pixel size
      5.9.2 Color-filter array
      5.9.3 Noise
      5.9.4 CMOS versus CCD
      5.9.5 Scan pattern
      5.9.6 Pixel ordering
      5.9.7 Incident photons
Appendix 5A: Semiconductor Properties
      5A.1 Optical Materials
      5A.2 Doping of Semiconductors
      5A.3 Carrier Generation
      5A.4 Optical Sensors
      5A.5 Semiconductor Terminology
References

6. IMAGING HARDWARE
6.1 Image Display
6.2 Liquid Crystal Display
6.3 Framegrabber
      6.3.1 Analog front end
      6.3.2 Timing pulses
      6.3.3 Pixel clock
      6.3.4 Gray-level digitization
      6.3.5 Look-up table
      6.3.6 Image store
      6.3.7 Dedicated processor
      6.3.8 Video sampling frequency
6.4 Latency Parameters
      6.4.1 Capture latency
      6.4.2 Transfer latency
      6.4.3 Effects of latency
6.5 Resolution
      6.5.1 Gray-level resolution
      6.5.2 Pixel resolution
      6.5.3 Spatial resolution
      6.5.4 Assessment of resolution
References

7. IMAGE FORMATION
7.1 Field of View
7.2 Depth of Field
7.3 Image Intensity
7.4 Image Functions
      7.4.1 Point-spread function
      7.4.2 Line-spread function
      7.4.3 Edge-spread function
      7.4.4 Optical transfer function
      7.4.5 MTF and contrast
7.5 Image Modeling
      7.5.1 Wavefront model
      7.5.2 Diffraction
7.6 Lens MTF
      7.6.1 Resolution
      7.6.2 Image quality
7.7 Sensor MTF
      7.7.1 Spatial MTF
      7.7.2 Diffusion MTF
      7.7.3 Charge-transfer MTF
References

8. CAMERA CALIBRATION
8.1 Projection
8.2 Ideal Intrinsic Model
8.3 Extrinsic Model
      8.3.1 Translation
      8.3.2 Rotation
8.4 General Camera Model
8.5 Tsai Calibration
      8.5.1 Camera model
      8.5.2 Scaling and origin transfer
      8.5.3 Stage 1 calibration: Parameters embedded in image abscissa
      8.5.4 Stage 2 calibration: Parameters related to image ordinate
      8.5.5 Resolution and distortion
8.6 Stereo Imaging
      8.6.1 Epipolar geometry
      8.6.2 Matching with epipolar constraints
8.7 Feature Matching
      8.7.1 Intensity matching
      8.7.2 Cross-correlation
      8.7.3 Edge feature
8.8 Inclined Camera
      8.8.1 Viewing direction
      8.8.2 Scaling factors
References

9. GRAY-LEVEL TRANSFORMATION
9.1 Pixel-to-Pixel Mapping
9.2 Gamma Correction
9.3 Image Histogram
9.4 Histogram Equalization
9.5 Histogram Hyperbolization
9.6 Histogram Specification
9.7 Local Histogram
9.8 Statistical Differencing
9.9 Thresholding
      9.9.1 Triangular minimum method
      9.9.2 Iterative mean method
9.10 Co-occurrence Matrix
Appendix 9A: Histogram Properties
      9A.1 Definitions
      9A.2 Variate Transformation
References

10. SPATIAL TRANSFORMATION
10.1 Interpolation
10.2 Geometric Operations
      10.2.1 Forward transformation
      10.2.2 Backward transformation
      10.2.3 Rescaling Cartesian coordinates
10.3 Bilinear Interpolation
10.4 Cubic Interpolation
10.5 Zero-Order Convolution
10.6 Affine Transformation
10.7 Perspective Transformation
Appendix 10A: Basis Functions and Splines
      10A.1 Hermite Curves
      10A.2 Cardinal Splines
      10A.3 Bezier Curves
      10A.4 Cubic Splines
References

11. SPATIAL FILTERING
11.1 Noise Models
11.2 Averaging Filters
      11.2.1 Gaussian filter
      11.2.2 Rotating average
      11.2.3 Sigma filter
      11.2.4 Outlier filter
      11.2.5 Unsharp mask
11.3 Rank-Order Filters
11.4 Adaptive Filters
      11.4.1 Additive noise
      11.4.2 Impulse noise
      11.4.3 Multiplicative noise
11.5 First-Order Gradients
      11.5.1 Roberts operator
      11.5.2 Prewitt operator
      11.5.3 Sobel operator
11.6 Second-Order Gradients
11.7 Anisotropic Filters
      11.7.1 Bilateral filters
      11.7.2 Diffusion filters
Appendix 11A: Convolution Kernels
      11A.1 Discrete Convolution
      11A.2 Two-Dimensional Convolution
References

12. DISCRETE FOURIER TRANSFORM
12.1 Discrete Fourier Series
12.2 Discrete Fourier Transform
12.3 Decimation-in-Time FFT
12.4 Image Frequency Spectrum
12.5 Basis Function
12.6 Matrix Form
      12.6.1 DIT in matrix form
      12.6.2 DIF in matrix form
Appendix 12A: DFT Through Decomposition
      12A.1 Eight-Point DFT with Radix 2
      12A.2 Decimation in Frequency
      12A.3 Mixed Radix
References

13. SPATIAL FREQUENCY FILTERS
13.1 Fourier Series Finite Sum
13.2 DFT Computation
13.3 Spectrum Display
      13.3.1 Origin shifting
      13.3.2 Amplitude scaling
      13.3.3 Frequency scaling
13.4 Ideal Filters
13.5 Butterworth Filter
      13.5.1 Low-pass Butterworth filter
      13.5.2 High-pass Butterworth filter
13.6 Gaussian Filter
      13.6.1 Low-pass Gaussian filter
      13.6.2 High-pass Gaussian filter
13.7 Homomorphic Filter
13.8 Image Restoration
References

14. REVIEW OF IMAGE PARAMETERS
14.1 Image Contrast
14.2 Lens Resolution
14.3 Sensor Resolution
14.4 Aliasing
14.5 Image Display
14.6 Image Printing
14.7 File Format
14.8 Bibliographical Notes
References

APPENDIX A: FOURIER TRANSFORMATION
A.1 Fourier Series
A.2 Extension to Nonperiodic Wavetrain
A.3 Commonly Used Functions
      A.3.1 Rectangular and delta functions
      A.3.2 Sinc function
      A.3.3 Comb function
      A.3.4 Fourier transforms
A.4 2D Fourier Transform
A.5 Fourier Transform of Images
References

APPENDIX B: SAMPLING
B.1 Introduction
B.2 Frequency Folding
B.3 Spatial Foldover
B.4 Reconstruction
B.5 2D Sampling
References

APPENDIX C: DISCRETE FOURIER TRANSFORM
References

APPENDIX D: TIME-FREQUENCY MAPPING
D.1 Introduction
D.2 Z Transform
D.3 Transfer Function
D.4 Pulse Transfer Function
D.5 Digitization of Transfer Functions
D.6 Analog Filter
D.7 Digital Filter
D.8 Frequency Warping
References

Index

Preface

From an applications point of view, machine vision refers to the recovery of quantitative data from digital images. The setup for such recovery tasks requires hardware for image sensing and storage, and preprocessing software to convert captured images into image data. From an end-user's perspective, a machine vision system consists of three functionally cognate subsystems: acquisition, preprocessing, and application-specific analysis and measurement software. This book covers the first two subsystems by presenting some of the fundamental principles and characteristics of front-end hardware and derivations of a core set of preprocessing functions. Examples are included primarily to illustrate the use of some preprocessing functions rather than to provide an account of specific applications. I have taken this approach because algorithms and software for the third subsystem are application specific, and the details of many of those applications are readily available. In contrast, a unified account of image acquisition hardware and preprocessing functions is not available in any comparable literature.

In selecting the contents for this book, I excluded several areas associated with image processing, such as mathematical morphology, feature detection, shape recognition, and texture analysis, and I give only an outline description of correlation, moments, and the Hough transform. All of these topics are well covered in several other textbooks. Instead, I chose to provide in-depth coverage of the topics tied to image capture and spatial- and frequency-domain preprocessing functions for readers who are migrating to the machine vision field from other areas, as well as for practicing engineers who are seeking a conceptual account of front-end electronics and the roots of preprocessing algorithms.

The increasing degree of "smartness" of plug-and-play cameras and framegrabbers allows many preprocessing operations to be performed using default settings. However, the integration of an image-based measurement system, with emphasis on smaller memory space and shorter execution time, requires a higher level of awareness of design principles and associated performance parameters. With this context, the book covers principles related to the intrinsic characteristics of captured images, the hardware aspects of image signal generation, and the mathematical concepts of image signal processing from an algorithmic perspective of developing preprocessing software. In addition to bridging the hardware-to-software gap, this book provides a basis to identify some of the key design parameters and potential interface or processing limitations at an early stage of application development. In this respect, topics covered in the book are suitable for students and researchers as well as for a wide spectrum of end users, application development engineers, and system integrators from both the image processing and machine vision communities.

In building an algorithmic framework for preprocessing tasks, I adopted an approach akin to mathematical modeling and signal analysis to provide a conceptual understanding of the basic principles and their relationship to image acquisition parameters. Most of the hardware modules and preprocessing functions covered in this book are underpinned by an extensive collection of models and derivations. In addition to providing insight into the design features of the front-end components (optics, sensors, and interface), this mathematical framework helps to (1) highlight some of the underlying assumptions in the operation of imaging hardware and (2) identify sources of various preprocessing parameters generally assigned by default in commercial application software. With an increasing trend toward customization and embedded design, this approach also offers a framework to select and evaluate imaging hardware and functions, and to sequence individual processing functions in the context of specific application requirements. Furthermore, since such requirements may be subsumed in the early stages of hardware design, selection, integration, and algorithm development, this book offers the theoretical foundations necessary to adapt many generic results. I hope that these design details and mathematical concepts will enable readers to effectively integrate the front-end hardware and preprocessing functions into their application platform.

Although I have included a significant number of original derivations, I have drawn much of the material from the literature. I have attempted to cite original sources as far as possible; however, due to the continued growth of the related subject areas and the large number of publications that host imaging literature, my reference lists are incomplete. While I have taken care to ensure that all derivations and supporting algorithmic descriptions are correct, some errors and omissions are likely to be present due to the involved nature of the analytical work. I take responsibility for such errors and would appreciate it if readers brought them to my attention.

P. K. Sinha
December 2011

