A simulated annealing band selection approach for hyperspectral images

A novel band selection technique for feature extraction from hyperspectral images uses a heuristic optimization algorithm.
29 January 2007
Yang-Lang Chang and Jyh-Perng Fang

State-of-the-art sensors can make use of a growing number of spectral bands. Data once collected in a few multispectral bands can now be acquired in several hundred hyperspectral bands, or even thousands of ultraspectral bands. This technology finds application in many domains, including satellite-based geospatial imaging, monitoring systems, medical imaging, and industrial product inspection. While images are continuously being acquired and archived, existing methods have proved inadequate for analyzing such large volumes of data. As a result, there is a vital demand for new concepts and techniques for treating high-dimensional datasets.

A common issue in hyperspectral image classification is how to improve class separability without incurring the curse of dimensionality. This problem has occupied various research communities, including statistics, pattern recognition, and data mining, all of which describe the difficulty of reliably estimating class distributions in high-dimensional spaces. Accordingly, selecting the most valuable and meaningful information has become ever more important. One consequence is the development of a simulated annealing (SA) technique for feature extraction of high-dimensional datasets using an optimization algorithm.

Simulated annealing

SA takes its name from annealing in metallurgy, where the slow cooling of a melt forms hardened metals. Two decades ago scientists recognized the similarity between this annealing process and the search for a best solution to a combinatorial optimization problem.1 SA follows an annealing schedule that starts at an effectively high temperature and gradually decreases it until it is slightly above zero. The heuristic optimization algorithm is run in a nested loop at the designated temperatures. Advantages of SA include the ability to escape local minima at non-zero temperatures, the early appearance of the gross features of the final state at the highest temperatures, and the emergence of finer details at lower temperatures.
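As a sketch of the general idea (not the authors' implementation), the following minimal SA loop accepts a worse candidate with probability exp(-Δ/T) and cools the temperature geometrically; the parameter values and the toy objective are illustrative assumptions:

```python
import math
import random

def simulated_annealing(cost, neighbor, initial, t_start=1000.0,
                        t_end=1.0, cooling=0.95, moves_per_temp=20):
    """Generic SA loop: accept a worse solution with probability
    exp(-delta/T), cooling geometrically until t_end is reached."""
    current = initial
    current_cost = cost(current)
    best, best_cost = current, current_cost
    t = t_start
    while t > t_end:
        for _ in range(moves_per_temp):
            candidate = neighbor(current)
            delta = cost(candidate) - current_cost
            # Always accept improvements; accept worse moves with
            # the Metropolis probability exp(-delta / t).
            if delta < 0 or random.random() < math.exp(-delta / t):
                current, current_cost = candidate, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
        t *= cooling
    return best, best_cost

# Toy usage: minimize (x - 3)^2 over the integers, starting far away.
random.seed(0)
sol, c = simulated_annealing(lambda x: (x - 3) ** 2,
                             lambda x: x + random.choice([-1, 1]),
                             initial=50)
```

At high temperatures the walk is nearly unbiased (the gross shape of the solution emerges), and as the temperature falls only improving moves survive, matching the behavior described above.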

Greedy modular eigenspaces

Figure 1 uses a visual correlation matrix pseudo-color map (CMPM) to emphasize the second-order statistics in hyperspectral data and to illustrate the magnitude of the correlation matrices in the proposed SA method. Also shown in Figure 1 is a greedy modular eigenspaces (GME) set for class k, which we previously proposed.2, 3 An important characteristic of GME is that it reorders the bands of high-dimensional data sets without regard for their original order in wavelength. Each modular eigenspace comprises a set of highly correlated bands.

Figure 1. An example illustrating (a) an original CMPM, (b) its corresponding correlation matrix for class k, (c) the GME set for class k, and (d) its corresponding correlation matrix after SA band selection.

In hyperspectral imagery, GME clusters highly correlated bands into smaller subsets using a greedy algorithm. Rather than adopting that paradigm, we introduce SA band selection (SABS),4 which selects sets of non-correlated bands from hyperspectral images using an optimization algorithm. It exploits the inherent separability of the different classes embedded in high-dimensional datasets to reduce dimensionality and to formulate a unique GME feature.
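To illustrate the GME idea that SABS builds on, here is a toy greedy grouping of bands by pairwise correlation. The threshold, the 4-band correlation matrix, and the grouping rule are illustrative assumptions, not the published algorithm:

```python
import numpy as np

def greedy_band_modules(corr, threshold=0.8):
    """Greedily group bands so that every pair inside a module has
    |correlation| above the threshold (a toy reading of GME)."""
    n = corr.shape[0]
    unassigned = list(range(n))
    modules = []
    while unassigned:
        seed = unassigned.pop(0)          # start a new module
        module = [seed]
        for b in unassigned[:]:           # iterate over a copy
            # Add band b only if it is highly correlated with every
            # band already in the module.
            if all(abs(corr[b, m]) >= threshold for m in module):
                module.append(b)
                unassigned.remove(b)
        modules.append(module)
    return modules

# Toy correlation matrix: bands 0 and 1 are highly correlated,
# as are bands 2 and 3; cross-correlations are weak.
corr = np.array([[1.0, 0.9, 0.1, 0.2],
                 [0.9, 1.0, 0.2, 0.1],
                 [0.1, 0.2, 1.0, 0.95],
                 [0.2, 0.1, 0.95, 1.0]])
modules = greedy_band_modules(corr)
```

On this matrix the greedy pass produces the modules [[0, 1], [2, 3]], i.e. each modular eigenspace collects a set of mutually correlated bands, as described above.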

Our proposed SABS scheme has a number of merits. Unlike traditional principal component analysis, it avoids the bias problems that arise from transforming the information into linear combinations of bands. In addition, SABS can readily select each band by a simple logical operation, known as feature-scale uniformity transformation,3 and it can also sort different classes into the most common subset. Finally, it speeds up the procedure by simultaneously selecting the most significant features according to the SA optimization scheme.

Fine-tuning of SABS algorithm

The basic components of an SA algorithm are the solution space, the neighborhood structure, the cost function, and the annealing schedule; solving an optimization problem with SA requires a proper arrangement of all four.5, 6 In SABS, the solution space is defined by all possible combinations of swapping rows and columns in the original CMPM.4 The neighborhood structure is constructed by randomly swapping two bands in the associated correlation matrix, and the cost function is obtained by accumulating the correlation coefficient values inside each GME modular eigenspace. In the annealing schedule, the probability of accepting a higher-cost solution decreases as the temperature falls. The cooling rate is 0.95, the terminating temperature is 100°C, and the factor determining the number of perturbations at each temperature is 20.
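The four components can be sketched as follows. The module sizes, temperatures, and toy correlation matrix are assumptions for illustration only: the cost (negated, so lower is better) accumulates the correlation packed inside each contiguous module of the reordered matrix, and the neighborhood move swaps two bands, i.e. a row/column pair:

```python
import math
import random
import numpy as np

def module_cost(corr, order, sizes):
    """Negative sum of |correlation| inside each contiguous module of
    the reordered matrix; lower cost = more correlation inside modules."""
    total, start = 0.0, 0
    for s in sizes:
        block = [order[i] for i in range(start, start + s)]
        for i in block:
            for j in block:
                total += abs(corr[i, j])
        start += s
    return -total

def sa_band_selection(corr, sizes, t_start=1000.0, t_end=1.0,
                      cooling=0.95, moves_per_temp=20):
    """SA over band orderings; the neighborhood move swaps two bands."""
    order = list(range(corr.shape[0]))
    cost = module_cost(corr, order, sizes)
    best, best_cost = order[:], cost
    t = t_start
    while t > t_end:
        for _ in range(moves_per_temp):
            i, j = random.sample(range(len(order)), 2)
            order[i], order[j] = order[j], order[i]      # propose swap
            new_cost = module_cost(corr, order, sizes)
            if (new_cost < cost or
                    random.random() < math.exp((cost - new_cost) / t)):
                cost = new_cost
                if cost < best_cost:
                    best, best_cost = order[:], cost
            else:
                order[i], order[j] = order[j], order[i]  # undo swap
        t *= cooling
    return best, best_cost

# Toy example: bands 0 and 2 are correlated, as are 1 and 3, so two
# modules of two bands each should regroup the interleaved bands.
random.seed(1)
corr = np.array([[1.0, 0.1, 0.9, 0.1],
                 [0.1, 1.0, 0.1, 0.9],
                 [0.9, 0.1, 1.0, 0.1],
                 [0.1, 0.9, 0.1, 1.0]])
best, best_cost = sa_band_selection(corr, sizes=[2, 2])
```

On this toy matrix the best ordering places {0, 2} in one module and {1, 3} in the other, mirroring how SABS reorders the CMPM so that correlation concentrates inside each GME modular eigenspace.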

Results and conclusion

The performance of our proposed method was evaluated on MODIS/ASTER airborne simulator (MASTER) images for land cover classification acquired during the Pacific Rim (PacRim) II campaign. A minimum distance classifier was used to evaluate classification accuracies. We examined the effectiveness of SABS with initial temperatures ranging from 100,000°C to 900,000°C. Table 1 summarizes the accuracies for different correlation coefficients and initial temperatures. The results are encouraging: average classification accuracy is above 90%. Compared with conventional feature-extraction techniques, SABS showed improved discriminatory power, crucial to subsequent classification. It also exploited potentially significant separability to select a unique set of the most important feature bands embedded in high-dimensional datasets.

Table 1. The classification accuracies for different correlation coefficients and initial temperatures.

Yang-Lang Chang
National Taipei University of Technology
Department of Electrical Engineering, Taiwan

Yang-Lang Chang is an associate professor in the Department of Electrical Engineering, National Taipei University of Technology. He is a member of SPIE, IEEE, Phi Tau Phi, the Chinese Society of Photogrammetry and Remote Sensing, and the Chinese Society of Image Processing and Pattern Recognition. His research interests are in pattern recognition and remote sensing image processing. He has written several papers for SPIE conferences and journals, and he serves as a reviewer for the SPIE journal Optical Engineering.

Jyh-Perng Fang is currently an associate professor in the Department of Electrical Engineering, National Taipei University of Technology. His research interests are in electronic design automation, embedded systems, and e-health.

