
ISPRS Journal of Photogrammetry & Remote Sensing 61 (2007) 381–392

www.elsevier.com/locate/isprsjprs

FFT-enhanced IHS transform method for fusing high-resolution satellite images

Yangrong Ling a,⁎, Manfred Ehlers b, E. Lynn Usery c, Marguerite Madden d

a GeoResources Institute, Mississippi State University, Mississippi State, MS 39762, USA
b Institute for Geoinformatics and Remote Sensing, University of Osnabrueck, Osnabrueck, Germany
c Center of Excellence for Geospatial Information Science, U.S. Geological Survey, Rolla, MO 65401, USA
d Department of Geography, University of Georgia, Athens, GA 30602, USA

Received 13 April 2006; received in revised form 31 October 2006; accepted 7 November 2006
Available online 20 December 2006

Abstract
Existing image fusion techniques such as the intensity–hue–saturation (IHS) transform and principal components analysis
(PCA) methods may not be optimal for fusing the new generation commercial high-resolution satellite images such as Ikonos and
QuickBird. One problem is color distortion in the fused image, which causes visual changes as well as spectral differences between
the original and fused images. In this paper, a fast Fourier transform (FFT)-enhanced IHS method is developed for fusing new
generation high-resolution satellite images. This method combines a standard IHS transform with FFT filtering of both the
panchromatic image and the intensity component of the original multispectral image. Ikonos and QuickBird data are used to assess
the FFT-enhanced IHS transform method. Experimental results indicate that the FFT-enhanced IHS transform method may improve
upon the standard IHS transform and the PCA methods in preserving spectral and spatial information.
© 2006 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Published by Elsevier B.V. All rights reserved.
Keywords: Image fusion; FFT; IHS transform; PCA; Wavelet; Ikonos; QuickBird

1. Introduction
During sensor design, to receive enough energy within a specific dwell time, a panchromatic band covering a broad range of the wavelength spectrum and several multispectral bands each covering a narrow spectral range are specified at different spatial resolutions. Most high-resolution Earth observation satellite systems such as Ikonos and QuickBird, therefore, provide two types of image data: a panchromatic image with high spatial resolution and a multispectral image with lower spatial resolution, but higher spectral resolution. To effectively utilize such images, image fusion techniques that can effectively combine the high-resolution panchromatic and low-resolution multispectral images into one color image are needed. Such techniques can largely extend the application potential of remote sensing image data.

⁎ Corresponding author. E-mail address: yangrong@gri.msstate.edu (Y. Ling).
The fusion of high-resolution panchromatic and low-resolution multispectral satellite images is a very
important issue for many remote sensing and mapping
applications (Zhang, 2002a). It is the aim of image
fusion to integrate image data recorded at different resolutions or by different sensors in order to obtain more
information than can be derived from a single image

0924-2716/$ - see front matter © 2006 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Published by Elsevier B.V. All rights reserved.
doi:10.1016/j.isprsjprs.2006.11.002


alone (Pohl and Van Genderen, 1998). There are many situations in remote sensing that simultaneously require
high spatial and high spectral resolution in a single
image. By combining or fusing images it may be possible to obtain both high spatial and spectral resolution
in a single display. The benefit of image fusion has been
demonstrated in many practical applications such as
studies of urban analysis, vegetation, land-use, and
precision farming (e.g., Couloigner et al., 1998; Dai and
Khorram, 1999; Kurz and Hellwich, 2000; Lau et al.,
2000; Tu et al., 2001; Chen et al., 2003; Sun et al.,
2003).
A variety of image fusion methods have been
developed in the past two decades (e.g., Cliche et al.,
1985; Price, 1987; Welch and Ehlers, 1987; Chavez
et al., 1991; Ehlers, 1991; Shettigara, 1992; Yesou et al.,
1993; Zhou et al., 1998; Zhang, 1999, 2002a; Li et al.,
2002; Cakir and Khorram, 2003; Chen et al., 2005;
Zhang and Hong, 2005). Pohl and Van Genderen (1998)
provided a comprehensive review on multisensor image
fusion in remote sensing. They grouped the existing
image fusion techniques into two classes: (1) color-related techniques such as intensity–hue–saturation (IHS) and hue–saturation–value (HSV) fusion methods; and
(2) statistical/numerical methods such as principal
components analysis (PCA), high pass filtering (HPF),
Brovey transform (BT), regression variable substitution
(RVS), and wavelet methods. Ranchin and Wald (2000)
distinguished the fusion methods into three groups: the
projection and substitution methods, the relative spectral
contribution methods, and those relevant to the ARSIS
(acronym for the French: Amélioration de la Résolution Spatiale par Injection de Structures, which means
spatial resolution enhancement by injection of structures) concept. There are also some hybrid methods

Fig. 1. Camp Lejeune study area, North Carolina.

Table 1
Image data used in the study

Image data                     Spectral bandwidth (μm)   Spatial resolution (m)   Acquisition date
Ikonos Pan                     0.45–0.90                 1                        2/5/2000
Ikonos XS (Bands 2, 3, 4)      Band 2: 0.51–0.60         4                        8/27/2001
                               Band 3: 0.63–0.70
                               Band 4: 0.76–0.85
QuickBird Pan                  0.45–0.90                 0.61                     3/3/2003
QuickBird XS (Bands 2, 3, 4)   Band 2: 0.52–0.60         2.44                     3/24/2003
                               Band 3: 0.63–0.69
                               Band 4: 0.76–0.85
that use combined methods from more than one group (e.g., Núñez et al., 1999).
Problems and limitations associated with the available fusion techniques have been reported by many
studies (e.g., Chavez et al., 1991; Pellemans et al., 1993;
Wald et al., 1997; Van Der Meer, 1997; Zhang, 2002b).
According to these studies, the most significant problem
is that the fused image usually has a notable deviation in
visual appearance and in spectral values from the original image. These deviations, called color distortion,
affect further interpretation, especially when the wavelength range of a panchromatic image does not correspond to that of the employed multispectral image.

Fig. 2. Schematic diagram for the FFT-enhanced IHS transform method.


Multi-temporal images are another problem since the panchromatic and multispectral data are often taken at
different seasons or years. In image fusion, it is desirable
to minimize the color distortion since this ensures that
features separable in the original multispectral image are
still separable in the fused image (Chavez et al., 1991).
Among the existing fusion methods, the IHS transform and PCA methods are the most commonly used
algorithms by the remote sensing community (Zhang,
1999; Tu et al., 2001). These two techniques have
proved promising for fusing radar or SPOT panchromatic images with Landsat TM or other multispectral
images. In the fusion of the new generation satellite
images such as Landsat Enhanced Thematic Mapper
Plus (ETM+), Ikonos, and QuickBird, the wavelength
extension of the panchromatic band into the near infrared, which causes spectral non-overlapping between
the panchromatic band and the multispectral bands, may
result in significant color distortion in the fused image
when the standard IHS transform and PCA methods are
employed (Zhang, 2002b). Consequently, the objective
of this study is to enhance the standard IHS transform
method using advanced image analysis techniques such
as fast Fourier transform filtering and make it suitable
for fusing the new generation high-resolution satellite
images. Instead of using a total replacement of the intensity component as in the standard IHS transform method, the new method replaces the high frequency part of the intensity component only. This partial
replacement approach may also be applicable to
enhance the PCA method (replacing the high frequency
part of the first component with that of the panchromatic
image).
2. Study area and image data
The study area is located in Camp Lejeune (34°35′ N latitude, 77°18′ W longitude), southeastern North Carolina (Fig. 1). Camp Lejeune lies in the coastal plain with relatively flat terrain (typically relief is less than 25 m) (Onslow County, 2005). Two datasets for the
Camp Lejeune study area are used in this research. One
is a QuickBird dataset over a built-up area with urban
features, marked as A in Fig. 1. This area includes a
populated subdivision with houses and associated trees,
lawns, and shrubs. Roads in this area wind through
dense vegetation and are partially obscured. Also, included in this area is a large golf course with greens,
trees, and sand traps. The populated subdivision is
adjacent to a retail and commercial section of the city
including roads, buildings, parking lots, and other features that reflect the concrete and asphalt of an urban


area. The other dataset is an Ikonos image over a complex region, marked as B in Fig. 1, with sea, land, and air
features. The land area in this image contains different types of vegetation including mixed conifer and
deciduous forests, grassy areas, and some vegetated
wetlands.
Each dataset contains a panchromatic band and three
multispectral bands since the IHS method can only fuse
one panchromatic band with three multispectral bands at
a time (Table 1). The images are spatially registered to
the Universal Transverse Mercator (UTM) coordinate
system on the WGS 84 datum.
3. Methodology
3.1. The standard IHS transform method for image
fusion
The IHS color transform can effectively convert a
multispectral image from standard redgreenblue
(RGB) color space to IHS color space. Among the advantages of IHS transform operations is the ability to
vary each IHS component independently, without
affecting the others (Lillesand et al., 2004). This property
may be used for the fusion of multi-sensor images. The
basic steps of IHS fusion are: (1) register the input
multispectral image to the panchromatic image if needed
and then resample it to the same spatial resolution as that
of the panchromatic image; (2) transform the input
multispectral image from RGB to IHS color space, where the mathematical context is expressed by Eqs. (1) and (2); here I denotes the intensity, v1 and v2 are intermediate variables, and H and S stand for hue and saturation, respectively (Carper et al., 1990); (3) replace the intensity component with a panchromatic image of higher spatial resolution; and (4) transform the new intensity component, together with the original hue and saturation components, back to RGB color space to create the fused image (Eq. (3)):

\begin{pmatrix} I \\ v_1 \\ v_2 \end{pmatrix} =
\begin{pmatrix}
 1/3 & 1/3 & 1/3 \\
 -1/\sqrt{6} & -1/\sqrt{6} & 2/\sqrt{6} \\
 1/\sqrt{2} & -1/\sqrt{2} & 0
\end{pmatrix}
\begin{pmatrix} R \\ G \\ B \end{pmatrix}    (1)

H = \tan^{-1}\left( \frac{v_2}{v_1} \right), \quad S = \sqrt{v_1^2 + v_2^2}    (2)

\begin{pmatrix} R_{new} \\ G_{new} \\ B_{new} \end{pmatrix} =
\begin{pmatrix}
 1 & -1/\sqrt{6} & 1/\sqrt{2} \\
 1 & -1/\sqrt{6} & -1/\sqrt{2} \\
 1 & 2/\sqrt{6} & 0
\end{pmatrix}
\begin{pmatrix} I_{new} \\ v_1 \\ v_2 \end{pmatrix}    (3)

To enhance the operation of this process and better retain the spectral information of the original image, one can use a Fourier transform to decompose the intensity component and replace only a part of it. The application of this method is discussed below.
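The four fusion steps and Eqs. (1)–(3) can be sketched in a few lines of NumPy. The function names and array conventions below are illustrative, not from the paper; note that the rows of the forward matrix are orthogonal, so the inverse of Eq. (3) is just the transpose scaled by the squared row norms.

```python
import numpy as np

# Transform matrix of Eq. (1).
M = np.array([
    [1 / 3,            1 / 3,            1 / 3],
    [-1 / np.sqrt(6),  -1 / np.sqrt(6),  2 / np.sqrt(6)],
    [1 / np.sqrt(2),   -1 / np.sqrt(2),  0.0],
])

def rgb_to_ihs(rgb):
    """Eqs. (1)-(2): rgb is an (..., 3) array; returns I, v1, v2, H, S."""
    i, v1, v2 = np.moveaxis(rgb @ M.T, -1, 0)
    h = np.arctan2(v2, v1)   # hue,        Eq. (2)
    s = np.hypot(v1, v2)     # saturation, Eq. (2)
    return i, v1, v2, h, s

def ihs_to_rgb(i_new, v1, v2):
    """Eq. (3): back-transform a (possibly replaced) intensity with v1, v2."""
    ivv = np.stack([i_new, v1, v2], axis=-1)
    return ivv @ np.linalg.inv(M).T
```

A total replacement, as in the standard IHS method, would pass the resampled panchromatic band as `i_new`; the FFT-enhanced variant replaces only its high-frequency part.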
3.2. FFT-enhanced IHS transform method
3.2.1. Fourier transform and image filtering in the
frequency domain
The Fourier transform is an important image processing tool that is used to decompose an image into its
sine and cosine components. The Fourier transform of a
two-dimensional function can be expressed as the following equations (Gonzalez and Woods, 2002):
H(u, v) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(x, y)\, e^{-j 2\pi (ux + vy)}\, dx\, dy    (4)

where

j = \sqrt{-1} \quad \text{and} \quad e^{\mp jx} = \cos x \mp j \sin x    (5)

The output of the transformation represents the image in the Fourier or frequency domain, while the input
image is the spatial domain equivalent. In the Fourier
domain image, each point represents a particular frequency contained in the spatial domain image. If a
Fourier domain image is known, it can be transformed
back to the spatial domain using an inverse Fourier
transform:
h(x, y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} H(u, v)\, e^{j 2\pi (ux + vy)}\, du\, dv    (6)

Since digital image processing involves a finite number of discrete samples (i.e., pixels), a modified form of the
Fourier transform, known as the discrete Fourier transform (DFT), is used in Fourier image analysis. A computationally efficient implementation of the DFT is the fast
Fourier transform (FFT).
Image filtering in the frequency domain has been
widely used in image processing since convolution by
multiplication in the frequency domain is computationally faster than conventional convolution in the spatial
domain, especially as the filter size increases. Also, some signals are easier to visualize and take less information to define in the frequency domain (Smith, 1999).
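In discrete form, Eqs. (4) and (6) correspond to a forward/inverse FFT pair, and the speed advantage mentioned above follows from the convolution theorem. A minimal sketch using NumPy (the 3×3 box filter is only an example):

```python
import numpy as np

img = np.random.rand(64, 64)

# Discrete analogue of Eq. (4): forward 2-D FFT into the frequency domain.
F = np.fft.fft2(img)

# Discrete analogue of Eq. (6): the inverse FFT recovers the image.
back = np.fft.ifft2(F).real

# Convolution theorem: pointwise multiplication of two spectra equals
# circular convolution in the spatial domain (here, a 3x3 box filter).
kernel = np.zeros_like(img)
kernel[:3, :3] = 1.0 / 9.0
smoothed = np.fft.ifft2(F * np.fft.fft2(kernel)).real
```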

3.2.2. Image fusion with the FFT-enhanced IHS transform method
The basic idea behind the FFT-enhanced IHS transform method is to modify the input high-resolution
panchromatic image so it looks more like the intensity
component of the input multispectral image. Instead of
using a total replacement of the intensity component,
this method uses a partial replacement based on FFT
filtering.
In the FFT-enhanced IHS transform method, the
multispectral image is first transformed using the IHS
transform. A frequency domain analysis is used to select
a low-pass filter for the original intensity component and
a high-pass filter for the panchromatic image. The idea
is to substitute the high frequency part of the intensity
component with that from the panchromatic image.
Both the low-pass and high-pass filters should be complementary, i.e., the high frequency part removed from
the intensity component should be the only part left in
the panchromatic image. After the replacement of the
high frequency information from the panchromatic
image, the new intensity component, together with the
original hue and saturation components, is transformed
back to RGB color space to obtain the fused image
(Fig. 2). In our study, different filters such as Butterworth, Gaussian, and Hanning filters are examined with
different parameters, and the Hanning filters with a
circle radius of 32 pixels produced the best results.
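The filtering step can be realized, for example, with a radially symmetric raised-cosine (Hanning) low-pass of radius 32 and its complement as the high-pass. The exact filter profile used by the authors is not given, so the taper below is an assumption:

```python
import numpy as np

def hanning_lowpass(shape, radius=32):
    """Raised-cosine low-pass centered on the zero frequency of a shifted
    spectrum: 1 at the center, falling to 0 at `radius`. (The paper's
    exact Hanning profile is not specified; this is one plausible choice.)"""
    rows, cols = shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    r = np.hypot(u[:, None], v[None, :])
    return np.where(r <= radius, 0.5 * (1 + np.cos(np.pi * r / radius)), 0.0)

def fuse_intensity(intensity, pan, radius=32):
    """Replace the high-frequency part of the intensity component with
    that of the panchromatic image (both arrays must be the same size)."""
    lp = hanning_lowpass(intensity.shape, radius)
    hp = 1.0 - lp                                  # complementary high-pass
    I = np.fft.fftshift(np.fft.fft2(intensity))
    P = np.fft.fftshift(np.fft.fft2(pan))
    fused = I * lp + P * hp
    return np.fft.ifft2(np.fft.ifftshift(fused)).real
```

Because the two filters sum to one everywhere, fusing an image with itself returns the image unchanged, which makes the complementarity requirement easy to verify.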
There are also some considerations to keep in mind when transforming data to the frequency domain via the FFT. For example, the dimensions of the image should be powers of 2 (i.e., 2^j × 2^k, where j and k are integers), since the FFT algorithm recursively divides the data. If a dimension of the image is not a power of 2, one can expand it to the next legitimate size by surrounding the image with zeros; this is called zero-padding. To avoid the effects of zero-padding, one may instead trim the image to the next valid size. The procedure of image fusion using the FFT-enhanced IHS transform method is summarized in Table 2.
The partial replacement used in the FFT-enhanced
IHS transform method is similar to the use of wavelet
transform in the approaches of Núñez et al. (1999) and
Zhang and Hong (2005), in which high-resolution
information from the panchromatic image is injected
into the intensity image. However, there are also some
differences among these methods. In the approach of
Núñez et al. (1999), the high-resolution information
from the panchromatic image was added directly to the
intensity component, and the corresponding high-resolution information of the intensity component was not
removed before the addition. The IHS and wavelet


Table 2
The procedure of image fusion using the FFT-enhanced IHS transform method
1 Register all images to the panchromatic image and resample to the highest resolution (e.g. Ikonos 1 m, QuickBird 0.61 m) using cubic
convolution.
2 Transform the resampled multispectral image from the RGB to IHS color space to obtain the intensity (I ), hue (H ), and saturation (S ) components.
3 Low-pass filtering of the intensity component, I, by designing an appropriate filter in the Fourier domain.
4 High-pass filtering of the panchromatic image in the Fourier domain.
5 Add the high frequency filtered panchromatic image to the low frequency filtered intensity component, I, for the new intensity component, I'.
6 Match I' to the original I to obtain a new intensity component, I ".
7 Perform an IHS to RGB transform on I ", together with the original hue (H ) and saturation (S ) components, to create the fused image.
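Step 6 of Table 2 matches the new intensity I′ to the original I. The paper does not say which matching is used, so a simple mean/standard-deviation match is sketched here as one plausible choice (histogram matching would be another):

```python
import numpy as np

def match_stats(src, ref):
    """Linearly rescale src so its mean and standard deviation match ref.
    A hypothetical realization of step 6 in Table 2."""
    return (src - src.mean()) / (src.std() + 1e-12) * ref.std() + ref.mean()
```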

integrated approach of Zhang and Hong (2005), on the other hand, utilizes wavelet transform to decompose
both the panchromatic and intensity images, followed
by a fixed replacement of the wavelet coefficients (LH,
HH, HL) of the decomposed intensity image with those
from the decomposed panchromatic image to create the
new intensity image. In the FFT-enhanced IHS method
used in this study, one can apply different filtering
windows with different parameters in the Fourier
analysis to obtain best fusion results. In this sense, the
FFT-enhanced IHS method could be more flexible than

the approaches of Núñez et al. (1999) and Zhang and Hong (2005).
4. Experiments and results
To assess the quality of the fused images, fusion
results from different techniques should be evaluated
visually, spectrally, and spatially (Zhou et al., 1998).
Change in spectral characteristics is an indication of the
change in radiometric content of the image. Minimizing
distortion of the spectral characteristics is important

Fig. 3. A representative portion of the original and fused Ikonos images: (a) Ikonos panchromatic image (1 m); (b) Ikonos multispectral image (4 m);
(c) fused image from the FFT-enhanced IHS transform method (1 m); (d) fused image from the IHS and wavelet integrated method (1 m); (e) fused
image from the IHS method (1 m); (f ) fused image from the PCA method (1 m).


Fig. 4. Histograms of the original multispectral and fused Ikonos images from the FFT-enhanced IHS method, the IHS and wavelet integrated method,
the IHS transform method, and the PCA method. The first column is for band 2, the second column for band 3, and the last column for band 4.

since this ensures that features spectrally separable in the original dataset are still separable in the fused dataset
(Chavez et al., 1991). In this study, several evaluation
methods from the work of Wald et al. (1997), Li et al.
(2002), Cakir and Khorram (2003), and Wang et al.

(2004) are employed to assess the quality of the fused image.
Different comparisons are conducted to evaluate the
FFT-enhanced IHS transform method by comparing the
fused image with the outputs obtained from other

Table 3
Inter-band correlation for the original multispectral and fused Ikonos images

                             Bands 2 and 3   Bands 2 and 4   Bands 3 and 4
Original image               0.98            0.17            0.26
FFT-enhanced IHS             0.98            0.17            0.27
IHS and wavelet integrated   0.98            0.17            0.27
IHS                          0.98            0.20            0.32
PCA                          0.98            0.09            0.18

Table 4
Correlation between the panchromatic image and other bands for the original multispectral and fused Ikonos images

                             Pan and band 2   Pan and band 3   Pan and band 4
Original                     0.57             0.65             0.62
FFT-enhanced IHS             0.60             0.67             0.66
IHS and wavelet integrated   0.60             0.65             0.65
IHS                          0.61             0.69             0.82
PCA                          0.50             0.56             0.85

Table 5
Correlation coefficients between the original multispectral and fused Ikonos images

                             Band 2   Band 3   Band 4
FFT-enhanced IHS             0.98     0.98     0.92
IHS and wavelet integrated   0.97     0.98     0.91
IHS                          0.95     0.94     0.82
PCA                          0.96     0.94     0.81

Table 7
Correlation coefficient between the intensity component of the fused Ikonos image and the Ikonos panchromatic image

FFT-enhanced IHS             0.98
IHS and wavelet integrated   0.97
IHS                          0.89
PCA                          0.79

methods including the IHS and wavelet integrated method of Zhang and Hong (2005), the standard IHS transform method, and the PCA method. First, the fused images are compared to the original multispectral image using visual means. To take the statistical distribution into account, a stacked image with results from different fusion methods and the original multispectral image is constructed, and then false color composites of the fused images are generated from the stacked image for visual comparison. Histograms of the fused images are also checked and compared to that of the original multispectral image. To quantify the spectral and spatial changes, inter-band correlations and the correlations between the panchromatic image and other bands of the multispectral image are analyzed for both the fused images and the original multispectral image. The correlation coefficient and band discrepancy (Li et al., 2002) between the fused image and corresponding original multispectral image are also calculated to assess the spectral quality of the fused images. The correlation coefficient between the intensity component of the fused image and that of the panchromatic image is used as a measure to assess the spatial quality. Finally, to evaluate the structural information of the fused image, the mean structural similarity (MSSIM) index (Wang et al., 2004) is computed for measuring the similarity between the fused and original multispectral images. Test data consist of Ikonos and QuickBird images over the Camp Lejeune study area. The tested fusion methods include the FFT-enhanced IHS transform, the IHS and wavelet integrated, the standard IHS transform, and the PCA methods.

Table 6
Computed band discrepancy between the original multispectral and fused Ikonos images

                             Band 2   Band 3   Band 4
FFT-enhanced IHS              5.04     4.57     7.84
IHS and wavelet integrated    7.75     6.26     8.20
IHS                          25.19    37.56    16.15
PCA                          16.43    26.05     9.86

Note: Lower values indicate more similar images while higher values indicate more discrepancy.

4.1. Fusion of Ikonos images

Fig. 3 shows the fusion results of the Ikonos images. A representative portion of the whole image showing a vegetated and bare ground area was selected for comparison of the fusion techniques. Fig. 3a and b are the
original Ikonos panchromatic image and multispectral
image, respectively. The fused images from the FFT-enhanced and the IHS and wavelet integrated methods
are shown in Fig. 3c and d, respectively. Fig. 3e and f are
the fused images from the traditional IHS transform and
the PCA methods.
Visual comparison reveals that all the fused images
inherited high spatial information from the panchromatic
image. As for spectral quality, Fig. 3c and d preserve the
spectral characteristics and appearance of bare ground
and vegetation in the original multispectral image. For
example, the dark green area, labeled with letter A, has
almost the same color and brightness in Fig. 3c and d as
that in the original multispectral image (Fig. 3b). By contrast, in Fig. 3e and f (the standard IHS and PCA fused images), the bare ground areas (A) appear reddish brown and vegetation (B) a darker red. Wetland and damp soil
are also easily distinguished. Thus, in this instance, the
apparent color distortion associated with IHS and PCA
images works to the advantage of the interpreter.
In addition to multiresolution data, this particular
example demonstrates the fusion of multitemporal
image data. The acquisition date of the input Ikonos
Pan image was 20 May 2000, while the multispectral
image was acquired a year later on 27 August 2001. The
input multispectral image, therefore, represents the most
current condition of the landscape and may explain the
difference in color between the IHS/PCA and FFT-enhanced/wavelet integrated IHS methods. Specifically,
Table 8
Computed MSSIM index between the original multispectral and fused Ikonos images

                             Band 2   Band 3   Band 4
FFT-enhanced IHS             0.862    0.853    0.848
IHS and wavelet integrated   0.850    0.871    0.823
IHS                          0.794    0.732    0.693
PCA                          0.710    0.679    0.684

Table 9
Computed MSSIM index between the Ikonos panchromatic image and the intensity component of the fused Ikonos image

FFT-enhanced IHS             0.856
IHS and wavelet integrated   0.849
IHS                          0.808
PCA                          0.786

in the standard IHS and PCA methods, since the intensity component is totally replaced by the panchromatic image, some low frequency information, for
example, the most current land cover information in area
(A) represented in the intensity component of the multispectral image, will not be kept in the fused image. The
color information of the original multispectral image,
then, may be modified by the panchromatic image. On
the other hand, the color information is well preserved in
the fused images from the FFT-enhanced IHS and the
IHS and wavelet integrated methods because only the
high frequency part of the panchromatic image is fused
to the multispectral image. In cases when the panchromatic image represents the most up-to-date condition, and the most current conditions for those areas with dramatic changes are preferred in the fused image, one may replace the low-frequency information
of the multispectral image with those of the panchromatic image via FFT filtering in the FFT-enhanced IHS
method. In this way, the spectral information of the
fused image may still be kept as close as possible to that
of the original multispectral image. From this point of
view, the FFT-enhanced IHS method is more flexible
than the other methods.
To compare with the original multispectral image, the
fused images are resampled to the resolution of the original multispectral image. Ideally, when a fused image is
resampled to its original resolution, the resulting image
should be as close as possible to the original image
(Wald et al., 1997). Checking the histograms of the
fused images after resampling and comparing them with
those of the original multispectral image provides a
quantitative test of this property (Cakir and Khorram,
2003). Histograms of the original multispectral image
and the fused images are presented in Fig. 4. It can be
seen from the figure that the histograms of the fused

Fig. 5. A representative portion of the original and fused QuickBird images: (a) QuickBird panchromatic image (0.61 m); (b) QuickBird multispectral
image (2.44 m); (c) fused image from the FFT-enhanced IHS transform method (0.61 m); (d) fused image from the IHS and wavelet integrated method
(0.61 m); (e) fused image from the IHS method (0.61 m); (f ) fused image from the PCA method (0.61 m).


Fig. 6. Histograms of the original QuickBird multispectral image and fused images from the FFT-enhanced IHS method, the IHS and wavelet
integrated method, the IHS transform method, and the PCA method. The first column is for band 2, the second column for band 3, and the last column for band 4.

images from the FFT-enhanced IHS and the IHS and wavelet integrated methods retain the broad normal
distribution and character, and are closer to those of the
original image than those from the standard IHS transform and the PCA methods.

Inter-band correlation is another property that should be preserved in image fusion. This property has been
used to quantify the spectral changes resulting from
image fusion (Carper et al., 1990; Cakir and Khorram,
2003). Table 3 compares the inter-band correlation of

Table 10
Inter-band correlation for the original and fused QuickBird images

                             Bands 2 and 3   Bands 2 and 4   Bands 3 and 4
Original image               0.97            0.55            0.53
FFT-enhanced IHS             0.97            0.55            0.54
IHS and wavelet integrated   0.97            0.55            0.53
IHS                          0.90            0.43            0.53
PCA                          0.88            0.37            0.61

Table 11
Correlation between the panchromatic image and other bands for the original and fused QuickBird images

                             Pan and band 2   Pan and band 3   Pan and band 4
Original                     0.91             0.90             0.83
FFT-enhanced IHS             0.91             0.90             0.83
IHS and wavelet integrated   0.90             0.90             0.83
IHS                          0.83             0.81             0.81
PCA                          0.83             0.74             0.73


Table 12
Correlation coefficients between the original and fused QuickBird images

                             Band 2   Band 3   Band 4
FFT-enhanced IHS             0.94     0.88     0.89
IHS and wavelet integrated   0.92     0.88     0.83
IHS                          0.73     0.69     0.73
PCA                          0.62     0.61     0.79

the fused images to that of the original multispectral image. While the differences are slight, the fused images
from the FFT-enhanced IHS and the IHS and wavelet
integrated methods have more similar inter-band
correlation to that of the original multispectral image
than the inter-band correlations from IHS transform and
PCA methods. This also is true from the analysis of
correlation between the panchromatic image and each
band of the multispectral image, as shown in Table 4.
The correlation coefficients and band discrepancy
between the original image bands and corresponding
bands of the fused images also are used to assess the
spectral quality of the fused image (Li et al., 2002). The
band discrepancy Dk is computed as
D_k = \frac{1}{n} \sum_{i} \sum_{j} \left| V'_{kij} - V_{kij} \right|

where V'_{kij} and V_{kij} are the pixel values of the fused image and the corresponding original multispectral image, respectively; k is the kth band, and i and j are the ith row and the jth column, respectively; and n is the total number of pixels in the image. A small discrepancy
between the fused image and the corresponding original
multispectral image is desired. The correlation coefficient and band discrepancy between each original band
and corresponding band of the fused images are calculated and summarized in Tables 5 and 6. As seen from
these tables, the FFT-enhanced IHS and the IHS and
wavelet integrated methods produce better result with
higher correlation coefficient and less band discrepancy.
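The band discrepancy measure above is simply the mean absolute per-pixel difference, which can be computed per band as follows (function name illustrative):

```python
import numpy as np

def band_discrepancy(fused_band, original_band):
    """D_k of Li et al. (2002): mean absolute difference between one band
    of the fused image and the corresponding original band."""
    diff = fused_band.astype(float) - original_band.astype(float)
    return np.abs(diff).mean()
```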
The correlation coefficient between the intensity
component (after being rescaled to the same value range as

Table 13
Computed band discrepancy between the original and fused QuickBird images

                             Band 2   Band 3   Band 4
FFT-enhanced IHS             2.32      1.39     2.71
IHS and wavelet integrated   2.89      2.15     4.28
IHS                          3.39      5.54     8.78
PCA                          3.54     14.90    13.59

Table 14
Correlation coefficient between the intensity component of the fused QuickBird image and the QuickBird panchromatic image

FFT-enhanced IHS             0.98
IHS and wavelet integrated   0.97
IHS                          0.87
PCA                          0.83

that of the panchromatic image) of the fused image and the panchromatic image is used as a measure to assess the
spatial quality of the fused image. Higher correlation
between the intensity component of the fused image and
that of the panchromatic image indicates that more spatial
information from the panchromatic image is incorporated
during fusion. Table 7 presents the calculated correlation
coefficient between the intensity component of the fused
image and the panchromatic image. It shows that the fused
images from the FFT-enhanced IHS and the IHS and
wavelet integrated methods have higher spatial correlation, which indicates more information from the panchromatic image is incorporated in the fused image.
Finally, the mean structural similarity (MSSIM)
index between the fused image and the original multispectral image is calculated to evaluate the structural
information and overall quality of the fused image. The
MSSIM index is computed as (Wang et al., 2004):
\mathrm{MSSIM}(X, Y) = \frac{1}{M} \sum_{j=1}^{M} \mathrm{SSIM}(x_j, y_j)

where X and Y are the reference and the evaluated image, respectively; x_j and y_j are the image contents at the jth local window; and M is the number of local windows of the image. The structural similarity (SSIM) index is defined as:
\mathrm{SSIM}(x, y) = \frac{(2\mu_x \mu_y + C_1)(2\sigma_{xy} + C_2)}{(\mu_x^2 + \mu_y^2 + C_1)(\sigma_x^2 + \sigma_y^2 + C_2)}

where μ is the mean intensity, σ is the standard deviation, σ_xy is the covariance, and C1 and C2 are constants. The SSIM index

Table 15
Computed MSSIM index between the original multispectral and fused QuickBird images

                             Band 2   Band 3   Band 4
FFT-enhanced IHS             0.898    0.867    0.853
IHS and wavelet integrated   0.881    0.890    0.845
IHS                          0.876    0.846    0.812
PCA                          0.822    0.850    0.783

The SSIM index method provides more favorable quantitative measures
of the performance of objective quality assessment
than traditional models such as the mean square
error (MSE) and the peak signal-to-noise ratio (PSNR)
(Wang et al., 2004). A higher MSSIM index value
indicates greater similarity between the fused image and
the original multispectral image. Tables 8 and 9 present
the computed MSSIM index between the original multispectral and fused Ikonos images and the MSSIM index
between the Ikonos panchromatic image and the intensity component (after being rescaled to the same value range
as that of the panchromatic image) of the fused Ikonos
image, respectively. Both show that the fused images
from the FFT-enhanced IHS and the IHS and wavelet
integrated methods are more similar to the original
multispectral image than those from the IHS and PCA
methods.

4.2. Fusion of QuickBird images

Fig. 5 shows the fusion results for the QuickBird
images of an urban-type setting within Camp Lejeune.
The original QuickBird panchromatic and multispectral
images are shown in Fig. 5a and b. Fig. 5c and d are the
fused images from the FFT-enhanced IHS and the IHS
and wavelet integrated methods, respectively, and
Fig. 5e and f are the fused images from the traditional
IHS transform and the PCA methods. From the visual
comparison, color distortion can be seen in Fig. 5e and f.
Compared to the results from the Ikonos images, the
color distortion in the fused IHS and PCA QuickBird
images is more noticeable. This may indicate that as the
spatial resolution increases, the color distortion becomes
more serious in the fused image when the standard IHS
and PCA methods are applied.
Fig. 6 compares the histograms of the fused images with that of the original multispectral image, and
Tables 10–16 give the results of the statistical and structural similarity comparisons. Once again it is evident
that the FFT-enhanced IHS and the IHS and wavelet
integrated methods may yield better results than the
standard IHS transform and PCA methods in terms of
retaining the spectral and spatial values of the original
image.

Table 16
Computed MSSIM index between the QuickBird panchromatic image and the intensity component of the fused QuickBird image

Method                        MSSIM
FFT-enhanced IHS              0.958
IHS and wavelet integrated    0.946
IHS                           0.923
PCA                           0.892

5. Conclusion

An FFT-enhanced IHS transform method has been
presented in this paper to fuse the new generation of high-resolution commercial satellite images. This method
combines a standard IHS transform with FFT filtering
of both the panchromatic image and the intensity component of the original multispectral image. Experimental
results demonstrate that the FFT-enhanced IHS method
can preserve the spectral characteristics of the input
multispectral image to a greater extent than the standard
IHS transform and PCA methods, while inheriting the
spatial integrity of the panchromatic image. Between
the FFT-enhanced IHS and the IHS and wavelet integrated methods, the FFT-enhanced IHS method produces slightly better results. A further improvement of the
FFT-enhanced IHS method would be to automate the design
of optimal filters for modifying the intensity component of the original multispectral image. Similarly, the partial replacement approach based on FFT filtering may also be
suitable for enhancing the PCA method. Other approaches to quality assessment of the fused image, such
as comparison of land-use maps obtained after spectral
(and possibly textural) classification, also warrant future
investigation.

Acknowledgements

Support for the Ikonos and QuickBird image data
was provided by the Center for Remote Sensing and
Mapping Science (CRMS), Department of Geography,
the University of Georgia, and the National Geospatial-Intelligence
Agency (NGA) under Cooperative
Agreement NIMA 201-00-1-1006, "Assessing the Ability of Commercial Sensors to Satisfy Littoral Warfare
Data Requirements." The authors would like to express
their appreciation to Dr. Roy Welch, Dr. Richard Brand,
and Dr. Scott Loomer for their initiative and assistance
throughout the project. We also gratefully acknowledge
the many individuals of the CRMS who worked on this
project.

References

Cakir, H.I., Khorram, S., 2003. Fusion of high spatial resolution
imagery with high spectral resolution imagery using multiresolution approach. ASPRS Annual Conference Proceedings, May 2003,
Anchorage, Alaska, on CD-ROM.
Carper, W.J., Lillesand, T.W., Kiefer, R.W., 1990. The use of
intensity–hue–saturation transformations for merging SPOT panchromatic and multispectral image data. Photogrammetric Engineering and Remote Sensing 56 (4), 459–467.
Chavez, P.S., Sides, S.C., Anderson, J.A., 1991. Comparison of three
different methods to merge multiresolution and multispectral data:
TM & SPOT pan. Photogrammetric Engineering and Remote
Sensing 57 (3), 295–303.
Chen, C.M., Hepner, G.F., Forster, R.R., 2003. Fusion of hyperspectral
and radar data using the IHS transformation to enhance urban
surface features. ISPRS Journal of Photogrammetry and Remote
Sensing 58 (1), 19–30.
Chen, Y., Fung, T., Lin, W., Wang, J., 2005. An image fusion method
based on object-oriented image classification. Geoscience and
Remote Sensing Symposium, 2005. IGARSS '05. Proceedings,
2005 IEEE International, vol. 6, pp. 3924–3927.
Cliche, G., Bonn, F., Teillet, P., 1985. Integration of the SPOT
Pan channel into its multispectral mode for image sharpness
enhancement. Photogrammetric Engineering and Remote Sensing
51 (3), 311–316.
Couloigner, I., Ranchin, T., Valtonen, V., Wald, L., 1998. Benefit of the
future SPOT-5 and of data fusion to urban roads mapping.
International Journal of Remote Sensing 19 (8), 1519–1532.
Dai, X., Khorram, S., 1999. Data fusion using artificial neural networks: a
case study on multitemporal change analysis. Computers, Environment and Urban Systems 23 (1), 19–31.
Ehlers, M., 1991. Multisensor image fusion techniques in remote
sensing. ISPRS Journal of Photogrammetry and Remote Sensing
46 (1), 19–30.
Gonzalez, R.C., Woods, R.E., 2002. Digital Image Processing, 2nd
edition. Prentice Hall. 793 pp.
Kurz, F., Hellwich, O., 2000. Empirical estimation of vegetation
parameters using multisensor data fusion. International Archives of
Photogrammetry and Remote Sensing 33 (Part B7), 733–737.
Lau, W., King, B.A., Li, Z., 2000. The influences of image
classification by fusion of spatially oriented images. International
Archives of Photogrammetry and Remote Sensing 33 (Part B7),
752–759.
Li, S., Kwok, J.T., Wang, Y., 2002. Using the discrete wavelet
transform to merge Landsat TM and SPOT panchromatic images.
Information Fusion 3 (1), 17–23.
Lillesand, T.M., Kiefer, R.W., Chipman, J.W., 2004. Remote Sensing
and Image Interpretation, 5th edition. John Wiley & Sons.
Núñez, J., Otazu, X., Fors, O., Prades, A., Palà, V., Arbiol, R., 1999.
Multiresolution-based image fusion with adaptive wavelet decomposition. IEEE Transactions on Geoscience and Remote Sensing
37 (3), 1204–1211.
Onslow County, North Carolina, 2005. http://www.onslowcountyschools.
org/OCinfosheet.htm (accessed November 3, 2006).
Pellemans, A.H.J.M., Jordans, R.W.L., Allewijn, R., 1993. Merging
multispectral and panchromatic SPOT images with respect to the
radiometric properties of the sensor. Photogrammetric Engineering
and Remote Sensing 59 (1), 81–87.
Pohl, C., Van Genderen, J.L., 1998. Multisensor image fusion in
remote sensing: concepts, methods and applications. International
Journal of Remote Sensing 19 (5), 823–854.
Price, J.C., 1987. Combining panchromatic and multispectral imagery
from dual resolution satellite instruments. Remote Sensing of
Environment 21 (9), 119–128.
Ranchin, T., Wald, L., 2000. Fusion of high spatial and spectral
resolution images: the ARSIS concept and its implementation.
Photogrammetric Engineering and Remote Sensing 66 (1), 49–61.
Shettigara, V.K., 1992. A generalized component substitution
technique for spatial enhancement of multispectral images using
a higher resolution data set. Photogrammetric Engineering and
Remote Sensing 58 (5), 561–567.
Smith, W.S., 1999. The Scientist and Engineer's Guide to Digital Signal
Processing, 2nd edition. California Technical Publishing.
Sun, W., Heidt, V., Gong, P., Xu, G., 2003. Information fusion for
rural land-use classification with high-resolution satellite imagery.
IEEE Transactions on Geoscience and Remote Sensing 41 (4),
883–890.
Tu, T.M., Su, S.C., Shyu, H.C., Huang, P.S., 2001. A new look at IHS-like
image fusion methods. Information Fusion 2 (3), 177–186.
Van Der Meer, F., 1997. What does multisensor image fusion add in
terms of information content for visual interpretation? International
Journal of Remote Sensing 18 (2), 445–452.
Wald, L., Ranchin, T., Mangolini, M., 1997. Fusion of satellite images
of different spatial resolutions: assessing the quality of resulting
images. Photogrammetric Engineering and Remote Sensing 63 (6),
691–699.
Wang, Z., Bovik, A.C., Sheikh, H.R., Simoncelli, E.P., 2004. Image
quality assessment: from error visibility to structural similarity.
IEEE Transactions on Image Processing 13 (4), 600–612.
Welch, R., Ehlers, M., 1987. Merging multiresolution SPOT HRV and
Landsat TM data. Photogrammetric Engineering and Remote
Sensing 53 (3), 301–303.
Yesou, H., Besnus, Y., Rolet, J., 1993. Extraction of spectral
information from Landsat TM data and merger with SPOT
panchromatic imagery: a contribution to the study of geological
structures. ISPRS Journal of Photogrammetry and Remote Sensing
48 (5), 23–36.
Zhang, Y., 1999. A new merging method and its spectral and spatial
effects. International Journal of Remote Sensing 20 (10),
2003–2014.
Zhang, Y., 2002a. Automatic image fusion: a new sharpening
technique for Ikonos multispectral images. GIM International 16
(5), 54–57.
Zhang, Y., 2002b. Problems in the fusion of commercial high-resolution
satellite images as well as Landsat 7 images and initial
solutions. International Archives of Photogrammetry and Remote
Sensing 34 (Part 4) (on CD-ROM).
Zhang, Y., Hong, G., 2005. An IHS and wavelet integrated approach to
improve pan-sharpening visual quality of natural colour Ikonos
and QuickBird images. Information Fusion 6 (3), 225–234.
Zhou, J., Civco, D.L., Silander, J.A., 1998. A wavelet transform
method to merge Landsat TM and SPOT panchromatic data.
International Journal of Remote Sensing 19 (4), 743–757.
