Figure 2 shows the filters for a = 1.0720. Figure 3 shows the resulting scaling function and wavelets. Table I lists the coefficients of the filters $h_0$, $h_1$, and $h_2$.
2.3.5 Two-Dimensional Extension: The 2D extension can be obtained by alternating between rows and columns, as is usually done for standard discrete wavelet transforms. The corresponding filter bank, illustrated in Fig. 4, is iterated on the lowpass branch (the first branch).
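As a rough illustration of this separable scheme, the following Python sketch filters an image along its rows and then its columns with three filters and iterates only on the lowpass/lowpass band; the filter coefficients and the helper name analyze_2d are placeholders chosen for the example, not the coefficients of Table I.

import numpy as np
from scipy.ndimage import correlate1d

def analyze_2d(image, filters, levels=2):
    # Separable 2-D extension (sketch): at each level the current lowpass
    # image is filtered along the rows and then along the columns with every
    # filter, and only the lowpass/lowpass band feeds the next level.
    low = image.astype(float)
    subbands = []
    for _ in range(levels):
        level = {}
        for i, h_row in enumerate(filters):
            tmp = correlate1d(low, h_row, axis=1, mode='reflect')                # along rows
            for j, h_col in enumerate(filters):
                level[(i, j)] = correlate1d(tmp, h_col, axis=0, mode='reflect')  # along columns
        subbands.append(level)
        low = level[(0, 0)]  # iterate on the lowpass branch
    return subbands

# Placeholder filters (NOT the coefficients listed in Table I)
h0 = np.array([0.25, 0.5, 0.25])     # lowpass
h1 = np.array([-0.5, 0.0, 0.5])      # bandpass-like
h2 = np.array([-0.25, 0.5, -0.25])   # highpass-like

bands = analyze_2d(np.random.rand(64, 64), [h0, h1, h2], levels=2)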
3. THE PROPOSED FUSION METHOD
To apply any of the image fusion methods described in this paper, the MS image and the PAN image must be accurately superimposed. Thus, both images must be co-registered, and the MS image resampled so that its pixel size matches that of the PAN image. To achieve this, a robust registration technique and a bi-cubic interpolator were used.
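As a minimal sketch of the resampling step, the following Python snippet upsamples a co-registered MS band to the PAN pixel grid; the order-3 spline interpolation of scipy.ndimage.zoom is used here as a stand-in for the bi-cubic interpolator, and the function name resample_ms_to_pan is hypothetical.

import numpy as np
from scipy.ndimage import zoom

def resample_ms_to_pan(ms_band, pan_shape):
    # Upsample the MS band so that its pixel size matches that of the PAN
    # image and both images can be superimposed pixel by pixel.
    factors = (pan_shape[0] / ms_band.shape[0], pan_shape[1] / ms_band.shape[1])
    return zoom(ms_band.astype(float), factors, order=3)

# e.g. a 4 m MS band brought to a 1 m PAN grid (a factor of 4 in each direction)
ms_band = np.random.rand(256, 256)
ms_on_pan_grid = resample_ms_to_pan(ms_band, (1024, 1024))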
Figure 4. An oversampled filterbank for a 2-D image
3.1 FIHS Fusion Method
The FIHS fusion for each pixel can be formulated by the
following procedure (Tu et al., 2001):
$$
\begin{bmatrix} F(R) \\ F(G) \\ F(B) \end{bmatrix}
=
\begin{bmatrix} R + (\mathrm{PAN} - I) \\ G + (\mathrm{PAN} - I) \\ B + (\mathrm{PAN} - I) \end{bmatrix}
\qquad (15)
$$
where F(X) is the fused image of the X band, for X = R, G, and
B, respectively.
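The per-pixel procedure above translates directly into code. The following Python sketch assumes that the intensity I is the mean of the three resampled MS bands, as in the standard FIHS formulation; the function name fihs_fuse is hypothetical.

import numpy as np

def fihs_fuse(r, g, b, pan):
    # FIHS fusion: compute the intensity I and add the same difference
    # (PAN - I) to every MS band.
    i = (r + g + b) / 3.0
    delta = pan - i
    return r + delta, g + delta, b + delta

# toy example on co-registered, resampled bands
r = g = b = np.full((4, 4), 0.3)
pan = np.full((4, 4), 0.5)
fused_r, fused_g, fused_b = fihs_fuse(r, g, b, pan)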
3.2 The Hybrid Method Proposed by González-Audícana et al.
A multiresolution wavelet decomposition is used to carry out the detail extraction phase, and the IHS procedure is followed to inject the spatial detail of the PAN image into the MS image. In other words, instead of using the PAN image in Eq. (15), the result of fusing the PAN image and the intensity image by the substitutive wavelet method is used. The fusion of the PAN image and the intensity image is expressed as follows:
$$
I_{\mathrm{new}} = I_{L} + \sum_{k=1}^{n} W_{\mathrm{PAN}_k} \qquad (16)
$$
where $I_{L}$ is the low-frequency version of the wavelet-transformed intensity image and $\sum_{k=1}^{n} W_{\mathrm{PAN}_k}$ is the sum of the high-frequency versions of the wavelet-transformed PAN image.
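A minimal sketch of Eq. (16) using PyWavelets follows; the decimated 'db4' decomposition over three levels is an illustrative assumption only (the oversampled filter bank of Section 2 could be used instead), and substitutive_wavelet_fusion is a hypothetical helper.

import numpy as np
import pywt

def substitutive_wavelet_fusion(intensity, pan, wavelet='db4', levels=3):
    # I_new of Eq. (16): keep the low-frequency approximation of the intensity
    # image and take every high-frequency (detail) subband from the PAN image.
    ci = pywt.wavedec2(intensity, wavelet, level=levels)
    cp = pywt.wavedec2(pan, wavelet, level=levels)
    fused = [ci[0]] + cp[1:]
    return pywt.waverec2(fused, wavelet)

intensity = np.random.rand(256, 256)
pan = np.random.rand(256, 256)
i_new = substitutive_wavelet_fusion(intensity, pan)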
3.3 The Proposed Hybrid Method
Assume, without loss of generality, that the hybrid method is based on the FIHS fusion method instead of the traditional IHS method; this is possible because Eq. (15) holds.
The hybrid method can be simplified with the following
procedure:
$$
\begin{bmatrix} F(R) \\ F(G) \\ F(B) \end{bmatrix}
=
\begin{bmatrix} R + (I_{\mathrm{new}} - I) \\ G + (I_{\mathrm{new}} - I) \\ B + (I_{\mathrm{new}} - I) \end{bmatrix}
=
\begin{bmatrix} R + \sum_{k=1}^{n} W_{(\mathrm{PAN}-I)_k} \\ G + \sum_{k=1}^{n} W_{(\mathrm{PAN}-I)_k} \\ B + \sum_{k=1}^{n} W_{(\mathrm{PAN}-I)_k} \end{bmatrix}
$$
where $\sum_{k=1}^{n} W_{(\mathrm{PAN}-I)_k}$ is the sum of the high-frequency versions of the wavelet-transformed difference image of the PAN image and the I image.
As a result, fused images are easily obtained with the fast scheme of the hybrid method: we simply add to each MS band the detail information extracted from the difference image of the PAN image and the intensity image. Therefore, the proposed hybrid method is much simpler and faster than the original hybrid method.
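A minimal sketch of this fast scheme follows, again with PyWavelets and a 'db4' decomposition as illustrative assumptions; the high-frequency detail of (PAN - I) is obtained by zeroing the approximation band before reconstruction, and fast_hybrid_fuse is a hypothetical helper.

import numpy as np
import pywt

def fast_hybrid_fuse(ms_bands, pan, wavelet='db4', levels=3):
    # Fast hybrid scheme (sketch): wavelet-transform the difference image
    # (PAN - I), discard its low-frequency part, and add the remaining
    # high-frequency detail to every MS band.
    i = np.mean(ms_bands, axis=0)                 # intensity as the mean of the MS bands
    coeffs = pywt.wavedec2(pan - i, wavelet, level=levels)
    coeffs[0] = np.zeros_like(coeffs[0])          # drop the low-frequency approximation
    detail = pywt.waverec2(coeffs, wavelet)       # sum of the high-frequency versions
    return [band + detail for band in ms_bands]

ms = [np.random.rand(256, 256) for _ in range(3)]  # R, G, B
pan = np.random.rand(256, 256)
fused_r, fused_g, fused_b = fast_hybrid_fuse(ms, pan)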
3.4 IKONOS Pan-sharpening Technique
When IHS-like fusion methods are used with IKONOS imagery,
there is a significant color distortion, due primarily to the
extensive range of wavelengths in an IKONOS PAN image.
This difference obviously induces the color distortion problem in
IHS fusion as a result of the mismatch; that is, the PAN image
and the intensity image are spectrally dissimilar. To minimize
the radiance differences between the I image and the PAN
image, Tu et al. (2004) introduced the near-infrared (NIR) band
with spectral adjustment applied to the I image, considering that
$$
I' = \frac{R + a \cdot G + b \cdot B + \mathrm{NIR}}{3}
$$
where a and b are weighting parameters defined to take into
account that the spectral response of the PAN image does not
cover that of the blue and green bands. The value of these
parameters was estimated experimentally after the fusion of 92
IKONOS images, covering different areas. According to the
experimental results obtained by Tu et al. (2004), the best