International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol XXXV, Part B7. Istanbul 2004
The image on the left is the quality matrix made up of the correlation coefficients calculated for the QuickBird panchromatic band and the original multispectral band-1. The image on the right shows the corresponding quality matrix for the QuickBird panchromatic band and the fused multispectral band-1. In this way, the responses of some geographic or physical features to the fusion algorithm can be detected. For example, the image on the left is darker than the one on the right, which implies that the image on the right is made up of larger correlation coefficients. This result is confirmed by Table 2: the overall correlation coefficients for the original multispectral band-1 and the fused band-1 are 0.69 and 0.77, respectively.
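The per-pixel quality matrix described above can be sketched as a sliding-window correlation coefficient. The following is an illustrative NumPy sketch; the function name and the window size are our own assumptions, not taken from the paper:

```python
import numpy as np

def local_correlation_matrix(band_a, band_b, win=9):
    """Quality matrix: correlation coefficient between two co-registered
    bands, computed in a win x win neighborhood around every pixel."""
    assert band_a.shape == band_b.shape
    h, w = band_a.shape
    r = win // 2
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            # Clip the window at image borders.
            i0, i1 = max(0, i - r), min(h, i + r + 1)
            j0, j1 = max(0, j - r), min(w, j + r + 1)
            a = band_a[i0:i1, j0:j1].ravel()
            b = band_b[i0:i1, j0:j1].ravel()
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            # Constant windows have undefined correlation; report 0.
            out[i, j] = (a * b).sum() / denom if denom > 0 else 0.0
    return out
```

Displaying such a matrix as a grayscale image gives exactly the left/right comparison discussed above: darker regions correspond to lower local correlation.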
6. CONCLUSIONS
It is demonstrated that, when different sensors are used, image co-registration becomes important. Even if two input images have the same projection and datum, they are generated independently with different processing steps, sensor models, trajectory data and ground truth. It is observed that the same ground features in the Ikonos and QuickBird images exhibit apparent mis-registration. For this reason, while preparing the images for the fusion process, careful attention must be paid to ensuring that the same pixels in the two images represent the same geographic position on the ground.
For different sensors, the temporal difference between the acquisitions of the two images also causes problems. If a feature in one of the images no longer exists, or has changed partially, the result will be poor quality in the fused image.
In general, a good fusion approach should retain the maximum
spatial and spectral information from the original images and
should not damage the internal relationship among the original
bands. Based on these three criteria, correlation coefficients are
used to quantitatively evaluate the image fusion results. The
higher correlation coefficients between the panchromatic image
and the fused image imply the improvement in spatial content
when compared to the correlation coefficient calculated for
panchromatic and original multispectral images. Likewise, a
fused image should have high correlation to the corresponding
original multispectral image to retain spectral information. In
addition, the fused multispectral images should preserve the
same correlation properties as the ones of the original
multispectral images. Therefore, their difference needs to be
small. Fusion quality can also be evaluated locally, where
correlation coefficients are calculated within a neighborhood of
a pixel. In this way, the proposed quality measure can help in understanding the responses of different geographic features to the fusion algorithm. It is shown that the fused image has over 0.9 correlation with the original multispectral images (except band 1, the blue band) and 0.7 correlation with the panchromatic image. These are the highest among the tested existing fusion methods: PCA, Brovey and multiplicative. This reflects good retention of both spatial and spectral information during the fusion process.
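The three correlation-based criteria above can be summarized in a small evaluation routine. This is a sketch under our own assumptions (function names and the scalar summary of the inter-band criterion are ours, not the paper's):

```python
import numpy as np

def corr(x, y):
    """Global correlation coefficient between two bands."""
    x = x.ravel() - x.mean()
    y = y.ravel() - y.mean()
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

def fusion_quality(pan, ms_bands, fused_bands):
    """Evaluate the three criteria from the text:
    1. spatial gain:  corr(pan, fused) should rise vs. corr(pan, original);
    2. spectral retention: corr(original, fused) should stay close to 1;
    3. inter-band preservation: the correlation matrix of the fused bands
       should differ little from that of the original bands."""
    spatial = [corr(pan, f) for f in fused_bands]
    spectral = [corr(m, f) for m, f in zip(ms_bands, fused_bands)]
    n = len(ms_bands)
    interband_diff = max(
        abs(corr(ms_bands[i], ms_bands[j]) - corr(fused_bands[i], fused_bands[j]))
        for i in range(n) for j in range(i + 1, n))
    return spatial, spectral, interband_diff
```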
This study is a successful experience with the wavelet-transform-based fusion approach. It is shown that the proposed wavelet transform approach improves the spatial resolution of a multispectral image while also preserving much of its spectral content. Some features that cannot be perceived in the original multispectral images are discernible in the fused ones. By properly designing the rules for combining the wavelet transform coefficients, color distortion can be minimized. Fusion results preserve the same color appearance
as the original multispectral images, even when images
collected by different sensors are involved.
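One common substitutive combination rule keeps the multispectral approximation coefficients and injects the panchromatic detail coefficients. The sketch below implements it with a one-level Haar transform in plain NumPy; it is an illustration of the general approach under our own assumptions, not necessarily the exact rule used in this study:

```python
import numpy as np

def haar2(x):
    """One-level 2-D Haar transform (image sides must be even).
    Returns approximation LL and details LH, HL, HH."""
    a = x[0::2, 0::2]; b = x[0::2, 1::2]
    c = x[1::2, 0::2]; d = x[1::2, 1::2]
    ll = (a + b + c + d) / 2
    lh = (a + b - c - d) / 2
    hl = (a - b + c - d) / 2
    hh = (a - b - c + d) / 2
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Inverse of haar2 (perfect reconstruction)."""
    a = (ll + lh + hl + hh) / 2
    b = (ll + lh - hl - hh) / 2
    c = (ll - lh + hl - hh) / 2
    d = (ll - lh - hl + hh) / 2
    x = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    x[0::2, 0::2] = a; x[0::2, 1::2] = b
    x[1::2, 0::2] = c; x[1::2, 1::2] = d
    return x

def haar_fuse(ms_band, pan):
    """Substitutive wavelet fusion: approximation from the (upsampled)
    multispectral band, detail from the panchromatic band."""
    ll_ms, _, _, _ = haar2(ms_band)
    _, lh, hl, hh = haar2(pan)
    return ihaar2(ll_ms, lh, hl, hh)
```

Because the approximation subband carries the color information and the detail subbands carry the edges, this rule explains why the fused result can keep the multispectral color appearance while gaining panchromatic spatial detail.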
Finally, different wavelets tend to yield different fusion quality.
This is observed when comparing the fusion results obtained
from the Haar and Daubechies wavelets. It is shown that the Haar wavelet may cause squared feature-boundary effects, whereas the Daubechies wavelet produces smooth and natural transitions. This topic, along with further formulation of fusion quality, will be our future effort of study.
7. REFERENCES
Chavez, P.S. Jr., Sides, S.C. and Anderson, J.A., 1991.
Comparison of three different methods to merge
multiresolution and multispectral data: Landsat TM and SPOT
panchromatic. Photogrammetric Engineering and Remote
Sensing, Vol. 57, No. 3, pp. 295-303.
Chibani, Y., Houacine, A., 2002. The joint use of IHS
transform and redundant wavelet decomposition for fusing
multispectral and panchromatic images. Int. J. Remote Sensing,
Vol. 23, No. 18, pp. 3821-3833.
Du, Y., Vachon, P.W., van der Sanden, J.J., 2003. Satellite
image fusion with multiscale wavelet analysis for marine
applications: preserving spatial information and minimizing
artifacts (PSIMA). Can. J. Remote Sensing, Vol. 29, No. 1, pp.
14-23.
Hill, P., Canagarajah, N., Bull, D., 2002. Image fusion using
complex wavelets. Proceedings of the 13th British Machine
Vision Conference, University of Cardiff, 2-5 September 2002.
Li, H., 1994. Multi-sensor image fusion using the wavelet
transform. ICIP-94, IEEE International Conference on Image
Processing, Vol. 1, 13-16 November, pp. 51-55.
Misiti, M., 2002. Wavelet toolbox for use with MATLAB.
Wavelet toolbox user's guide by The MathWorks, Inc.
http://www.mathworks.com/access/helpdesk/help/pdf_doc/wavelet/wavelet_ug.pdf
(accessed 04/15/2004)
Nikolov, S., Hill, P., Bull, D., Canagarajah, N., 2001. Wavelets
for image fusion. In: Petrosian, A.A. and Meyer, F.G. (Eds.),
Wavelets in Signal and Image Analysis. Kluwer, Netherlands,
pp. 213-239.
Pohl, C., van Genderen, J.L., 1998. Multisensor image fusion
in remote sensing: concepts, methods and applications. Int. J.
Remote Sensing, Vol. 19, No. 5, pp. 823-854.
Ranchin, T., Aiazzi, B., Alparone, L., Baronti, S., and Wald, L.,
2003. Image fusion - the ARSIS concept and some successful
implementation schemes. ISPRS Journal of Photogrammetry
and Remote Sensing, Vol. 58, Issues 1-2, June, pp. 4-18.
Zhou, J., Civco, D.L. and Silander, J.A., 1998. A wavelet
transform method to merge Landsat TM and SPOT
panchromatic data. Int. J. Remote Sensing, Vol. 19, No. 4, pp.
743-757.