FUSION OF SAR IMAGES (PALSAR AND RADARSAT-1) WITH MULTISPECTRAL
SPOT IMAGE: A COMPARATIVE ANALYSIS OF RESULTING IMAGES
S. Abdikan a, *, F. Balik Sanli a, F. Bektas Balcik b, C. Goksel b
a Yildiz Technical University, Civil Engineering Faculty, Department of Geodesy and Photogrammetry, 34349 Istanbul,
Turkey - (sabdikan, fbalik)@yildiz.edu.tr
b Istanbul Technical University, Civil Engineering Faculty, Department of Geodesy and Photogrammetry, 34469 Istanbul,
Turkey - (bektasfi, goksel)@itu.edu.tr
Commission VII, WG VII/6
KEY WORDS: Fusion, PALSAR, RADARSAT-1, SPOT, Statistical analysis
ABSTRACT:
Land use information is one of the most useful inputs for forming policies on economic and environmental issues at national
and also at global levels. With this in view, sensors with different spectral, temporal and spatial characteristics have been
developed. On the other hand, for some applications single-sensor imagery might not be sufficient to gather information from
the earth's surface. Fusion of satellite images with different spatial and spectral resolutions therefore plays an important role in
information extraction. Image fusion makes use of two or more complementary images/spectral bands of the same or different
sensors over the same area, to obtain more information and to enhance the quality of image interpretation. This research
investigates the quality assessment of SAR data fused with optical imagery. Two SAR datasets from different sensors, namely
RADARSAT-1 and PALSAR, were fused with SPOT-2 data. Although the PALSAR and RADARSAT-1 images have the same
resolution and polarisation, they were acquired at different frequencies (L band and C band, respectively). In this case the
operating frequency, a key factor in penetration depth, allows the effects on information extraction from the fused images to be
observed. This paper presents a comparative study of multisensor image fusion techniques in terms of preserving the spectral
quality of the fused images. IHS (Intensity-Hue-Saturation), HPF (High Pass Filtering), two-dimensional DWT (Discrete
Wavelet Transformation) and PCA (Principal Component Analysis) techniques were implemented for image fusion. For the
statistical analysis, bias, correlation coefficient (CC), Difference in Variance (DIV), Standard Deviation Difference (SDD) and
Universal Image Quality Index (UIQI) measures were applied to the fused images. The statistical results show that the DWT
fusion technique gives better results for both RADARSAT-SPOT and PALSAR-SPOT fused images.
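The spectral-quality measures named in the abstract (bias, CC, DIV, SDD and UIQI) can all be computed from simple band statistics. The sketch below, in Python with NumPy, uses common definitions of these measures; the exact normalisations used in the paper may differ, so treat it as an illustration rather than the paper's implementation.

```python
import numpy as np

def fusion_quality_metrics(orig, fused):
    """Spectral-quality measures between an original band and its fused
    counterpart. Definitions follow common usage: bias is the difference
    of means, DIV the relative difference in variance, SDD the standard
    deviation of the difference image, and UIQI the Wang-Bovik index."""
    x = orig.astype(float).ravel()
    y = fused.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()

    bias = mx - my                              # difference of mean values
    cc = cov / np.sqrt(vx * vy)                 # correlation coefficient
    div = (vx - vy) / vx                        # difference in variance
    sdd = (x - y).std()                         # std. dev. of difference image
    uiqi = 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))
    return {"bias": bias, "cc": cc, "div": div, "sdd": sdd, "uiqi": uiqi}
```

A fused band identical to the original yields bias = DIV = SDD = 0 and CC = UIQI = 1; spectral distortion moves the measures away from these ideal values.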
1. INTRODUCTION
Multisensor data fusion in remote sensing is used when the
quality of quantitative analysis needs to be increased. Because
multisensor data with different spectral or spatial characteristics
carry complementary information, image fusion can facilitate
image interpretation (Zhou et al., 1998). Image fusion can be
carried out with several algorithms, and the integration of
multi-source data is essential for many applications.
In recent years, the launches of new SAR satellites such as
ENVISAT, ALOS, TerraSAR-X and RADARSAT-2 have
opened a new era for remote sensing. Previous studies proved
that the combination of optical and microwave data provides
more accurate identification than the results obtained with the
individual sensors (Aschbacher and Lichtenegger, 1990). Since
the radar response is more a function of geometry and structure
than of surface reflection, as in optical images (Pohl and van
Genderen, 1998), using these multiple types of sensors for
image fusion increases the quality of the images.
IHS (Li and Wang, 2001; Tsai, 2004), Brovey
Transformation (Binh et al., 2006), PCA (Amarsaikhan and
Douglas, 2004), HPF (Bethune et al., 1998; Aiazzi et al., 2006a),
DWT (Zhang and Hong, 2005; Jin et al., 2006), Gram-Schmidt
Transformation (GST) (Aiazzi et al., 2006b), Smoothing Filter
Based Intensity Modulation (SFIM) (Liu, 2000) and Synthetic
Variable Ratio (SVR) (Zhang, 1999) are some of the image
fusion techniques generally used.
* Corresponding author: Email: sabdikan@yildiz.edu.tr
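Of the techniques listed above, IHS substitution is among the simplest: the intensity component of a 3-band composite is replaced by a histogram-matched high-resolution band (here this would be the SAR image). The sketch below uses the "fast IHS" additive formulation with I = (R + G + B) / 3 and a mean/standard-deviation histogram match; it is a minimal illustration under these assumptions, not the implementation used in this study.

```python
import numpy as np

def ihs_fuse(rgb, pan):
    """Fast IHS fusion: substitute the intensity of a 3-band image with a
    histogram-matched high-resolution band. `rgb` has shape (3, H, W) and
    `pan` has shape (H, W); both must be co-registered to the same grid."""
    rgb = rgb.astype(float)
    intensity = rgb.mean(axis=0)                 # I = (R + G + B) / 3
    # Match pan to the intensity band with a mean/std stretch so the
    # substitution does not shift the overall brightness.
    pan = (pan - pan.mean()) / pan.std() * intensity.std() + intensity.mean()
    # Additive form: equivalent to IHS substitution for this definition of I.
    return rgb + (pan - intensity)[None, :, :]
```

Substituting the matched high-resolution band injects its spatial detail into all three bands at once, which is why IHS methods are fast but can distort colour when the new intensity differs spectrally from the original.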
There are several studies comparing these techniques. The
IHS, PCA and Brovey transform methods were compared with
the wavelet method to achieve the best spectral and spatial
quality (Zhou et al., 1998), and a combined IHS and wavelet
fusion method was applied to IKONOS and QuickBird images
to decrease colour distortions (Zhang and Hong, 2005). The
influence of image fusion on land cover mapping was analysed
by Colditz et al. (2006) using Brovey transformation,
hue-saturation-value (HSV), principal component analysis
(PCA), multiresolution image fusion (MMT), adaptive image
fusion (AIF) and wavelet transformation methods. A
multiresolution decomposition method based on the "à trous"
algorithm was used to improve image classification (Teggi et
al., 2003). Shi et al. (2005) studied multi-band wavelet-based
image fusion and the corresponding quality assessment. The
mean value was used to represent the average intensity of an
image, while the standard deviation, information entropy and
profile intensity curves were used for assessing the details of
fused images. Bias and correlation coefficient analyses were
used to measure the distortion between the original and fused
images in terms of spectral information. The other best-known
measures of fused image quality are the Universal Image
Quality Index (UIQI) (Wang, 2002), and