In: Wagner W., Székely, B. (eds.): ISPRS TC VII Symposium - 100 Years ISPRS, Vienna, Austria, July 5-7, 2010, IAPRS, Vol. XXXVIII, Part 7B
QUALITY ASSESSMENT OF IMAGE FUSION TECHNIQUES FOR MULTISENSOR 
HIGH RESOLUTION SATELLITE IMAGES (CASE STUDY: IRS-P5 AND IRS-P6 
SATELLITE IMAGES) 
M. Fallah Yakhdani, A. Azizi 
Centre of Excellence for Natural Disaster Management, Department of Geomatics Engineering, 
College of Engineering, University of Tehran, Iran - (mfallah84@gmail.com, aazizi@ut.ac.ir) 
Commission VII, WG VII/6 
KEY WORDS: Fusion, IRS, Multisensor, Spatial, Spectral, Evaluation 
ABSTRACT: 
This paper concentrates on the evaluation of image fusion techniques applied to IRS-P5 and IRS-P6 satellite images. The study area is chosen to cover different terrain morphologies. A good fusion scheme should preserve the spectral characteristics of the source multispectral image as well as the high spatial resolution characteristics of the source panchromatic image. In order to find the fusion algorithm best suited for the P5 and P6 images, five fusion algorithms, namely the Standard IHS, Modified IHS, PCA, Brovey and wavelet algorithms, have been employed and analyzed. Eight evaluation criteria are also used for the quantitative assessment of the fusion performance. The spectral quality of the fused images is evaluated by the spectral discrepancy, the Correlation Coefficient (CC), the RMSE and the Mean Per Pixel Deviation (MPPD). For the spatial quality assessment, the entropy, edge detection, high pass filtering and Average Gradient (AG) are applied and the results are analyzed. The analysis indicates that the Modified IHS fusion scheme has the best definition as well as spectral fidelity, and performs better with regard to the absorption of high textural information. Therefore, as far as the study area is concerned, it is the method best suited for IRS-P5 and P6 image fusion.
1. INTRODUCTION 
Due to physical constraints, there is a trade-off between the spatial resolution and the spectral resolution of a high resolution satellite sensor (Aiazzi et al., 2002), i.e., the panchromatic image has a high spatial resolution at the cost of low spectral resolution, and the multispectral image has high spectral resolution with a low spatial resolution (IKONOS: panchromatic image, 1m, multispectral image, 4m; QuickBird: panchromatic image, 0.62m, multispectral image, 2.48m). To resolve this dilemma, the fusion of multispectral and panchromatic images, with complementary spectral and spatial characteristics, has become a promising technique to obtain images with high spatial and spectral resolution simultaneously (Gonzalez-Audicana et al., 2004). Image fusion is widely used to integrate these types of data for their full exploitation, because fused images may provide increased interpretation capabilities and more reliable results, since data with different characteristics are combined. Images varying in spectral, spatial and temporal resolution may give a more comprehensive view of the observed objects (Pohl and Genderen, 1998).
2. IMAGE FUSION ALGORITHMS 
Many methods that produce good quality merged images have been developed in the last few years. The existing image fusion techniques can be grouped into four classes: (1) color related techniques such as intensity-hue-saturation (IHS); (2) statistical/numerical methods such as principal component analysis (PCA), high pass filtering (HPF), the Brovey transform (BT) and regression variable substitution (RVS); (3) pyramid-based methods such as the Laplacian pyramid, contrast pyramid, gradient pyramid, morphological pyramid and wavelet methods; and (4) hybrid methods that combine methods from more than one group, such as the integrated IHS and wavelet method. This study analyzes five current image fusion techniques to assess their performance. The five image fusion methods used are the Standard IHS, Modified IHS, PCA, Brovey and wavelet algorithms.
IHS (Intensity-Hue-Saturation) is the most common image fusion technique for remote sensing applications and is used in commercial pan-sharpening software. This technique converts a color image from the RGB space to the IHS color space, where the I (intensity) component is replaced by the panchromatic image. Before fusing the images, the multispectral and panchromatic images are histogram matched.
Ideally the fused image would have a higher resolution and 
sharper edges than the original color image without additional 
changes to the spectral data. However, because the 
panchromatic image was not created from the same wavelengths 
of light as the RGB image, this technique produces a fused image with some color distortion relative to the original multispectral image (Choi et al., 2008). There have been various modifications to the IHS method in an attempt to fix this problem (Choi et al., 2008; Strait et al., 2008; Tu et al., 2004; Siddiqui, 2003). In this research, the modification method suggested by Siddiqui (2003) is used.
The Principal Component Analysis (PCA) is a statistical technique that transforms a multivariate dataset of correlated variables into a dataset of new, uncorrelated linear combinations of the original variables (Pohl and Genderen, 1998). It is assumed that the first PC image, which has the highest variance, contains most of the information of the original image and is therefore the ideal component to be replaced by the high spatial resolution panchromatic image. All the other components are left unaltered. An inverse PCA transform is then performed on the modified set of components to obtain a high-resolution pan-sharpened image.
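A minimal sketch of this principal component substitution, under the same assumptions as above (co-registered multispectral bands resampled to the panchromatic grid, histogram matching of the pan band to the first component approximated by matching mean and standard deviation), is given below; the names used are illustrative only.

    import numpy as np

    def pca_fuse(ms, pan):
        # PCA (principal component substitution) fusion sketch.
        # ms:  float array (H, W, B), multispectral bands resampled to the pan grid.
        # pan: float array (H, W), panchromatic band (hypothetical inputs for illustration).
        h, w, b = ms.shape
        x = ms.reshape(-1, b)

        # Forward principal component transform of the multispectral bands.
        mean = x.mean(axis=0)
        xc = x - mean
        eigvals, eigvecs = np.linalg.eigh(np.cov(xc, rowvar=False))
        order = np.argsort(eigvals)[::-1]      # sort components by decreasing variance
        eigvecs = eigvecs[:, order]
        pcs = xc @ eigvecs

        # Approximate histogram matching of the pan band to the first component.
        pc1 = pcs[:, 0]
        p = pan.reshape(-1)
        p_matched = (p - p.mean()) / (p.std() + 1e-12) * pc1.std() + pc1.mean()

        # Substitute PC1 with the matched pan band and apply the inverse transform.
        pcs[:, 0] = p_matched
        fused = pcs @ eigvecs.T + mean
        return fused.reshape(h, w, b)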
	        