Proceedings, XXth Congress (Part 4), Istanbul 2004
A STUDY OF IMAGE FUSION TECHNIQUES IN REMOTE SENSING 
M. Hahn *, F. Samadzadegan

Dept. of Geomatics, Computer Science and Mathematics, Stuttgart University of Applied Sciences, Stuttgart, Germany - michael.hahn@hft-stuttgart.de
Dept. of Surveying and Geomatics, Faculty of Engineering, University of Tehran, Tehran, Iran - samadz@ut.ac.ir
Commission IV, WG IV/7 
KEY WORDS: Image Fusion, Multi Resolution, Remote Sensing, Correlation, Quality 
ABSTRACT: 
The amount and variety of remote sensing imagery of varying spatial resolution is continuously increasing and techniques for 
merging images of different spatial and spectral resolution have become widely accepted in practice. This practice, known as data fusion, 
is designed to enhance the spatial resolution of multispectral images by merging a relatively coarse-resolution image with a higher 
resolution panchromatic image taken of the same geographic area. This study examines fused images and their ability to preserve the 
spectral and spatial integrity of the original image. The mathematical formulation of ten data fusion techniques is worked out in this 
paper. Included are colour transformations, wavelet techniques, gradient and Laplacian based techniques, contrast and morphological 
techniques, feature selection and simple averaging procedures. Most of these techniques employ hierarchical image decomposition 
for fusion. 
IRS-1C and ASTER images are used for the experimental investigations. The panchromatic IRS-1C image has around 5m pixel size, 
the multispectral ASTER images are at a 15m resolution level. For the fusion experiments the three nadir looking ASTER bands in 
the visible and near infrared are chosen. The concept for evaluating the fusion methods is based on the idea to use a reduced 
resolution of the IRS-1C image data at 15m resolution and of the ASTER images at 45m resolution. This maintains the resolution 
ratio between IRS and ASTER and allows comparing the image fusion result at the 15m resolution level with the original ASTER 
images. This statistical comparison reveals differences between all considered fusion concepts. 
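The evaluation concept can be sketched in code. The snippet below is a minimal illustration, not the paper's implementation: it uses block averaging to degrade resolution, plain averaging of the panchromatic and upsampled multispectral band as a stand-in for the ten fusion techniques, and the correlation coefficient as one possible quality statistic. All array sizes and image data are synthetic assumptions.

```python
import numpy as np

def block_reduce(img, factor):
    """Degrade resolution by block averaging (e.g. 5 m -> 15 m for factor 3)."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor
    return img[:h, :w].reshape(h // factor, factor,
                               w // factor, factor).mean(axis=(1, 3))

def upsample(img, factor):
    """Nearest-neighbour upsampling back to the panchromatic grid."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def fuse_average(pan, ms_band, factor):
    """Simple averaging fusion: mean of pan and the upsampled MS band."""
    return 0.5 * (pan + upsample(ms_band, factor))

def correlation(a, b):
    """Pearson correlation between a fused result and a reference band."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

# Synthetic stand-ins for the reduced-resolution panchromatic image (15 m)
# and one degraded multispectral band (45 m); ratio 3:1 as in the paper.
rng = np.random.default_rng(0)
pan = rng.random((90, 90))           # 15 m panchromatic surrogate
ms = block_reduce(pan, 3)            # 45 m multispectral surrogate
fused = fuse_average(pan, ms, 3)     # fusion result at the 15 m level
print(correlation(fused, pan))
```

With real data, the degraded 15m IRS-1C image and the 45m ASTER bands would replace the synthetic arrays, and the fused result would be compared band by band against the original 15m ASTER images using such per-band statistics.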
INTRODUCTION 
Every year the number of airborne and spaceborne data acquisition missions grows, producing more and more image data about the Earth's surface. The imagery is recorded with varying resolution, and merging images of different spatial and spectral resolution has become a widely applied procedure in remote sensing. Many fusion techniques have been proposed for fusing spectral with high spatial resolution image data in order to increase the spatial resolution of the multispectral images (Carper et al., 1990; Chavez et al., 1991; Kathleen and Philip, 1994; Wald, 2002).

Data fusion as defined by Wald (2004) is a "formal framework in which are expressed the means and tools for the alliance of data originating from different sources. It aims at obtaining information of greater quality; the exact definition of 'greater quality' will depend upon the application. This approach is based upon the synergy offered by the various sources." Focussed on the output of airborne and spaceborne sensors, i.e. recorded images, image fusion or image data fusion is concerned with methods for the fusion of images. The emphasis in this paper is put on images taken by different sensors with different spatial resolutions. The goal is either to visualize the original sets of images with improved expressiveness regarding their inherently available information, or to produce a new product of synthesized images with a better spatial resolution. The motivation of users of image fusion techniques often comprises both aims.

Image fusion on the pixel level is sometimes also called pixel or signal fusion. If image fusion is related to the feature representation level of images it is also called feature fusion. Object or decision fusion deals with the high level representation of images by objects. The meaning of the terms feature and object in image processing (e.g. Gonzalez and Woods, 2002) and remote sensing (e.g. Wald, 2004) is still quite different from their use in photogrammetry (e.g. Schenk, 2003) and computer vision (e.g. Haralick and Shapiro, 1992). As a consequence the features in photogrammetry, in particular linear features extracted by edge detection schemes and areal features based e.g. on a texture segmentation scheme, lead to an image description which is closer to an object description in image processing or pattern recognition than to a feature description. Image classification performed on a multispectral image may take, in addition to the spectral data, textural features and other feature image descriptions into account. At this point the difference between the different uses of the terms becomes very obvious.

Data fusion in its interrelationship with image analysis and GIS was reviewed in Baltsavias and Hahn (1999). Fusion of data recorded by different sensors has been put into context with data in GIS databases. Quite long is the list of fusion related problems that have been worked out in the above quoted paper. The discrepancy between scene representations given by imagery and given by corresponding maps or GIS (vector) data sets links fusion concepts to topics of image analysis with the 
* Corresponding author.