COMPARISON OF PANSHARPENING ALGORITHMS FOR COMBINING RADAR AND
MULTISPECTRAL DATA
S. Klonus a
a Institute for Geoinformatics and Remote Sensing, University of Osnabrück, 49084 Osnabrück, Germany -
sklonus@igf.uni-osnabrueck.de
Youth Forum
KEY WORDS: Image Processing, Sharpening, Image Understanding, Fusion, Environmental Monitoring, Radar, Colour,
Multisensor
ABSTRACT:
Iconic image fusion is a technique that is used to combine the spatial structure of a high resolution panchromatic image with the
spectral information of a lower resolution multispectral image to produce a high resolution multispectral image. This process is often
referred to as pansharpening. In this study, image data of the new RADAR satellite TerraSAR-X are used to sharpen optical
multispectral data. To produce these images, the Ehlers fusion is used, a fusion technique developed to preserve a maximum of
spectral information. The Ehlers fusion is modified to integrate radar data with optical data. The results of the modified Ehlers
fusion are compared with those of standard fusion techniques such as Brovey and principal component fusion, and with more
recently developed fusion techniques such as Gram-Schmidt, UNB, wavelet-based fusion and CN spectral sharpening. The evaluation is
based on the verification of the preservation of spectral characteristics and the improvement of the spatial resolution. The results
show that most of the fusion methods are not capable of integrating TerraSAR-X data into multispectral data without color distortions.
This result is confirmed by statistical analysis.
KURZFASSUNG:
Iconic image fusion is a technique for combining the spatial structure of high resolution panchromatic image data with the
spectral information of a lower resolution multispectral image in order to obtain a high resolution multispectral image. This
process is also referred to as "pansharpening". In this study, image data of the new RADAR satellite TerraSAR-X are used to
improve the geometric resolution of optical multispectral data. To produce these images, the Ehlers fusion is used, a fusion
method that was developed specifically for the best possible preservation of the spectral information. The Ehlers fusion was
modified to integrate RADAR data into optical data. The results of the modified Ehlers fusion were compared with standard
fusion techniques such as the Brovey transformation or the principal component method, and also with more recent, further
developed fusion methods such as Gram-Schmidt, UNB, wavelet-based fusion and color-normalized spectral sharpening. The
evaluation of the results is based on examining the preservation of the spectral characteristics and the improvement of the
geometric resolution. The results show that the fusion methods largely fail to integrate the TerraSAR-X data into the
multispectral data without color changes. The quantitative statistical results confirm this statement.
1. INTRODUCTION
Image fusion is a technique that is used to combine the spatial
structure of a high resolution panchromatic image with the
spectral information of a lower resolution multispectral image
to produce a high resolution multispectral image. This process
is often referred to as pansharpening.
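To make the principle concrete, the following minimal sketch (in Python with NumPy; an illustration only, not the method
evaluated in this paper) shows a simple Brovey-type pansharpening, in which each multispectral band is rescaled by the ratio
of the high resolution band to the intensity of the multispectral image. The array names and the assumption that the
multispectral bands have already been resampled to the pixel size of the high resolution band are illustrative assumptions.

import numpy as np

def brovey_pansharpen(ms, pan, eps=1e-6):
    # ms  : array of shape (bands, rows, cols), multispectral bands already
    #       resampled to the pixel size of the high resolution band
    # pan : array of shape (rows, cols), high resolution panchromatic
    #       (or SAR intensity) band
    ms = ms.astype(np.float64)
    pan = pan.astype(np.float64)
    # intensity of the multispectral image, here simply the band mean
    intensity = ms.mean(axis=0) + eps
    # inject spatial detail by scaling each band with the ratio of the
    # high resolution band to the multispectral intensity
    return ms * (pan / intensity)[np.newaxis, :, :]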
In this study, image data of the new RADAR satellite
TerraSAR-X are used to sharpen optical multispectral data.
TerraSAR-X is the first non-military RADAR satellite which
provides data with a ground resolution of 1 m. The ability
to acquire images independently of solar illumination
and of weather conditions such as cloud coverage allows
measurements at any time of day or night. Fusion with
multispectral image data from other dates can make it possible
to produce higher resolution color images, even under clouded
skies or in adverse weather conditions. These enhanced images
can be provided to rescue staff in areas affected by disasters
such as earthquakes, tsunamis or flooding. With this
information, it is easier for rescue forces to identify, for
example, the most affected areas, the extent and degree of
damage, and site accessibility.
Many other publications have already focused on how to fuse
high resolution panchromatic images with lower resolution
multispectral data to obtain high resolution multispectral
imagery while retaining the spectral characteristics of the
multispectral data (see, for example, Welch and Ehlers 1987 or
Gonzalez-Audicana et al. 2006). Fewer publications focus on
the use of SAR data for fusion. Ehlers (1991) showed that
fused SIR-B and Landsat TM data improved the quality of
vegetation mapping. Riccietti (2001) used a SAR image as the
panchromatic input for image fusion with optical data, fusing it
with Landsat TM data. Chibani (2006) used SPOT panchromatic
and SAR data and integrated this information into multispectral
SPOT data.