A FUSION METHOD OF SAR
AND OPTICAL IMAGES FOR URBAN OBJECT EXTRACTION
Jia Yonghong a,b,c , Rick S. Blum c
a School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China - yhjia2000@sina.com
b State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University,
Wuhan, China - yhjia2000@sina.com
c Electrical and Computer Engineering Dept, Lehigh University, Bethlehem, PA USA - rblum@eecs.lehigh.edu
Commission VII, WG VII/6
KEY WORDS: Satellite remote sensing, Sharpening, Image understanding, Fusion, Land cover.
ABSTRACT:
A new image fusion method for SAR, panchromatic (Pan) and multispectral (MS) data is proposed. First, SAR texture is
extracted by ratioing the despeckled SAR image to its low pass approximation, and is used to modulate the high pass details extracted
from the available Pan image by means of the a trous wavelet decomposition. Then, the high pass details modulated with the texture
are used to obtain the fusion product with the high pass filter based modulation (HPFM) fusion method. A set of co-registered
Landsat TM, ENVISAT SAR and SPOT Pan image data is used for the experiment. The results demonstrate accurate spectral
preservation on vegetated regions, bare soil, and also on textured areas (buildings and road network) where SAR texture information
enhances the fusion product, and show that the proposed approach is effective for image interpretation and classification.
1. INTRODUCTION
Image fusion is capable of integrating different imagery data
to create more information than is available from a single sensor, and it
has received tremendous attention in the remote sensing
literature. Many image fusion algorithms and software tools
have been developed, such as IHS (Intensity, Hue,
Saturation), PCA (Principal Components Analysis), SVR
(Synthetic Variable Ratio) and wavelet based fusion[1].
However, these algorithms are not effective for the
fusion of SAR and optical images. In an urban area,
many land cover types/surface materials are spectrally similar.
This makes it extremely difficult to analyze an urban scene
using a single sensor[2][3]. Some of these features can be
discriminated in a radar image based on their dielectric
properties and surface roughness. The objective of our study is
to present a novel image fusion method for SAR, panchromatic
(Pan) and multispectral (MS) data for urban object extraction.
SAR texture is extracted by ratioing the despeckled SAR image
to its low pass approximation, and is used to modulate the high pass
details extracted from the available Pan image by means of the
a trous wavelet decomposition. The high pass details modulated
with the SAR texture are then combined by HPFM (High Pass Filter
based Modulation) to obtain the fusion product. The proposed
fusion method is introduced in the following sections.
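The texture-extraction step described above can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the median filter stands in for an unspecified despeckling filter, and the uniform (box) filter and window sizes are illustrative assumptions for the low pass approximation.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def sar_texture(sar, despeckle_size=3, lowpass_size=15, eps=1e-6):
    """Extract SAR texture as the ratio of the despeckled SAR image to
    its low pass approximation.

    The ratio is close to 1 over homogeneous areas and deviates from 1
    over textured areas (e.g. buildings and road networks), so it can be
    used to modulate high pass details from the Pan image.
    """
    # Despeckle with a simple median filter (illustrative choice).
    despeckled = median_filter(sar.astype(float), size=despeckle_size)
    # Low pass approximation with a box filter (illustrative choice).
    lowpass = uniform_filter(despeckled, size=lowpass_size)
    # Ratio; eps guards against division by zero in dark regions.
    return despeckled / (lowpass + eps)
```

Any despeckling filter (e.g. Lee or Frost) and any low pass kernel could be substituted without changing the structure of the step.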
2. METHODOLOGY
2.1 A trous wavelet
The wavelet transform produces images at different resolutions.
The wavelet representation refers to both the spatial and frequency
domains, and it localizes image content well in both[4].
There are different approaches to wavelet decomposition.
One of them is the Mallat algorithm, which can use wavelet
functions such as the Daubechies functions. Here we use the a trous
algorithm, which uses a dyadic wavelet to merge non-dyadic data
in a simple and efficient procedure. In this algorithm the
discrete wavelet transform is computed by successive
convolutions with a filter, applying the convolution to the image
directly. At each step we obtain a smoothed
version of the image, I_1, I_2, ..., I_n. The wavelet coefficients are
defined as follows:
wc_l = I_{l-1} - I_l,    l = 1, 2, ..., n    (1)
If we decompose an image I into wavelet coefficients, then we
can write

I = Σ_{l=1}^{n} wc_l + I_n    (2)

where I_n is the residual low pass approximation.
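The decomposition in Eqs. (1) and (2) can be sketched as follows. This is a minimal illustration assuming the common B3 cubic spline kernel, whose taps are spread apart by 2^(l-1) at level l ("with holes"); the paper does not state which smoothing filter it uses.

```python
import numpy as np
from scipy.ndimage import convolve

def a_trous_decompose(image, n_levels):
    """A trous (undecimated) wavelet decomposition.

    At each level l the current approximation I_{l-1} is smoothed to
    give I_l; the wavelet plane is wc_l = I_{l-1} - I_l (Eq. 1), and the
    image is recovered as I = sum(wc_l) + I_n (Eq. 2).
    """
    # 1-D B3 spline filter; the 2-D kernel is its outer product.
    h = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    planes = []
    current = image.astype(float)
    for level in range(n_levels):
        step = 2 ** level
        # Insert "holes" (zeros) between the filter taps at each level.
        h_holes = np.zeros(len(h) + (len(h) - 1) * (step - 1))
        h_holes[::step] = h
        kernel = np.outer(h_holes, h_holes)
        smoothed = convolve(current, kernel, mode='mirror')
        planes.append(current - smoothed)  # wavelet coefficients wc_l
        current = smoothed                 # approximation I_l
    return planes, current                 # (wc_1..wc_n, residual I_n)
```

Because the wavelet planes telescope, summing all planes and the residual reconstructs the input exactly, regardless of the smoothing filter chosen.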