image, with a comprehensive consideration of both the subjective and objective evaluation methods.
2.3 Structural Similarity
When observing an image, what the human eye actually extracts is not the error between image pixels but the image structure information, and the human visual system can adaptively extract this structure information from the image background. Many researchers have pointed out that structural distortion is the most important factor in image quality assessment. This viewpoint has given a new direction to current image quality assessment research and has already produced substantial results (Wang, 2002; Xydeas, 2000; Di, 2006). This paper adopts the following formulation to describe the structural similarity between the original and fusion images.
Suppose A and B denote the original image and the fusion image respectively; then the structural similarity (SS) between them is defined as:
SS_{AB} = L_{AB} \cdot C_{AB} \cdot S_{AB} = \frac{2\mu_A\mu_B}{\mu_A^2 + \mu_B^2} \cdot \frac{2\sigma_A\sigma_B}{\sigma_A^2 + \sigma_B^2} \cdot \frac{\sigma_{AB}}{\sigma_A\sigma_B}    (1)
In the above formula, μ_A and μ_B are the mean values of the two image windows; σ_A² and σ_B² are the variances of the image windows; σ_AB is the covariance of the A and B window data; and L_AB, C_AB and S_AB are the comparison terms for lightness, contrast and structure between the two images, with values between 0 and 1.
The structural similarity value lies in the interval [-1, 1]; the closer the value is to 1, the higher the quality of the fusion image. Because the index combines image structure information with the characteristics of the human visual system, it evaluates fusion image quality more effectively than any single subjective or objective index used separately.
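As an illustration, a minimal sketch of Eq. (1) is given below, assuming the two windows are NumPy arrays of equal size; the function and variable names are ours, not from the paper, and no stabilizing constants are added to the denominators.

```python
import numpy as np

def structural_similarity(win_a: np.ndarray, win_b: np.ndarray) -> float:
    """Sketch of Eq. (1): SS_AB = L_AB * C_AB * S_AB for two image windows."""
    a = win_a.astype(np.float64).ravel()
    b = win_b.astype(np.float64).ravel()

    mu_a, mu_b = a.mean(), b.mean()            # window means
    var_a, var_b = a.var(), b.var()            # window variances
    sig_a, sig_b = np.sqrt(var_a), np.sqrt(var_b)
    cov_ab = ((a - mu_a) * (b - mu_b)).mean()  # window covariance

    luminance = 2.0 * mu_a * mu_b / (mu_a**2 + mu_b**2)  # L_AB
    contrast = 2.0 * sig_a * sig_b / (var_a + var_b)     # C_AB
    structure = cov_ab / (sig_a * sig_b)                  # S_AB
    return luminance * contrast * structure
```

For example, structural_similarity(original_window, fused_window) returns a value close to 1 when the fused window preserves the lightness, contrast and structure of the original window.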
3. THE COMPREHENSIVE EVALUATION MODEL
FOR THE FUSION IMAGE
In the process of fusion image quality evaluation, because the fusion result image is derived from the two original remote sensing data sets, we must consider the relationship among the original images A and B and the fusion image F when constructing the corresponding evaluation index. In this paper we adopt the evaluation function E(A, B, F) to assess the fusion image quality comprehensively (Hu, 2004), as follows:
E(A,B,F) = \lambda_A SS_{AF} + \lambda_B SS_{BF}    (2)

\lambda_A = \frac{s_A(\omega)}{s_A(\omega) + s_B(\omega)}, \qquad \lambda_B = 1 - \lambda_A    (3)
where λ_A and λ_B are the SS weights for the original images A and B respectively, and s_A(ω) and s_B(ω) are the variances of the original images A and B. The above formula can comprehensively evaluate the structural similarity between the fusion image and the original images. However, in view of the characteristics of high-resolution airborne SAR and SPOT5 images, the human visual system perceives different ground object types at different levels, especially trees, buildings, water, roads, land cover and mountain shadows. Considering this, we put forward a comprehensive fusion quality (CFQ) evaluation model based on the interpretation characteristics of the different ground object types, formulated as:
CFQ = \sum_{k=1}^{K} \theta_k E_k(A,B,F) = \sum_{k=1}^{K} \theta_k \left[ \lambda_A SS_{AF}(k) + \lambda_B SS_{BF}(k) \right]    (4)
where K is the total number of ground object types considered in the fusion result image, and θ_k is the corresponding weight for each ground object type.
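As a minimal sketch of Eqs. (2)-(4), assuming the per-type SS values and window variances are already available (the function names and argument layout are our own illustrative choices, not from the paper):

```python
import numpy as np

def variance_weights(var_a: float, var_b: float) -> tuple[float, float]:
    """Eq. (3): lambda_A and lambda_B from the variances s_A(w), s_B(w) of the original images."""
    lam_a = var_a / (var_a + var_b)
    return lam_a, 1.0 - lam_a

def evaluation_index(ss_af: float, ss_bf: float, lam_a: float, lam_b: float) -> float:
    """Eq. (2): E(A, B, F) = lambda_A * SS_AF + lambda_B * SS_BF."""
    return lam_a * ss_af + lam_b * ss_bf

def comprehensive_fusion_quality(e_per_type, theta_per_type) -> float:
    """Eq. (4): CFQ = sum over the K ground object types of theta_k * E_k(A, B, F)."""
    return float(np.dot(np.asarray(theta_per_type, dtype=float),
                        np.asarray(e_per_type, dtype=float)))
```

In this sketch the θ_k weights are supplied by the caller; the paper ties them to the interpretation importance of each ground object type rather than fixing specific values in this section.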
4. TEST AND RESULT ANALYSIS
In this paper, we take high-resolution airborne SAR images, with a resolution of 1 meter, and 10-meter resolution multi-spectral SPOT5 images as the test dataset. Bands 4, 2 and 1 of the SPOT5 data are combined into a pseudo-colour image. The fusion image of the airborne SAR and SPOT5 data is shown in Figure 1.
Figure 1. The fusion result image from airborne SAR and SPOT5
During the data processing, we chose 5 types of ground objects that are frequently of interest in practical applications: buildings, trees, water, roads, and land use. For each type we took 3 samples in the original and fusion result images, using the ERDAS software to obtain the statistical mean and variance for each sample window. In this way we can discuss the fusion effect for each type of ground object separately and obtain the final fusion quality evaluation with the CFQ model described above. The SS values and test results for each type are shown in Table 1:
Table 1. SS, E and CFQ values for each ground object type

type        image    SS        E         CFQ
Building    SAR      0.9389    0.7338    0.804
            SPOT5    0.926
trees       SAR      0.9328    0.7777
            SPOT5    0.826
Land use    SAR      0.8689
            SPOT5    0.896
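To make the sampling procedure concrete, the sketch below (our own illustrative code; the array names, window layout and θ weights are hypothetical, not taken from the paper) computes per-window statistics, a per-type index E_k from Eqs. (2) and (3) averaged over the three samples of each type, and finally the CFQ of Eq. (4).

```python
import numpy as np

def window_ss(a: np.ndarray, b: np.ndarray) -> float:
    """Structural similarity of Eq. (1) for one pair of co-registered windows."""
    a, b = a.astype(float).ravel(), b.astype(float).ravel()
    mu_a, mu_b = a.mean(), b.mean()
    sig_a, sig_b = a.std(), b.std()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return (2 * mu_a * mu_b / (mu_a**2 + mu_b**2)) \
         * (2 * sig_a * sig_b / (sig_a**2 + sig_b**2)) \
         * (cov / (sig_a * sig_b))

def evaluate_cfq(sar, spot5, fused, samples, theta):
    """samples: {object type: [(row, col, size), ...]} with three windows per type;
    theta: weights theta_k in the same order as the types in `samples`."""
    e_per_type = []
    for obj_type, windows in samples.items():
        e_samples = []
        for r, c, s in windows:
            win_a, win_b, win_f = (img[r:r + s, c:c + s] for img in (sar, spot5, fused))
            lam_a = win_a.var() / (win_a.var() + win_b.var())          # Eq. (3)
            e_samples.append(lam_a * window_ss(win_a, win_f)
                             + (1 - lam_a) * window_ss(win_b, win_f))  # Eq. (2)
        e_per_type.append(np.mean(e_samples))   # average over the 3 sample windows
    return float(np.dot(theta, e_per_type))     # Eq. (4)
```

Averaging E over the three sample windows per type is our reading of how the samples are combined; the per-window means and variances play the role of the statistics obtained from ERDAS in the test described above.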