The population is composed of the particles. Since the particles share information during the search, the method has a fast convergence speed and is particularly suitable for this problem. In this article, we use a multi-objective optimization algorithm ("AMO") to search the Pareto optimal weights. A mutation operator is applied, and a crowding distance over the archive of nondominated solutions is used to maintain the population diversity; an inertia weight is used to balance global and local search.

The procedure is as follows: first, initialize the population and the parameters; then evaluate each particle pop[i], where i indexes the particles in the population, store the record of each particle, and insert the nondominated solutions that represent the current Pareto front into the external repository REP according to the crowding distance. Until the maximum number of cycles is reached, perform the velocity and position update as below:

vel[i] = W · vel[i] + c1 · r1 · (pbest[i] − pop[i]) + c2 · r2 · (REP[h] − pop[i])
pop[i] = pop[i] + vel[i]

where c1 and c2 are the learning factors, r1 and r2 are random values in the range [0, 1], W is the inertia weight (Wmin is 0.2), pbest[i] is the best position that particle i has found so far, and REP[h] is a member of the repository selected from the region with the maximum crowding distance, where few particles locate, in order to maintain the population diversity. Update the new particles produced from the update and keep them inside the decision space: in case a decision variable goes beyond its boundary, it takes the value of the boundary and its velocity is multiplied by -1. Evaluate the particles in the POP, update the best position each particle keeps in its memory, and insert all the current nondominated particles in the POP into REP. When the search terminates, REP contains the Pareto optimal weights at each decomposition level, listed in Table 1. All approaches are evaluated with the quality measures introduced in the next section.
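As a minimal sketch of this update step in Python (not the authors' exact implementation: the parameter defaults, the bounds, the function name, and the use of NumPy arrays are our assumptions), the velocity update and boundary handling could look like:

import numpy as np

rng = np.random.default_rng(0)

def update_particle(pos, vel, pbest, rep_h, w=0.4, c1=2.0, c2=2.0,
                    lower=0.0, upper=1.0):
    """One velocity/position update with the boundary handling described
    above: a decision variable that leaves the search space takes the
    boundary value and its velocity component is multiplied by -1."""
    r1 = rng.random(pos.shape)            # random values in [0, 1]
    r2 = rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (rep_h - pos)
    pos = pos + vel
    out = (pos < lower) | (pos > upper)   # variables outside the space
    pos = np.clip(pos, lower, upper)      # take the boundary value
    vel = np.where(out, -vel, vel)        # reverse the flight direction
    return pos, vel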
3. QUALITY ASSESSMENT MEASURES
Since the ideal fused image is always unknown, designing objective metrics to assess the quality a fusion algorithm would produce is a very difficult task, but such metrics are highly desired. Among the limited number of methods that have been proposed in the literature for image fusion quality assessment without an ideal image, most are not very suitable [15, 16]. Some researchers assess the results using subjective tests [17]. However, although subjective tests can be accurate if performed correctly, they are inconvenient, expensive, and time consuming; moreover, they cannot be used to adjust system parameters continually in real time. Only a few objective metrics in the literature do not require the availability of an ideal image.
In this article, we present representative and some new quality metrics for the experiments; some useful conclusions can be drawn by comparing them. One type of metric uses only the fused image: standard deviation (SD) and entropy (EN). The other type utilizes features of both the fused and the source images: cross entropy (CE), mutual information (MI), and the universal index (UI) [18].
3.1 Standard Deviation (SD)
As is well known, SD provides contrast information. For a fused image of size N × M, the standard deviation is estimated by

SD = \sqrt{\frac{1}{N \times M} \sum_{i=1}^{N} \sum_{j=1}^{M} \left( C(i,j) - \mu \right)^{2}}

where C(i, j) is the (i, j)-th pixel intensity value and \mu is the sample mean of all pixel values of the image. It is known that
SD is composed of two parts, the signal part and the noise part.
This measurement will be more efficient in the absence of noise.
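A small sketch of this estimate for a grayscale image stored as a NumPy array (the function name is ours):

import numpy as np

def standard_deviation(fused):
    """Contrast measure of Section 3.1: root mean squared deviation
    of the pixel intensities C(i, j) from the sample mean."""
    c = np.asarray(fused, dtype=np.float64)
    mu = c.mean()                         # sample mean of all pixels
    return float(np.sqrt(np.mean((c - mu) ** 2)))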
3.2 Entropy (EN)
Entropy is an index to evaluate the information quantity contained in an image and has often been used to measure its information content. Entropy is defined as

EN = -\sum_{i=0}^{L-1} p_i \log_2 p_i

where L is the total number of grey levels and p = \{p_0, p_1, \ldots, p_{L-1}\} is the probability distribution of each level.
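A sketch of this computation for an 8-bit grayscale image (L = 256 and the function name are our assumptions):

import numpy as np

def entropy(image, levels=256):
    """Shannon entropy of the grey-level distribution (Section 3.2)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()                 # probability of each grey level
    p = p[p > 0]                          # 0 * log2(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))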
3.3 Cross Entropy (CE)
Given the source images A and B and the fused image F, the overall cross entropy is defined as

CE = \frac{CE(A,F) + CE(B,F)}{2}

where CE(A, F) (respectively CE(B, F)) is the cross entropy between the grey-level distributions of source image A (B) and the fused image F, and p_A denotes the distribution of image A:

CE(A,F) = \sum_{l=0}^{L-1} p_A(l) \log_2 \frac{p_A(l)}{p_F(l)}

CE(B,F) = \sum_{l=0}^{L-1} p_B(l) \log_2 \frac{p_B(l)}{p_F(l)}
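A minimal sketch of this measure, assuming 8-bit grayscale NumPy arrays; the function names and the eps guard against empty fused-image bins are our additions:

import numpy as np

def grey_distribution(image, levels=256):
    """Normalized grey-level histogram p(l)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    return hist / hist.sum()

def cross_entropy(source, fused, levels=256, eps=1e-12):
    """CE(source, fused) between the two grey-level distributions."""
    p = grey_distribution(source, levels)
    q = grey_distribution(fused, levels)
    nz = p > 0                            # terms with p(l) = 0 contribute nothing
    return float(np.sum(p[nz] * np.log2(p[nz] / (q[nz] + eps))))

def overall_cross_entropy(a, b, fused):
    """CE = (CE(A, F) + CE(B, F)) / 2."""
    return 0.5 * (cross_entropy(a, fused) + cross_entropy(b, fused))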
3.4 Mutual Information (MI)
A higher value of this index indicates that the fused image contains a good quantity of the information present in both source images. Define the joint histogram of source image A (B) and the fused image F as p_{FA}(f, a) (p_{FB}(f, b)). The mutual information between a source image and the fused image is [19]

I_{FA}(f,a) = \sum_{f,a} p_{FA}(f,a) \log_2 \frac{p_{FA}(f,a)}{p_F(f)\, p_A(a)}

I_{FB}(f,b) = \sum_{f,b} p_{FB}(f,b) \log_2 \frac{p_{FB}(f,b)}{p_F(f)\, p_B(b)}

Performance is measured by the value of

MI_f = I_{FA}(f,a) + I_{FB}(f,b)

3.5 Universal Index (UI)
This index, based on the SSIM measure [20], gives an indication of how much of the salient information contained in each of the input images has been transferred into the fused image. First calculate SSIM(a, f|\omega) and SSIM(b, f|\omega), the structural similarity measures between each input image and the fused image in a local window \omega. Then a normalized local weight \lambda(\omega) indicates the relative importance of the source images. The index is calculated as

UI = \frac{1}{|W|} \sum_{\omega \in W} \left[ \lambda(\omega)\, SSIM(a, f|\omega) + (1 - \lambda(\omega))\, SSIM(b, f|\omega) \right]

where SSIM is the structural similarity measure of two sequences. Let \mu_x, \sigma_x^2, and \sigma_{xy} be the mean of x, the variance of x, and the covariance of x and y, respectively. Then SSIM is computed as

SSIM = \frac{\sigma_{xy}}{\sigma_x \sigma_y} \cdot \frac{2 \mu_x \mu_y}{\mu_x^2 + \mu_y^2} \cdot \frac{2 \sigma_x \sigma_y}{\sigma_x^2 + \sigma_y^2}
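A sketch of the MI computation via the joint grey-level histogram, assuming 8-bit grayscale NumPy arrays (function names are illustrative):

import numpy as np

def mutual_information(source, fused, levels=256):
    """I_FA: mutual information between a source image and the fused
    image, estimated from their joint grey-level histogram."""
    joint, _, _ = np.histogram2d(fused.ravel(), source.ravel(),
                                 bins=levels,
                                 range=[[0, levels], [0, levels]])
    p_fa = joint / joint.sum()            # joint distribution p_FA(f, a)
    p_f = p_fa.sum(axis=1)                # marginal of the fused image
    p_a = p_fa.sum(axis=0)                # marginal of the source image
    nz = p_fa > 0
    denom = np.outer(p_f, p_a)
    return float(np.sum(p_fa[nz] * np.log2(p_fa[nz] / denom[nz])))

def mi_f(a, b, fused):
    """MI_f = I_FA(f, a) + I_FB(f, b)."""
    return mutual_information(a, fused) + mutual_information(b, fused)

The UI index of Section 3.5 can be evaluated in a similar spirit by computing SSIM over sliding local windows and weighting the two source images by \lambda(\omega).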
4. EXPERIMENTS AND ANALYSIS
We applied the above methodologies and assessment system to fuse SAR and SPOT panchromatic images with 5 m pixel resolution (Figure 1). The experiments compared the different alternatives for each procedure of the generic fusion framework described in Figure 2. It is worth noting that AMO is used in method (b) to search the Pareto optimal weights of the coefficients, and its results are compared with the popular method (a) in Figure 2. The abbreviations used in the paper are described in Table 1. The performance of the method (WA-WBA+NG+AMO+RBV) using different decomposition levels is shown in Table 2. In Table 2, the first column shows the combination of alternatives in Figure 2 for each procedure, and the second column lists the different MSD alternatives for the current procedure. Columns 3-7 show the performance using the criteria introduced in Section 3.
Figure 1. Source images: (a) SAR; (b) SPOT.