The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Vol. XXXVII. Part B7. Beijing 2008
weighting parameters a and b for the G and B bands are 0.75
and 0.25, respectively.
However, since the above two weighting parameters depend
entirely on the IKONOS imagery, they cannot be applied to the
fusion of images from other satellites. Therefore, an additional
control parameter, γ, is suggested in this paper. It is selected so
that the mean value of the difference between the PAN image and
the γ·I' image is a minimum, thereby minimizing the radiance
difference between the two images.
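Since this criterion is a one-dimensional minimization, it can be sketched as follows (a minimal NumPy sketch; the function name, the candidate grid, and writing the control parameter as gamma are our assumptions):

```python
import numpy as np

def select_gamma(pan, intensity, candidates=np.linspace(0.5, 1.5, 101)):
    """Pick the control parameter (written gamma here) that minimizes the
    absolute mean difference between the PAN image and gamma * I'."""
    diffs = [abs(np.mean(pan - g * intensity)) for g in candidates]
    return candidates[int(np.argmin(diffs))]
```

Because mean(PAN − γ·I') is linear in γ, the minimizer of this criterion is simply mean(PAN)/mean(I'); the grid search above only illustrates the selection rule.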
The proposed hybrid method for IKONOS image fusion is as
follows:
$$
\begin{aligned}
F(R)   &= R   + \sum_{k=1}^{n} W_{(\mathrm{PAN}-\gamma I')_k}\\
F(G)   &= G   + \sum_{k=1}^{n} W_{(\mathrm{PAN}-\gamma I')_k}\\
F(B)   &= B   + \sum_{k=1}^{n} W_{(\mathrm{PAN}-\gamma I')_k}\\
F(NIR) &= NIR + \sum_{k=1}^{n} W_{(\mathrm{PAN}-\gamma I')_k}
\end{aligned}
\tag{19}
$$

where $W_{(\mathrm{PAN}-\gamma I')_k}$ denotes the $k$-th detail plane of the difference image PAN − γ·I', and $n$ is the number of decomposition levels.
In sum, we employed the FIHS method to reduce the
computational cost of the original hybrid method and to simplify
its mathematical model, and we introduced the control parameter
and the framelet transform to enhance the overall performance of
the fused images.
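The resulting scheme can be sketched as follows (a simplified NumPy illustration: a separable box filter stands in for the framelet detail extraction, and I' is taken as the plain band mean rather than the paper's weighted intensity — both are our simplifying assumptions):

```python
import numpy as np

def box_blur(img, k=5):
    """Separable box filter standing in for one low-pass level of the
    framelet decomposition (illustrative substitute only)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, tmp)

def hybrid_fuse(ms_bands, pan, gamma, levels=2):
    """Sketch of Eq. (19): add the detail planes of (PAN - gamma * I')
    to every (upsampled) MS band."""
    intensity = np.mean(ms_bands, axis=0)   # simplified I'
    current = pan - gamma * intensity       # difference image to decompose
    details = np.zeros_like(pan)
    for _ in range(levels):                 # accumulate detail planes W_k
        low = box_blur(current)
        details += current - low
        current = low
    return [band + details for band in ms_bands]
```

Note that if PAN and γ·I' carry the same low-frequency content, the injected details add only high-frequency spatial structure, leaving each band's radiometry intact.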
4. EXPERIMENTAL STUDY AND ANALYSIS
To verify the performance of the proposed method, an IKONOS
PAN image and MS images of the Korean city of Daejeon,
which were acquired on 9 March 2002, were used. The
IKONOS imagery consists of a 1 m PAN image and four-band
4 m MS imagery; the data for this experiment therefore
comprised the PAN image and the four MS bands (R, G, B, and
NIR).
                        eIHS   eSW-Wavelets   eSWI-Trous   eFSWI-Framelets
bias (%)        R      16.68       0.00          0.00           0.00
(ideal: 0)      G      16.87       0.00          0.00           0.00
                B      18.85       0.00          0.00           0.00
                NIR    18.81       0.00          0.00           0.00
CC              R       0.95       0.96          0.97           0.97
(ideal: 1)      G       0.95       0.96          0.97           0.97
                B       0.93       0.95          0.96           0.96
                NIR     0.94       0.95          0.96           0.96
SD (%)          R      10.78       9.53          8.51           8.36
(ideal: 0)      G      10.39       8.62          7.84           7.66
                B      12.12      10.18          9.29           9.15
                NIR    11.27      10.13          8.77           8.79
sCC             R       0.99       0.94          0.98           0.98
                G       0.99       0.94          0.98           0.98
                B       0.99       0.93          0.98           0.98
                NIR     0.98       0.95          0.98           0.98
ERGAS                   5.26       2.41          2.15           2.12
SAM (deg.)              3.71       3.42          3.10           3.09
Q4                      0.58       0.92          0.94           0.94

Table 2. Comparison of IKONOS Fusion Results
In order to assess the quality of the fused MS images, reference
MS images with the same spatial resolution as the PAN image
were needed. However, since such MS images were unavailable,
spatially degraded PAN and MS images, generated by a lowpass
filtering and sub-sampling procedure, were used instead (Wald et
al., 1997).
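The degradation step can be sketched as follows (the box filter and simple decimation are our assumptions; the protocol only requires a suitable low-pass filter before sub-sampling):

```python
import numpy as np

def degrade(img, factor=4, k=5):
    """Low-pass filter, then sub-sample by `factor` (e.g. 4 for the
    1 m PAN -> 4 m scale change used with IKONOS)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    low = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, tmp)
    return low[::factor, ::factor]
```

The original 4 m MS images then serve as the reference against which the images fused at the degraded scale are compared.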
4.1 The Quality Indices for Quantitative Analysis
The spectral and spatial quality indices used for the quantitative
analysis follow (Ranchin et al., 2000; Alparone et al., 2004):
namely, the bias; the standard deviation (SD); the correlation
coefficient (CC); the relative dimensionless global error in
synthesis, known by its French name erreur relative globale
adimensionnelle de synthèse (ERGAS); the spectral angle mapper
(SAM); the global quality index, Q4; and the spatial correlation
index proposed by Zhou et al. (sCC).
Note that the bias, SD, CC, ERGAS, SAM, and Q4 are applied
at the degraded scale, whereas sCC is applied only at the full
scale, without any degradation.
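For reference, the per-band and global indices can be sketched as follows (the definitions follow the cited literature in their common form; the paper's exact normalizations may differ slightly, and Q4 and sCC are omitted for brevity):

```python
import numpy as np

def bias(ref, fused):
    """Relative difference of means, in percent (ideal value: 0)."""
    return 100.0 * (ref.mean() - fused.mean()) / ref.mean()

def cc(ref, fused):
    """Correlation coefficient between reference and fused band (ideal: 1)."""
    return np.corrcoef(ref.ravel(), fused.ravel())[0, 1]

def sd_diff(ref, fused):
    """Standard deviation of the difference image, relative to the mean
    of the reference, in percent (ideal value: 0)."""
    return 100.0 * np.std(ref - fused) / ref.mean()

def ergas(refs, fuseds, ratio=0.25):
    """ERGAS over a list of bands; ratio = PAN/MS pixel-size ratio (1/4)."""
    rel = [np.mean((r - f) ** 2) / (r.mean() ** 2) for r, f in zip(refs, fuseds)]
    return 100.0 * ratio * np.sqrt(np.mean(rel))

def sam_deg(refs, fuseds):
    """Mean spectral angle between reference and fused pixel vectors, degrees."""
    r = np.stack([b.ravel() for b in refs])    # (bands, pixels)
    f = np.stack([b.ravel() for b in fuseds])
    cos = (r * f).sum(0) / (np.linalg.norm(r, axis=0) * np.linalg.norm(f, axis=0))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))).mean()
```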
4.2 Quantitative Analysis
Table 2 shows the results of a comparative analysis of the
IKONOS image fusion with the seven quality indices.
The comparison presented in Table 2 shows that the extended
IHS method (eIHS) has lower CC and Q4 values but greater
bias, SD, ERGAS, and SAM values than the extended substitutive
wavelet method (eSW-Wavelets). Therefore, the spectral quality
of the images fused by the eSW-Wavelets method is better than
that of the images fused by the eIHS method. However, since the
sCC values of the eIHS method are greater than those of the
eSW-Wavelets method, the spatial quality of the images fused by
the eIHS method is better than that of the images fused by the
eSW-Wavelets method.
On the other hand, the hybrid of the eIHS and eSW methods
(eSWI-Trous) has lower bias, SD, ERGAS, and SAM values but
greater CC, sCC, and Q4 values than both the eIHS and the eSW
methods. Consequently, the spatial and spectral quality of the
images fused by the eSWI-Trous method is better than that of the
images fused by either the eIHS or the eSW method. That is to
say, the hybrid fusion method was able to eliminate the
drawbacks of the IHS- and wavelet-based methods while
retaining their advantages.
In addition, Table 2 shows that the proposed fast hybrid method
(eFSWI-Framelets) has lower bias, SD, ERGAS, and SAM
values but greater CC, sCC, and Q4 values than the eSWI-Trous
method. Therefore, the spatial and spectral quality of the images
fused by the eFSWI-Framelets method is better than that of the
images fused by the eSWI-Trous method. This is precisely why
the control parameter and the framelet transform are suggested
in this paper.
4.3 Visual Analysis
Figure 5 shows the full-scale visual fusion results. In Fig. 5(b),
aliasing artifacts induced during the interpolation process are
clearly visible. Such impairments disappear in the images fused
by the eIHS method and are easily explained by Eq. (15),