Full text: Proceedings; XXI International Congress for Photogrammetry and Remote Sensing (Part B7-3)

The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Vol. XXXVII. Part B7. Beijing 2008 
Blue, green, red bands (before, after fusion)
Blue, green, infrared bands (before, after fusion)
Figure 3. Input multispectral image (left) and fusion results (right) with the criteria-based approach
The criteria-based fusion method gives satisfactory fusion results. Visual evaluation of Figure 3 shows that it produces appealing results when both spatial detail enhancement and colour quality assurance are considered for the fused images. Because the fusion results are obtained from pre-defined criteria, their quality and properties are known. This can be treated as a general framework for image fusion, in which users can design their own fusion tools based on pre-selected criteria.
The proposed methods are also evaluated quantitatively. An N-band multispectral image is composed of spectral vectors whose elements are the gray values at the same pixel location in each band. SAM (Spectral Angle Mapper) denotes the absolute value of the angle between the two spectral vectors of an image pair. If the angle between these vectors is zero, there is no spectral distortion between the images. SAM is calculated in degrees or radians and averaged over the entire image to give a global metric of the spectral quality of the fused images (Alparone et al., 2007):
SAM = \arccos\left( \frac{\langle \mathbf{v}, \hat{\mathbf{v}} \rangle}{\|\mathbf{v}\| \, \|\hat{\mathbf{v}}\|} \right)   (10)
where v and v̂ are the spectral vectors at each pixel location in the multispectral and the fused images, respectively. Two spectral vectors in the original and the fused images may be parallel, but if their magnitudes differ, radiometric distortion is introduced. The shortcoming of SAM is that it is not capable of detecting such radiometric distortion in the fused images.
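As a sketch of how the per-pixel angle of Eq. 10 is averaged into a global score (a minimal NumPy illustration, assuming images stored as (rows, cols, bands) arrays; the function name is illustrative, not from the paper):

```python
import numpy as np

def sam_degrees(ms, fused):
    """Mean Spectral Angle Mapper between two multispectral images
    of shape (rows, cols, bands), returned in degrees (Eq. 10)."""
    v = ms.reshape(-1, ms.shape[-1]).astype(float)
    vh = fused.reshape(-1, fused.shape[-1]).astype(float)
    dot = np.sum(v * vh, axis=1)
    norms = np.linalg.norm(v, axis=1) * np.linalg.norm(vh, axis=1)
    # Clip guards against rounding pushing the ratio slightly outside [-1, 1]
    angles = np.arccos(np.clip(dot / norms, -1.0, 1.0))
    return np.degrees(angles.mean())
```

Note that scaling every spectral vector by the same factor leaves SAM at zero, which is exactly the radiometric blindness described above.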
Wald (2002) proposed a metric called ERGAS ("Erreur Relative Globale Adimensionnelle de Synthèse" in French), meaning "relative dimensionless global error in synthesis". The ERGAS index is given by (Alparone et al., 2007)
ERGAS = 100 \, \frac{h}{l} \sqrt{ \frac{1}{N} \sum_{k=1}^{N} \left( \frac{RMSE(k)}{\mu(k)} \right)^2 }   (11)
where h and l are the spatial resolutions of the panchromatic and the multispectral images, respectively. As an example, for QuickBird panchromatic and multispectral images, h/l is 1/4. N is the total number of multispectral bands and μ(k) is the mean of the k-th multispectral band. RMSE(k) is calculated between the k-th original and fused bands. Thus, ERGAS takes the difference in the mean values of the fused and reference images into account, and catches any radiometric distortion.
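Eq. 11 can be computed directly from the band-wise RMSEs (a minimal sketch, assuming a known resolution ratio h/l; the function name is illustrative):

```python
import numpy as np

def ergas(reference, fused, ratio=1 / 4):
    """ERGAS (Eq. 11) between reference and fused multispectral images
    of shape (rows, cols, bands); ratio = h/l, e.g. 1/4 for QuickBird."""
    ref = reference.astype(float)
    fus = fused.astype(float)
    n_bands = ref.shape[-1]
    acc = 0.0
    for k in range(n_bands):
        rmse_k = np.sqrt(np.mean((ref[..., k] - fus[..., k]) ** 2))
        mu_k = ref[..., k].mean()          # mean of the k-th reference band
        acc += (rmse_k / mu_k) ** 2
    return 100.0 * ratio * np.sqrt(acc / n_bands)
```

Unlike SAM, a pure mean shift between the fused and reference bands raises ERGAS above zero, since it enters through RMSE(k).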
Alparone et al. (2004) proposed an index called Q4 to assess the quality of four-band multispectral images by generalizing the Q index initially proposed by Wang and Bovik (2002) for monochromatic images. Q4 is obtained by calculating the correlation coefficient between quaternions as
Q4 = \frac{|\sigma_{z_1 z_2}|}{\sigma_{z_1} \sigma_{z_2}} \cdot \frac{2 \sigma_{z_1} \sigma_{z_2}}{\sigma_{z_1}^2 + \sigma_{z_2}^2} \cdot \frac{2 |\bar{z}_1| |\bar{z}_2|}{|\bar{z}_1|^2 + |\bar{z}_2|^2}   (12)
The gray values of each spectral vector in the four-band reference and fused images constitute the quaternions z_1 and z_2, respectively. |σ_{z_1 z_2}| is the modulus of the covariance between z_1 and z_2, σ_{z_1} and σ_{z_2} are the standard deviations of z_1 and z_2, and |z̄_1| and |z̄_2| are the moduli of the expectations of z_1 and z_2. In Eq. 12, the first component is the hypercomplex correlation coefficient between the two spectral pixel vectors. The second and third components measure the contrast changes and the mean bias on all bands simultaneously (Alparone et al., 2004). For this reason, Q4 is the most complete index for evaluating fusion results in terms of both spatial and spectral quality. The range of the Q4 index is [0, 1], where 1 denotes that the two images are identical.
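Q4 inherits its three-factor structure from the scalar Q index of Wang and Bovik (2002) that it generalizes. A minimal sketch of that scalar, single-band version (not the full quaternion Q4; the function name is illustrative):

```python
import numpy as np

def q_index(x, y):
    """Scalar universal image quality index Q (Wang & Bovik, 2002):
    the product of correlation, mean-bias, and contrast terms."""
    x = x.astype(float).ravel()
    y = y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = np.mean((x - mx) * (y - my))
    corr = cov / np.sqrt(vx * vy)                 # structural correlation
    luminance = 2 * mx * my / (mx**2 + my**2)     # mean-bias term
    contrast = 2 * np.sqrt(vx * vy) / (vx + vy)   # contrast-change term
    return corr * luminance * contrast
```

Q4 replaces the scalar pixel values with four-component quaternions so that inter-band correlation enters the first factor, which a band-by-band Q cannot capture.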
In the quantitative evaluation, ERGAS and Q4 need a reference multispectral image at the resolution of the fused images. However, no reference multispectral image at high spatial resolution exists prior to fusion. To solve this problem, Laporterie-Dejean et al. (2005) obtain the reference images by simulating the sensor with high-resolution data from an airborne platform (Alparone et al., 2007). On the other hand, Wald (2002) degrades the original panchromatic and multispectral images to a lower resolution in order to compare the fused product to the original multispectral image (Alparone et al., 2007).