Resource and environmental monitoring

  
3 DATA FUSION BETWEEN 1 m PANCHROMATIC AND 4 m COLOR IMAGERY
For local environmental applications we need both sufficient spectral resolution (i.e., at least three distinct spectral bands such as G, R, NIR) and good spatial resolution. Therefore, a fusion between the well resolved panchromatic and the four times coarser multispectral satellite images is necessary.
Numerous efforts have been made to fuse data received from the same scene by different sensors (see e.g. Shen et al. (1994), Darvishsefat (1995), Zhukov et al. (1995), Patterson et al. (1996), Peytavin (1996), Zhukov et al. (1996)). Particular attention has been paid to the merging of image data with differing spatial resolution. If the two sensors do not operate from the same platform, the geometric rectification and registration of the images is a prerequisite to data fusion. Experience has shown this to be a relatively easy process for satellite imagery (stable orbits and altitudes, small swath angles), but a rather cumbersome procedure for airborne line scanner imagery (flight track and altitude variations). In the particular case of the expected high resolution satellites this is not an issue, since both the panchromatic and the multispectral image data are recorded from the same platform at the same time.
3.1 Interpolation of the MS_4m Imagery
The first step is the magnification of the 4 m pixel multispectral image by a factor of 4 in order to yield a multispectral 4 m resolution MS_4m image with the same number of pixels as the panchromatic 1 m resolution PAN_1m image. This requires a resampling where the missing pixels are filled with nearest neighbor values, so that the resulting MS_4m image clearly shows the underlying 4 x 4 pixel structure (see Fig. 1, lower image). Alternatively, using an interpolation scheme for computation of the missing pixels instead of nearest neighbor values yields a much more satisfactory MS_interp image (see Fig. 3, top). For this paper we employed two-dimensional interpolation using tensor-product cubic B-splines (de Boor 1978).
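The two resampling variants above can be sketched as follows. This is a minimal illustration using scipy's spline-based zoom; the function name `upsample_ms` and the toy 4 x 4 band are our own, not from the paper. Spline order 0 corresponds to the blocky nearest-neighbor MS_4m image, order 3 to the cubic B-spline interpolated MS_interp image:

```python
import numpy as np
from scipy import ndimage

def upsample_ms(band_4m, factor=4, order=3):
    """Magnify one multispectral band by `factor`.

    order=0 -> nearest neighbor fill (shows the 4 x 4 pixel structure),
    order=3 -> cubic B-spline interpolation (smooth MS_interp image).
    """
    return ndimage.zoom(band_4m.astype(float), factor, order=order)

# Toy 4 x 4 "4 m" band, upsampled to the 16 x 16 "1 m" pixel grid.
band = np.arange(16, dtype=float).reshape(4, 4)
nearest = upsample_ms(band, order=0)   # blocky, no new information
interp = upsample_ms(band, order=3)    # smooth spline surface
```

In practice this would be applied per band to the full MS_4m image before fusion.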
3.2 Fusion by IHS Transformation 
An often applied fusion procedure is the merging of panchromatic with three-color RGB imagery in the IHS (or HSV) color space (Kraus 1990, Albertz 1991). The RGB image is transformed into Intensity, Hue, and Saturation; the Intensity is replaced by the high resolution panchromatic image; and then the image is transformed back into the RGB color space. This technique is known to work well for moderate resolution ratios (such as 1:3 for SPOT + LANDSAT TM, 10 m and 30 m). The results are still helpful but less reliable for resolution ratios such as 1:20, e.g. for fusion of SPOT color images with panchromatic aerial photography (Ersbøll et al. 1997).
We note, however, that fusion by HSV transformation can be applied only to multispectral imagery consisting of three bands, since the image has to be coded as an RGB image before fusion can take place. Moreover, Prinz et al. (1997) have shown that the IHS fusion results are clearly inferior to fusion using relative spectral contributions.
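The IHS/HSV procedure just described can be sketched as follows. The function name and array conventions are our own, and a real implementation would use a vectorized color-space conversion rather than the per-pixel stdlib `colorsys` calls used here for clarity; we assume the RGB image has already been upsampled to the panchromatic grid and that all values lie in [0, 1]:

```python
import colorsys
import numpy as np

def ihs_fusion(rgb_up, pan_1m):
    """Replace the intensity (value) channel of the upsampled RGB image
    with the high resolution panchromatic image, then convert back.

    rgb_up: (H, W, 3) float array in [0, 1], on the PAN pixel grid.
    pan_1m: (H, W)    float array in [0, 1].
    """
    fused = np.empty_like(rgb_up)
    rows, cols = pan_1m.shape
    for i in range(rows):
        for j in range(cols):
            h, s, _v = colorsys.rgb_to_hsv(*rgb_up[i, j])
            # keep hue and saturation, substitute panchromatic brightness
            fused[i, j] = colorsys.hsv_to_rgb(h, s, pan_1m[i, j])
    return fused

rgb = np.full((2, 2, 3), 0.5)                  # neutral gray test image
pan = np.array([[0.25, 0.5], [0.75, 1.0]])
fused = ihs_fusion(rgb, pan)                   # gray input: fused channels equal pan
```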
3.3 Fusion by Relative Spectral Contributions 
A simple fusion method which preserves the relative spectral contributions of each pixel but replaces its overall brightness by that of the high resolution panchromatic image works as follows:

(1)  PSM'_1m = MS_interp * (PAN_1m / PAN_interp)    (for each pixel and each spectral band)

(2)  PSM_1m = PSM'_1m * ⟨MS_4m⟩ / ⟨PSM'_1m⟩
where PSM_1m is the resulting panchromatic-sharpened multispectral 1 m image, MS_interp is the cubic spline-interpolated image of a particular spectral band of the multispectral image set, PAN_1m is the high resolution panchromatic image, and PAN_interp is a panchromatic image created by averaging the three bands of the interpolated multispectral image MS_interp.
The first equation describes the replacement of the coarsely resolved panchromatic image PAN_interp by the finely resolved panchromatic image PAN_1m. Then, for each spectral band, the temporary result PSM'_1m is adjusted such that the mean spectral value ⟨PSM_1m⟩ of the final fusion result PSM_1m is the same as the mean ⟨MS_4m⟩ for the spectral band of the original MS_4m image.
In the first equation the quantity MS_interp / PAN_interp describes the relative contribution of each spectral band to the overall brightness of a pixel. This relative contribution, i.e., the chromatic information, is preserved under the multiplication with the spatially better resolved panchromatic PAN_1m image.
The result of an application of this technique to the exemplary image shown here can be seen in Fig. 3 (lower image).
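Equations (1) and (2) can be sketched in a few lines of array code. The function name `pan_sharpen` and the small `eps` guard against division by zero are our additions; the per-band means of the original MS_4m image are passed in precomputed:

```python
import numpy as np

def pan_sharpen(ms_interp, pan_1m, ms_4m_means, eps=1e-12):
    """Fusion preserving relative spectral contributions, eqs. (1) and (2).

    ms_interp:   (H, W, B) spline-interpolated multispectral bands
    pan_1m:      (H, W)    high resolution panchromatic image
    ms_4m_means: (B,)      per-band means of the original MS_4m image
    """
    # Synthetic coarse panchromatic: band average of the interpolated MS image
    pan_interp = ms_interp.mean(axis=2)
    # (1) inject the high resolution brightness, keeping each pixel's band ratios
    psm = ms_interp * (pan_1m / (pan_interp + eps))[..., None]
    # (2) align each band's mean to the original MS_4m band mean
    psm *= ms_4m_means / (psm.mean(axis=(0, 1)) + eps)
    return psm

rng = np.random.default_rng(0)
ms = rng.uniform(0.1, 1.0, (8, 8, 3))      # stand-in MS_interp image
pan = rng.uniform(0.1, 1.0, (8, 8))        # stand-in PAN_1m image
means = np.array([0.4, 0.5, 0.6])          # stand-in ⟨MS_4m⟩ per band
out = pan_sharpen(ms, pan, means)
```

After step (2) each band mean of the fused image matches the corresponding original MS_4m band mean, as required.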
3.4 Spectral Class Specific Fusion 
The simple fusion method described above can be refined by a modification of equation (2). First the original coarsely resolved multispectral MS_4m image is classified into k spectral classes using an unsupervised k-means clustering algorithm (Richards 1993, Wiemker 1997).
The alignment of the mean spectral values is then carried out for each spectral band and each spectral class individually (c = 1 ... k). Let r_l(x_c) be the reflectance value of spectral band l for a specific pixel x_c belonging to class ω_c. Then the reflectance values are aligned to the mean value ⟨r_l(MS_4m)⟩_ωc of their specific spectral class ω_c:

(3)  r_l(PSM_1m) = r_l(PSM'_1m) * ⟨r_l(MS_4m)⟩_ωc / ⟨r_l(PSM'_1m)⟩_ωc
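The class-specific alignment of equation (3) can be sketched as follows. The class labels would come from the k-means classification of the MS_4m image; the function name and the `eps` guard are our own:

```python
import numpy as np

def class_specific_alignment(psm_tmp, ms_4m_up, labels, n_classes, eps=1e-12):
    """Class-wise mean alignment, eq. (3): for each class omega_c and each
    band l, scale PSM'_1m so that its class mean matches the MS_4m class mean.

    psm_tmp:  (H, W, B) temporary fusion result PSM'_1m from eq. (1)
    ms_4m_up: (H, W, B) nearest-neighbor upsampled original MS_4m bands
    labels:   (H, W)    integer class labels from k-means on the MS_4m image
    """
    out = psm_tmp.copy()
    for c in range(n_classes):
        mask = labels == c
        if not mask.any():
            continue
        target = ms_4m_up[mask].mean(axis=0)    # ⟨r_l(MS_4m)⟩_ωc per band
        current = psm_tmp[mask].mean(axis=0)    # ⟨r_l(PSM'_1m)⟩_ωc per band
        out[mask] *= target / (current + eps)
    return out

rng = np.random.default_rng(1)
psm = rng.uniform(0.2, 1.0, (4, 4, 2))      # stand-in PSM'_1m, 2 bands
ms_up = rng.uniform(0.2, 1.0, (4, 4, 2))    # stand-in upsampled MS_4m
labels = np.zeros((4, 4), dtype=int)
labels[2:] = 1                              # two spectral classes
out = class_specific_alignment(psm, ms_up, labels, 2)
```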
International Archives of Photogrammetry and Remote Sensing, Vol. XXXII, Part 7, Budapest, 1998, p. 287