Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects

International Archives of Photogrammetry and Remote Sensing, Vol. 32, Part 7-4-3 W6, Valladolid, Spain, 3-4 June, 1999
construction of the viewing geometry during image acquisition. This allows terrain-induced distortions to be corrected if a digital elevation model (DEM) is available (Cheng et al., 1995).
Nevertheless, the combination of data with different spatial resolutions is of benefit for visual interpretation. When the images to be co-registered differ strongly in spatial or spectral terms, identifying features (lines or areas) leads to more accurate results than measuring points only. The information contributed by the two images facilitates the extraction and identification of features, and the human interpreter is able to cope with slight misregistration.
In spectral terms, the fusion of panchromatic and multispectral imagery does not pose particular problems because of the similar nature (visible and infrared spectrum) of the images involved. The matter becomes more critical when fusing VIR and SAR, since the two image types are influenced by different radiometric and geometric effects. The fusion of SAR and VIR not only combines disparate data but is also used to spatially enhance the imagery involved (Welch, 1984). In particular, speckle in SAR data has a strong impact on the interpretability of the fused result, and the application of speckle filters is a trade-off between speckle reduction and loss of detail.
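To make this trade-off concrete, the following minimal sketch applies a basic Lee-type speckle filter; it is an illustration only, not a method used in the paper, and the window size and assumed noise variance are hypothetical parameters. A larger window suppresses more speckle but also blurs more scene detail.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(sar_image, window=7, noise_var=0.25):
    """Basic Lee-type speckle filter (illustrative sketch).

    window:    side length of the local estimation window; larger values
               remove more speckle but also more scene detail.
    noise_var: assumed speckle variance of the (normalised) SAR image.
    """
    img = np.asarray(sar_image, dtype=np.float64)
    local_mean = uniform_filter(img, size=window)
    local_sq_mean = uniform_filter(img * img, size=window)
    local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)
    # The weight tends to 1 in textured areas (edges are kept) and to 0
    # in homogeneous areas (speckle is averaged out).
    weight = local_var / (local_var + noise_var)
    return local_mean + weight * (img - local_mean)
```

Running such a filter with increasing window sizes shows the trade-off described above: homogeneous areas become smoother while linear features progressively lose contrast.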
5. RESOLUTION MERGE EXAMPLES AND CONCLUSIONS
Image fusion is used in a broad variety of applications: geology, land use / agriculture / forestry, change detection, map updating and hazard monitoring, to name just a few. The possibility of increasing the spatial resolution while retaining the important information contained in the different spectral bands is of benefit for most of these applications.
There are many publications suggesting how to fuse high-resolution panchromatic images with lower-resolution multispectral data to obtain high-resolution multispectral imagery. Details can be found in Simard² (1982), Cliche et al. (1985), Pradines (1986), Price (1987), Welch and Ehlers (1987), Carper et al. (1990), Ehlers (1991), Mangolini et al. (1993), Munechika et al. (1993) and Pellemans et al. (1993).
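As a hedged illustration of one of these approaches, the sketch below follows the intensity-substitution idea described by Carper et al. (1990), using an HSV transform as a simple stand-in for the IHS transform and a linear match of the panchromatic band to the intensity component. The function name, the scaling to [0, 1] and the use of scikit-image are assumptions of this sketch, not part of the cited work.

```python
import numpy as np
from skimage.color import rgb2hsv, hsv2rgb
from skimage.transform import resize

def ihs_style_merge(ms_rgb, pan):
    """Resolution merge by intensity substitution (illustrative sketch).

    ms_rgb: (h, w, 3) multispectral bands, scaled to [0, 1]
    pan:    (H, W) panchromatic band on the finer grid, scaled to [0, 1]
    Returns an (H, W, 3) image carrying the colour of the multispectral
    bands and the spatial detail of the panchromatic band.
    """
    # Resample the multispectral bands to the panchromatic grid.
    ms_up = resize(ms_rgb, (*pan.shape, 3), order=1, anti_aliasing=False)
    hsv = rgb2hsv(ms_up)
    intensity = hsv[..., 2]
    # Linearly match the pan band to the intensity component so the
    # substitution does not shift the overall radiometry too strongly.
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-12) * intensity.std() + intensity.mean()
    hsv[..., 2] = np.clip(pan_matched, 0.0, 1.0)
    return hsv2rgb(hsv)
```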
It has been shown that even the fusion of spatially very different datasets can increase interpretability; an operational example is the combination of space photography with pushbroom scanner data, e.g. Russian imagery and SPOT XS (Pohl and Touron, 1999).
The resolution merge is one of the few generally accepted and operational fusion techniques at pixel level. This is evident from the availability of fused products from image providers as well as of processing modules in COTS software packages. The fused products are suitable for visual interpretation and for further computer-aided processing. The latter is reasonable only for techniques whose output radiometric values remain close to the original input data (e.g. ARSIS).
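ARSIS itself relies on a multiresolution (wavelet) analysis; as a much simpler, hedged illustration of why such techniques keep the radiometry close to the input, the sketch below injects only the high-frequency part of the panchromatic band into an upsampled multispectral band. It is not the ARSIS method, and the kernel size is an assumed parameter.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.transform import resize

def hpf_injection_merge(ms_band, pan, kernel=5):
    """High-pass-filter injection merge (illustrative sketch, not ARSIS).

    ms_band: (h, w) single multispectral band
    pan:     (H, W) panchromatic band on the finer grid
    kernel:  size of the low-pass window used to isolate the pan detail
    """
    ms_up = resize(ms_band, pan.shape, order=1, anti_aliasing=False)
    # Only the spatial detail of the pan band is added; its low-frequency
    # content is removed, so the merged band stays close to ms_band.
    pan = np.asarray(pan, dtype=np.float64)
    pan_detail = pan - uniform_filter(pan, size=kernel)
    return ms_up + pan_detail
```

Because the injected term has nearly zero local mean, averaging the merged band back to the coarse grid roughly reproduces the original multispectral values, which is the property that makes such products usable for subsequent computer-aided analysis.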
New and planned satellite systems already take the benefit of resolution-merged products into account by carrying sensors of different spatial resolution on the same platform. Operational satellites are LANDSAT-7, SPOT-4 and IRS-1C; ESA is also planning a multisensor, multiresolution satellite called ENVISAT.
REFERENCES 
Carper, W.J., Lillesand, T.M., and Kiefer, R.W., 1990. The use 
of Intensity-Hue-Saturation transformations for merging SPOT 
Panchromatic and multispectral image data. Photogrammetric 
Engineering & Remote Sensing, 56(4), pp. 459-467. 
Chavez, P.S., Sides, S.C., and Anderson, J.A., 1991. 
Comparison of three different methods to merge multiresolution 
and multispectral data: TM & SPOT Pan. Photogrammetric 
Engineering & Remote Sensing, 57(3), pp. 295-303. 
Cheng, P., Toutin, T. and Pohl, C., 1995. A comparison of 
geometric models for multisource data fusion. In: Proceedings 
of Geoinformatics '95 - International Symposium on ‘RS, GIS 
& GPS in Sustainable Development & Environmental 
Monitoring’, Hong Kong, pp. 11-17. 
Chiesa, C.C., and Tyler, W.A., 1990. Data fusion of off-nadir 
SPOT panchromatic images with other digital data sources. In: 
Technical Papers 1990, ACSM-ASPRS Annual Convention, 
Image Processing and Remote Sensing, 4, pp. 86-98. 
Cliche, G., Bonn, F. and Teillet, P., 1985. Integration of the 
SPOT Pan channel into its multispectral mode for image 
sharpness enhancement. Photogrammetric Engineering & 
Remote Sensing, 51(3), pp. 311-316. 
Ehlers, M., 1991. Multisensor image fusion techniques in remote sensing. ISPRS Journal of Photogrammetry and Remote Sensing, 46(1), pp. 19-30.
Franklin, S.E. and Blodgett, C.F., 1993. An example of satellite multisensor data fusion. Computers & Geosciences, 19(4), pp. 577-583.
Gillespie, A.R., Kahle, A.B. and Walker, R.E., 1986. Colour enhancement of highly correlated images: I. Decorrelation and HSI contrast stretches. Remote Sensing of Environment, 20(3), pp. 209-235.
Hallada, W.A. and Cox, S., 1983. Image sharpening for mixed spatial and spectral resolution satellite systems. In: Proceedings of the 17th International Symposium on Remote Sensing of Environment, 9-13 May, pp. 1023-1032.
² Simulated data.