Full text: Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects

International Archives of Photogrammetry and Remote Sensing, Vol. 32, Part 7-4-3 W6, Valladolid, Spain, 3-4 June, 1999 
Due to the frequent cloud cover, visual imagery for the area of 
interest is often not available. In addition, the re-visit time of 
the individual satellites alone is not sufficient to monitor fast 
changes, e.g. as they occur in a harbour such as Rotterdam, the 
Netherlands. Using an image of the relatively high resolution 
Indian Satellite IRS-1C PAN (~ 6 m), the interpretation of an 
oil storage facility containing tanks with floating roofs has been 
performed. The cloud-free image depicts the level of fluid 
available in the different tanks. In case of clouds, this 
information cannot be obtained. Since SAR sensors are active 
instruments with a wavelength that can penetrate clouds, they 
can acquire images at any time of the day or year. For 
comparison, a multi-temporal composite of three different 
Radarsat images has been visually analysed. The analysis suggests that the backscatter returned to the sensor becomes stronger as the fluid level in the tank decreases; the explanation lies in the increased corner reflector effect at low fluid levels (see Fig. 2). Using more imagery and measurements of the
point targets in the SAR images could lead to the establishment 
of a relationship between strength of backscatter and level of oil 
storage. The available knowledge of the location of the tanks, 
based on one cloud-free optical image, facilitates the approach. 
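The relationship suggested above between backscatter strength and fill level could be established, for example, by a least-squares fit over measured point targets. The sketch below uses invented numbers purely for illustration; the paper only hypothesises that lower fluid levels give stronger returns.

```python
# Sketch: fitting a backscatter-vs-fill-level relationship from SAR point
# targets. All values are synthetic illustrations, not measured data.
import numpy as np

def fit_backscatter_model(fill_level_pct, backscatter_db):
    """Least-squares line through (fill level, backscatter) samples."""
    slope, intercept = np.polyfit(fill_level_pct, backscatter_db, deg=1)
    return slope, intercept

# Hypothetical measurements: fuller tank -> weaker corner reflector -> lower dB.
levels = np.array([10.0, 30.0, 50.0, 70.0, 90.0])  # fill level in %
sigma0 = np.array([-2.1, -4.0, -6.2, -8.1, -9.9])  # backscatter in dB

slope, intercept = fit_backscatter_model(levels, sigma0)
print(f"slope = {slope:.3f} dB per % fill")  # a negative slope supports the hypothesis
```

With enough such samples, the fitted line could then be inverted to estimate storage level from backscatter alone on cloudy days.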
In another case, a SAR image has been used to study differences between tank covers. The tanks had been identified from optical imagery, which was not sufficient to distinguish floating from conical roofs. The SAR image, on the other hand, contributed this information due to the differences in backscatter from the two types of roofs.
Fig. 2. Backscatter from filled (left) and almost empty (right) oil 
tank. 
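The roof-type discrimination described above amounts to separating tanks by their SAR backscatter statistics. The following is a minimal sketch; the threshold, the direction of the difference, and the sample values are assumptions for illustration, since the paper only notes that the two roof types backscatter differently.

```python
# Sketch: separating roof types by mean backscatter over each tank footprint.
# Threshold and sample values are hypothetical.
import numpy as np

def classify_roof(tank_pixels_db, threshold_db=-5.0):
    """Label a tank by comparing its mean backscatter with a chosen threshold."""
    mean_db = float(np.mean(tank_pixels_db))
    return "floating" if mean_db > threshold_db else "conical"

# Hypothetical per-tank pixel samples extracted at tank locations known
# from the optical image.
tank_a = np.array([-2.0, -3.1, -2.5])  # strong return (assumed floating roof)
tank_b = np.array([-8.4, -7.9, -9.0])  # weak return (assumed conical roof)
print(classify_roof(tank_a), classify_roof(tank_b))
```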
Experience at the WEUSC, where visual image interpretation is the major exploitation element, has shown that image fusion
plays a vital role in the facilitation of feature detection, 
recognition and identification. The main criterion for the choice 
of images for image fusion is the contribution of complementary 
information contained in each individual image. If the 
operational process of image fusion succeeds in maintaining the
information contribution of the individual images in the fused 
product, it is not necessary to evaluate the individual images 
alone. The image analyst can understand the feature and its 
context much faster in the fused product than by looking at the
individual images. This speeds up the exploitation process and 
improves the reliability of the obtained interpretation results. 
Image fusion does not produce information that is not already 
contained in the original images, but it provides a means to
enhance certain features and their environment in order to draw 
the attention of the human interpreter to relevant aspects. 
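The principle that fusion preserves and combines, rather than creates, information can be sketched as a simple pixel-based blend of co-registered inputs. The weighting scheme and array shapes below are illustrative assumptions, not the specific fusion method used operationally.

```python
# Sketch of pixel-based fusion: a co-registered SAR band is blended into an
# optical band so that complementary information from both sources survives
# in one product. Blend weight and image sizes are illustrative.
import numpy as np

def fuse(optical, sar, weight=0.5):
    """Blend a co-registered SAR band into an optical band, both scaled 0..1."""
    assert optical.shape == sar.shape, "inputs must be co-registered"
    fused = (1.0 - weight) * optical + weight * sar
    return np.clip(fused, 0.0, 1.0)

optical = np.random.rand(4, 4)  # stand-in for a normalised PAN band
sar = np.random.rand(4, 4)      # stand-in for a normalised SAR amplitude band
product = fuse(optical, sar, weight=0.4)
```

Every value in the fused product is a combination of the input pixels, which is exactly why the fused product can only enhance, never add, information.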
The most time consuming part in pixel-based image fusion is 
the image registration, i.e. the identification of GCPs. This is 
especially true for spatially very different data or VIR/SAR
registration. A special registration tool has been developed to 
address this part of the processing chain. This tool reduces the time needed for registration and improves its accuracy, because it allows not only GCPs but also linear and area features to be used in establishing the geometric model. Furthermore, it introduces a standard in
the processing chain, which is essential in an operational 
environment. In parallel, the WEUSC allocates resources to 
research and technical development focusing on the automation of this process. A first prototype exists that integrates sensor models and a priori information provided with the image data. At present the system can process SPOT, IRS-1, MK-4
and Radarsat imagery. 
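The geometric model established from control points can be sketched as a least-squares fit; a 2-D affine transform is used here as a minimal assumed model, with invented point pairs, since the paper does not detail the tool's internal model (which additionally accepts linear and area features).

```python
# Sketch: estimate a 2-D affine transform mapping slave-image coordinates
# onto the master image by least squares over GCP pairs. Model choice and
# point pairs are illustrative assumptions.
import numpy as np

def fit_affine(src, dst):
    """Solve dst ~= A @ [x, y, 1] for a 2x3 affine matrix A."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    ones = np.ones((len(src), 1))
    design = np.hstack([src, ones])                  # n x 3 design matrix
    coeffs, *_ = np.linalg.lstsq(design, dst, rcond=None)
    return coeffs.T                                  # 2 x 3 affine matrix

# Hypothetical GCPs describing a pure translation of (+5, -3) pixels.
src = [(0, 0), (100, 0), (0, 100), (100, 100)]
dst = [(5, -3), (105, -3), (5, 97), (105, 97)]
A = fit_affine(src, dst)
print(np.round(A, 2))  # ~[[1, 0, 5], [0, 1, -3]]
```

At least three non-collinear GCPs determine the six affine parameters; additional points over-determine the system and the least-squares residuals give a measure of registration quality.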
A major element of the operational implementation of image 
fusion with respect to visual image interpretation is the 
interactive component. The operator needs to be capable of 
empirically tuning individual parameters involved in the fusion 
process or in the enhancement of input data as well as the end 
product. 
4. RESULTS 
Operational image exploitation demands speed and quality, as well as reliability of results. However, quality
and accuracy should be seen in the context of requirements. 
Depending on the application and time constraints resulting 
from the operational environment, images will not always be 
processed to the highest accuracy level. The approach and 
complexity of image processing is defined on the basis of the 
needs expressed. This is very important for an optimization of 
resources. 
A problem that often reduces the speed of processing is the limited availability of ancillary data describing the imagery, as well as of ground truth. The image analyst therefore uses the fusion approach with
care to fully understand the nature of the fused product in order 
to draw proper conclusions. 
5. CONCLUSIONS 
The experiences gained show very clearly that a major element 
of the operational implementation of image fusion with respect 
to visual image interpretation is the interactive component. The 
operator needs to be capable of empirically tuning individual 
parameters involved in the fusion process or in the enhancement 
of input data as well as the end product. The fine tuning of the 
image enhancement parameters, i.e. histogram value 
distribution, filter, assignment of colours etc., influences the 
success of the fusion itself. A small interactive window, 
containing a representative subset of the image to be processed, 
has proven to be an excellent aid to support the determination of 
the 'right' values for the parameters. This window shows in real
time the effect of changes in the parameters on the input data or 
the fused product, depending on the selection of the data to be displayed.
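The interactive preview idea above can be sketched as tuning enhancement parameters on a small representative subset before applying them to the full image. The linear stretch below stands in for the histogram adjustment mentioned; its limits and the window position are illustrative assumptions.

```python
# Sketch: tune stretch parameters on a small subset for fast feedback, then
# apply the accepted values to the whole image. Values are illustrative.
import numpy as np

def linear_stretch(image, low, high):
    """Clip to [low, high] and rescale linearly to 0..255."""
    stretched = (np.clip(image, low, high) - low) / (high - low)
    return (stretched * 255.0).astype(np.uint8)

full_image = np.random.rand(512, 512) * 100.0
subset = full_image[200:264, 200:264]            # small window for fast feedback

preview = linear_stretch(subset, low=10.0, high=80.0)    # analyst tweaks low/high
final = linear_stretch(full_image, low=10.0, high=80.0)  # then applies to all
```

Because the subset is small, the preview can be recomputed at interactive rates, which is what lets the analyst converge on suitable parameter values quickly.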