
Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects

Access restriction

There is no access restriction for this record.

Copyright

CC BY: Attribution 4.0 International.

Bibliographic data


Monograph

Persistent identifier:
856473650
Author:
Baltsavias, Emmanuel P.
Title:
Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects
Sub title:
Joint ISPRS/EARSeL Workshop ; 3 - 4 June 1999, Valladolid, Spain
Scope:
III, 209 pages
Year of publication:
1999
Place of publication:
Coventry
Publisher of the original:
RICS Books
Identifier (digital):
856473650
Illustration:
Illustrations, diagrams, maps
Language:
English
Usage licence:
Attribution 4.0 International (CC BY 4.0)
Publisher of the digital copy:
Technische Informationsbibliothek Hannover
Place of publication of the digital copy:
Hannover
Year of publication of the digital copy:
2016
Document type:
Monograph
Collection:
Earth sciences

Chapter

Title:
TECHNICAL SESSION 5 FUSION OF VARIABLE SPATIAL / SPECTRAL RESOLUTION IMAGES
Document type:
Monograph
Structure type:
Chapter

Chapter

Title:
OPERATIONAL APPLICATIONS OF MULTI-SENSOR IMAGE FUSION. C. Pohl, H. Touron
Document type:
Monograph
Structure type:
Chapter

Contents

Table of contents

  • Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects
  • Cover
  • ColorChart
  • Title page
  • CONTENTS
  • PREFACE
  • TECHNICAL SESSION 1 OVERVIEW OF IMAGE / DATA / INFORMATION FUSION AND INTEGRATION
  • DEFINITIONS AND TERMS OF REFERENCE IN DATA FUSION. L. Wald
  • TOOLS AND METHODS FOR FUSION OF IMAGES OF DIFFERENT SPATIAL RESOLUTION. C. Pohl
  • INTEGRATION OF IMAGE ANALYSIS AND GIS. Emmanuel Baltsavias, Michael Hahn,
  • TECHNICAL SESSION 2 PREREQUISITES FOR FUSION / INTEGRATION: IMAGE TO IMAGE / MAP REGISTRATION
  • GEOCODING AND COREGISTRATION OF MULTISENSOR AND MULTITEMPORAL REMOTE SENSING IMAGES. Hannes Raggam, Mathias Schardt and Heinz Gallaun
  • GEORIS: A TOOL TO OVERLAY PRECISELY DIGITAL IMAGERY. Ph. Garnesson, D. Bruckert
  • AUTOMATED PROCEDURES FOR MULTISENSOR REGISTRATION AND ORTHORECTIFICATION OF SATELLITE IMAGES. Ian Dowman and Paul Dare
  • TECHNICAL SESSION 3 OBJECT AND IMAGE CLASSIFICATION
  • LANDCOVER MAPPING BY INTERRELATED SEGMENTATION AND CLASSIFICATION OF SATELLITE IMAGES. W. Schneider, J. Steinwendner
  • INCLUSION OF MULTISPECTRAL DATA INTO OBJECT RECOGNITION. Bea Csathó, Toni Schenk, Dong-Cheon Lee and Sagi Filin
  • SCALE CHARACTERISTICS OF LOCAL AUTOCOVARIANCES FOR TEXTURE SEGMENTATION. Annett Faber, Wolfgang Förstner
  • BAYESIAN METHODS: APPLICATIONS IN INFORMATION AGGREGATION AND IMAGE DATA MINING. Mihai Datcu and Klaus Seidel
  • TECHNICAL SESSION 4 FUSION OF SENSOR-DERIVED PRODUCTS
  • AUTOMATIC CLASSIFICATION OF URBAN ENVIRONMENTS FOR DATABASE REVISION USING LIDAR AND COLOR AERIAL IMAGERY. N. Haala, V. Walter
  • STRATEGIES AND METHODS FOR THE FUSION OF DIGITAL ELEVATION MODELS FROM OPTICAL AND SAR DATA. M. Honikel
  • INTEGRATION OF DTMS USING WAVELETS. M. Hahn, F. Samadzadegan
  • ANISOTROPY INFORMATION FROM MOMS-02/PRIRODA STEREO DATASETS - AN ADDITIONAL PHYSICAL PARAMETER FOR LAND SURFACE CHARACTERISATION. Th. Schneider, I. Manakos, Peter Reinartz, R. Müller
  • TECHNICAL SESSION 5 FUSION OF VARIABLE SPATIAL / SPECTRAL RESOLUTION IMAGES
  • ADAPTIVE FUSION OF MULTISOURCE RASTER DATA APPLYING FILTER TECHNIQUES. K. Steinnocher
  • FUSION OF 18 m MOMS-2P AND 30 m LANDSAT TM MULTISPECTRAL DATA BY THE GENERALIZED LAPLACIAN PYRAMID. Bruno Aiazzi, Luciano Alparone, Stefano Baronti, Ivan Pippi
  • OPERATIONAL APPLICATIONS OF MULTI-SENSOR IMAGE FUSION. C. Pohl, H. Touron
  • TECHNICAL SESSION 6 INTEGRATION OF IMAGE ANALYSIS AND GIS
  • KNOWLEDGE BASED INTERPRETATION OF MULTISENSOR AND MULTITEMPORAL REMOTE SENSING IMAGES. Stefan Growe
  • AUTOMATIC RECONSTRUCTION OF ROOFS FROM MAPS AND ELEVATION DATA. U. Stilla, K. Jurkiewicz
  • INVESTIGATION OF SYNERGY EFFECTS BETWEEN SATELLITE IMAGERY AND DIGITAL TOPOGRAPHIC DATABASES BY USING INTEGRATED KNOWLEDGE PROCESSING. Dietmar Kunz
  • INTERACTIVE SESSION 1 IMAGE CLASSIFICATION
  • AN AUTOMATED APPROACH FOR TRAINING DATA SELECTION WITHIN AN INTEGRATED GIS AND REMOTE SENSING ENVIRONMENT FOR MONITORING TEMPORAL CHANGES. Ulrich Rhein
  • CLASSIFICATION OF SETTLEMENT STRUCTURES USING MORPHOLOGICAL AND SPECTRAL FEATURES IN FUSED HIGH RESOLUTION SATELLITE IMAGES (IRS-1C). Maik Netzband, Gotthard Meinel, Regin Lippold
  • ASSESSMENT OF NOISE VARIANCE AND INFORMATION CONTENT OF MULTI-/HYPER-SPECTRAL IMAGERY. Bruno Aiazzi, Luciano Alparone, Alessandro Barducci, Stefano Baronti, Ivan Pippi
  • COMBINING SPECTRAL AND TEXTURAL FEATURES FOR MULTISPECTRAL IMAGE CLASSIFICATION WITH ARTIFICIAL NEURAL NETWORKS. H. He, C. Collet
  • TECHNICAL SESSION 7 APPLICATIONS IN FORESTRY
  • SENSOR FUSED IMAGES FOR VISUAL INTERPRETATION OF FOREST STAND BORDERS. R. Fritz, I. Freeh, B. Koch, Chr. Ueffing
  • A LOCAL CORRELATION APPROACH FOR THE FUSION OF REMOTE SENSING DATA WITH DIFFERENT SPATIAL RESOLUTIONS IN FORESTRY APPLICATIONS. J. Hill, C. Diemer, O. Stöver, Th. Udelhoven
  • OBJECT-BASED CLASSIFICATION AND APPLICATIONS IN THE ALPINE FOREST ENVIRONMENT. R. de Kok, T. Schneider, U. Ammer
  • Author Index
  • Keyword Index
  • Cover

Full text

International Archives of Photogrammetry and Remote Sensing, Vol. 32, Part 7-4-3 W6, Valladolid, Spain, 3-4 June, 1999 
Due to the frequent cloud cover, visual imagery for the area of 
interest is often not available. In addition, the re-visit time of 
the individual satellites alone is not sufficient to monitor fast 
changes, e.g. as they occur in a harbour such as Rotterdam, the 
Netherlands. Using an image of the relatively high resolution 
Indian Satellite IRS-1C PAN (~ 6 m), the interpretation of an 
oil storage facility containing tanks with floating roofs has been 
performed. The cloud-free image depicts the level of fluid 
available in the different tanks. In case of clouds, this 
information cannot be obtained. Since SAR sensors are active 
instruments with a wavelength that can penetrate clouds, they 
can acquire images at any time of the day or year. For 
comparison, a multi-temporal composite of three different 
Radarsat images has been visually analysed. It leads to the 
assumption that the backscatter returning to the sensor is 
stronger, the lower the fluid level in the tank. The explanation 
lies in the increased corner reflector effect with decreasing fluid
level (see Fig. 2). Using more imagery and measurements of the 
point targets in the SAR images could lead to the establishment 
of a relationship between strength of backscatter and level of oil 
storage. The available knowledge of the location of the tanks, 
based on one cloud-free optical image, facilitates the approach. 
In another case, a SAR image has been used to study the 
difference of tank covers. The tanks had been identified from 
optical imagery, which was not sufficient to distinguish floating 
from conical roofs. The SAR on the other hand contributed this 
information due to the differences in backscatter from the 
different types of roofs. 
Fig. 2. Backscatter from filled (left) and almost empty (right) oil 
tank. 
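Once point-target measurements from more imagery were available, the suggested relationship between backscatter strength and fluid level could be established by a simple regression. A minimal sketch with entirely hypothetical numbers (the text reports no measured values):

```python
import numpy as np

# Hypothetical point-target measurements: mean backscatter (dB) of each
# tank in a multi-temporal Radarsat composite, paired with the fill level
# (0..1) read from a cloud-free IRS-1C PAN image of the same tanks.
backscatter_db = np.array([-2.0, -4.5, -7.1, -9.8, -12.0])
fill_level     = np.array([0.05, 0.25, 0.50, 0.75, 0.95])

# Least-squares line: backscatter = a * fill_level + b.  The paper's
# observation predicts a negative slope (lower fluid level -> stronger
# corner-reflector return).
a, b = np.polyfit(fill_level, backscatter_db, 1)
assert a < 0  # stronger backscatter for emptier tanks
```

With real measurements the residuals of such a fit would indicate whether a linear model is adequate or a more detailed scattering model is needed.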
The experience at the WEUSC using visual image interpretation
as the major exploitation element has proven that image fusion
plays a vital role in the facilitation of feature detection, 
recognition and identification. The main criterion for the choice 
of images for image fusion is the contribution of complementary 
information contained in each individual image. If the 
operational process of image fusion succeeds in maintaining the
information contribution of the individual images in the fused 
product, it is not necessary to evaluate the individual images 
alone. The image analyst can understand the feature and its 
context much faster in the fused product than by looking at the
individual images. This speeds up the exploitation process and 
improves the reliability of the obtained interpretation results. 
Image fusion does not produce information that is not already 
contained in the original images. But it provides a means to
enhance certain features and their environment in order to draw 
the attention of the human interpreter to relevant aspects. 
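The fusion methods used operationally at the WEUSC are not named here. As one common pixel-based illustration of the principle just stated, enhancing spatial detail without adding information beyond the inputs, a Brovey-style ratio fusion can be sketched (array names and toy values are assumptions, not the paper's data):

```python
import numpy as np

def brovey_fuse(ms, pan, eps=1e-6):
    """Brovey-style pixel fusion: rescale each co-registered multispectral
    band by the ratio of the high-resolution panchromatic intensity to a
    crude multispectral intensity, injecting spatial detail while keeping
    the relative spectral contribution of each band unchanged."""
    intensity = ms.mean(axis=0)            # crude intensity from the MS bands
    return ms * (pan / (intensity + eps))  # ratio broadcast over all bands

# Toy 2x2 scene: two co-registered MS bands and a sharper pan channel.
ms  = np.array([[[0.2, 0.4], [0.6, 0.8]],
                [[0.1, 0.3], [0.5, 0.7]]])
pan = np.array([[0.3, 0.5], [0.7, 0.9]])
fused = brovey_fuse(ms, pan)
```

Because every band at a pixel is multiplied by the same ratio, the between-band ratios, i.e. the complementary spectral information of the inputs, survive in the fused product.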
The most time-consuming part in pixel-based image fusion is
the image registration, i.e. the identification of GCPs. This is 
especially true for spatially very different data or VER/SAR 
registration. A special registration tool has been developed to 
address this part of the processing chain. This tool reduces the
time needed for registration and improves its accuracy, because it
allows not only GCPs but also linear and area features in the
establishment of the geometric model. Furthermore, it introduces a standard in
the processing chain, which is essential in an operational 
environment. In parallel, the WEUSC allocates resources to 
research and technical development focusing on an automation 
of this process. A first prototype exists integrating sensor 
models and a priori information provided with the image data. 
At the moment the system can process SPOT, IRS-1, MK-4
and Radarsat imagery. 
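The registration tool itself is not described in detail. The GCP-based core of such a geometric model is classically a least-squares fit; a minimal sketch, assuming a plain 2-D affine transform and hypothetical point pairs (the tool's actual model, which also accepts linear and area features, is richer):

```python
import numpy as np

def fit_affine(src, dst):
    """Estimate a 2-D affine transform mapping source GCPs to reference
    GCPs by linear least squares (needs >= 3 non-collinear point pairs)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    A = np.hstack([src, np.ones((len(src), 1))])    # rows [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params                                   # 3x2: [x, y, 1] -> [x', y']

# Hypothetical GCP pairs: a pure shift of (+5, -3), no rotation or scale.
src = [(0, 0), (10, 0), (0, 10), (10, 10)]
dst = [(5, -3), (15, -3), (5, 7), (15, 7)]
M = fit_affine(src, dst)
```

The residuals of the fit give the analyst an immediate per-point accuracy check before the model is applied to the whole image.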
A major element of the operational implementation of image 
fusion with respect to visual image interpretation is the 
interactive component. The operator needs to be capable of 
empirically tuning individual parameters involved in the fusion 
process or in the enhancement of input data as well as the end 
product. 
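The tuning interface is not described beyond this requirement. One tunable enhancement step could be sketched as a percentile-based linear stretch recomputed on a small representative subset, so the operator sees each parameter change immediately (function and parameter names are assumptions for illustration):

```python
import numpy as np

def stretch_preview(subset, lo_pct=2.0, hi_pct=98.0):
    """Linear percentile stretch; recomputing it on a small representative
    subset keeps the feedback loop fast enough for interactive tuning."""
    lo, hi = np.percentile(subset, [lo_pct, hi_pct])
    return np.clip((subset - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

# Toy data standing in for a representative image subset.
subset = np.arange(256, dtype=float)
preview = stretch_preview(subset)   # values mapped into [0, 1]
```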
4. RESULTS 
Operational image exploitation means the achievement of speed 
and quality, as well as reliability of results. However, quality 
and accuracy should be seen in the context of requirements. 
Depending on the application and time constraints resulting 
from the operational environment, images will not always be 
processed to the highest accuracy level. The approach and 
complexity of image processing is defined on the basis of the 
needs expressed. This is very important for an optimization of 
resources. 
A problem that often reduces the speed of processing is the lack 
of availability of ancillary data describing the imagery as well as 
ground truth. The image analyst uses the fusion approach with 
care to fully understand the nature of the fused product in order 
to draw proper conclusions. 
5. CONCLUSIONS 
The experiences gained show very clearly that a major element 
of the operational implementation of image fusion with respect 
to visual image interpretation is the interactive component. The 
operator needs to be capable of empirically tuning individual 
parameters involved in the fusion process or in the enhancement 
of input data as well as the end product. The fine tuning of the 
image enhancement parameters, i.e. histogram value 
distribution, filter, assignment of colours etc., influences the 
success of the fusion itself. A small interactive window, 
containing a representative subset of the image to be processed, 
has proven to be an excellent aid to support the determination of 
the ’right’ values for the parameters. This window shows in real 
time the effect of changes in the parameters on the input data or 
the fused product, depending on the selection of the data to be

Citation recommendation

Baltsavias, Emmanuel P. Fusion of Sensor Data, Knowledge Sources and Algorithms for Extraction and Classification of Topographic Objects. RICS Books, 1999.