
Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects

Access restriction

There is no access restriction for this record.

Copyright

CC BY: Attribution 4.0 International.

Bibliographic data


Monograph

Persistent identifier:
856473650
Author:
Baltsavias, Emmanuel P.
Title:
Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects
Sub title:
Joint ISPRS/EARSeL Workshop ; 3 - 4 June 1999, Valladolid, Spain
Scope:
III, 209 pages
Year of publication:
1999
Place of publication:
Coventry
Publisher of the original:
RICS Books
Identifier (digital):
856473650
Illustration:
Illustrations, diagrams, maps
Language:
English
Usage licence:
Attribution 4.0 International (CC BY 4.0)
Publisher of the digital copy:
Technische Informationsbibliothek Hannover
Place of publication of the digital copy:
Hannover
Year of publication of the digital copy:
2016
Document type:
Monograph
Collection:
Earth sciences

Chapter

Title:
TECHNICAL SESSION 6 INTEGRATION OF IMAGE ANALYSIS AND GIS
Document type:
Monograph
Structure type:
Chapter

Chapter

Title:
KNOWLEDGE BASED INTERPRETATION OF MULTISENSOR AND MULTITEMPORAL REMOTE SENSING IMAGES. Stefan Growe
Document type:
Monograph
Structure type:
Chapter

Contents

Table of contents

  • Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects
  • Cover
  • ColorChart
  • Title page
  • CONTENTS
  • PREFACE
  • TECHNICAL SESSION 1 OVERVIEW OF IMAGE / DATA / INFORMATION FUSION AND INTEGRATION
  • DEFINITIONS AND TERMS OF REFERENCE IN DATA FUSION. L. Wald
  • TOOLS AND METHODS FOR FUSION OF IMAGES OF DIFFERENT SPATIAL RESOLUTION. C. Pohl
  • INTEGRATION OF IMAGE ANALYSIS AND GIS. Emmanuel Baltsavias, Michael Hahn,
  • TECHNICAL SESSION 2 PREREQUISITES FOR FUSION / INTEGRATION: IMAGE TO IMAGE / MAP REGISTRATION
  • GEOCODING AND COREGISTRATION OF MULTISENSOR AND MULTITEMPORAL REMOTE SENSING IMAGES. Hannes Raggam, Mathias Schardt and Heinz Gallaun
  • GEORIS: A TOOL TO OVERLAY PRECISELY DIGITAL IMAGERY. Ph. Garnesson, D. Bruckert
  • AUTOMATED PROCEDURES FOR MULTISENSOR REGISTRATION AND ORTHORECTIFICATION OF SATELLITE IMAGES. Ian Dowman and Paul Dare
  • TECHNICAL SESSION 3 OBJECT AND IMAGE CLASSIFICATION
  • LANDCOVER MAPPING BY INTERRELATED SEGMENTATION AND CLASSIFICATION OF SATELLITE IMAGES. W. Schneider, J. Steinwendner
  • INCLUSION OF MULTISPECTRAL DATA INTO OBJECT RECOGNITION. Bea Csathó, Toni Schenk, Dong-Cheon Lee and Sagi Filin
  • SCALE CHARACTERISTICS OF LOCAL AUTOCOVARIANCES FOR TEXTURE SEGMENTATION. Annett Faber, Wolfgang Förstner
  • BAYESIAN METHODS: APPLICATIONS IN INFORMATION AGGREGATION AND IMAGE DATA MINING. Mihai Datcu and Klaus Seidel
  • TECHNICAL SESSION 4 FUSION OF SENSOR-DERIVED PRODUCTS
  • AUTOMATIC CLASSIFICATION OF URBAN ENVIRONMENTS FOR DATABASE REVISION USING LIDAR AND COLOR AERIAL IMAGERY. N. Haala, V. Walter
  • STRATEGIES AND METHODS FOR THE FUSION OF DIGITAL ELEVATION MODELS FROM OPTICAL AND SAR DATA. M. Honikel
  • INTEGRATION OF DTMS USING WAVELETS. M. Hahn, F. Samadzadegan
  • ANISOTROPY INFORMATION FROM MOMS-02/PRIRODA STEREO DATASETS - AN ADDITIONAL PHYSICAL PARAMETER FOR LAND SURFACE CHARACTERISATION. Th. Schneider, I. Manakos, Peter Reinartz, R. Müller
  • TECHNICAL SESSION 5 FUSION OF VARIABLE SPATIAL / SPECTRAL RESOLUTION IMAGES
  • ADAPTIVE FUSION OF MULTISOURCE RASTER DATA APPLYING FILTER TECHNIQUES. K. Steinnocher
  • FUSION OF 18 m MOMS-2P AND 30 m LANDSAT TM MULTISPECTRAL DATA BY THE GENERALIZED LAPLACIAN PYRAMID. Bruno Aiazzi, Luciano Alparone, Stefano Baronti, Ivan Pippi
  • OPERATIONAL APPLICATIONS OF MULTI-SENSOR IMAGE FUSION. C. Pohl, H. Touron
  • TECHNICAL SESSION 6 INTEGRATION OF IMAGE ANALYSIS AND GIS
  • KNOWLEDGE BASED INTERPRETATION OF MULTISENSOR AND MULTITEMPORAL REMOTE SENSING IMAGES. Stefan Growe
  • AUTOMATIC RECONSTRUCTION OF ROOFS FROM MAPS AND ELEVATION DATA. U. Stilla, K. Jurkiewicz
  • INVESTIGATION OF SYNERGY EFFECTS BETWEEN SATELLITE IMAGERY AND DIGITAL TOPOGRAPHIC DATABASES BY USING INTEGRATED KNOWLEDGE PROCESSING. Dietmar Kunz
  • INTERACTIVE SESSION 1 IMAGE CLASSIFICATION
  • AN AUTOMATED APPROACH FOR TRAINING DATA SELECTION WITHIN AN INTEGRATED GIS AND REMOTE SENSING ENVIRONMENT FOR MONITORING TEMPORAL CHANGES. Ulrich Rhein
  • CLASSIFICATION OF SETTLEMENT STRUCTURES USING MORPHOLOGICAL AND SPECTRAL FEATURES IN FUSED HIGH RESOLUTION SATELLITE IMAGES (IRS-1C). Maik Netzband, Gotthard Meinel, Regin Lippold
  • ASSESSMENT OF NOISE VARIANCE AND INFORMATION CONTENT OF MULTI-/HYPER-SPECTRAL IMAGERY. Bruno Aiazzi, Luciano Alparone, Alessandro Barducci, Stefano Baronti, Ivan Pippi
  • COMBINING SPECTRAL AND TEXTURAL FEATURES FOR MULTISPECTRAL IMAGE CLASSIFICATION WITH ARTIFICIAL NEURAL NETWORKS. H. He , C. Collet
  • TECHNICAL SESSION 7 APPLICATIONS IN FORESTRY
  • SENSOR FUSED IMAGES FOR VISUAL INTERPRETATION OF FOREST STAND BORDERS. R. Fritz, I. Freeh, B. Koch, Chr. Ueffing
  • A LOCAL CORRELATION APPROACH FOR THE FUSION OF REMOTE SENSING DATA WITH DIFFERENT SPATIAL RESOLUTIONS IN FORESTRY APPLICATIONS. J. Hill, C. Diemer, O. Stöver, Th. Udelhoven
  • OBJECT-BASED CLASSIFICATION AND APPLICATIONS IN THE ALPINE FOREST ENVIRONMENT. R. de Kok, T. Schneider, U. Ammer
  • Author Index
  • Keyword Index
  • Cover

Full text

International Archives of Photogrammetry and Remote Sensing, Vol. 32, Part 7-4-3 W6, Valladolid, Spain, 3-4 June, 1999
KNOWLEDGE BASED INTERPRETATION OF MULTISENSOR AND MULTITEMPORAL REMOTE SENSING IMAGES
Stefan Growe 
Institute of Communication Theory and Signal Processing, University of Hannover 
Appelstrasse 9a, D-30167 Hannover, Germany 
WWW: http://www.tnt.uni-hannover.de/~growe 
E-mail: growe@tnt.uni-hannover.de 
KEY WORDS: Knowledge Based Image Interpretation, Semantic Net, Sensor Fusion, Multitemporal Image Analysis. 
ABSTRACT 
The increasing amount of remotely sensed imagery from multiple platforms requires efficient analysis techniques. The leading idea of the presented work is to automate the interpretation of multisensor and multitemporal remote sensing images by the use of common prior knowledge about landscape scenes. The presented system is able to use specific map knowledge of a geoinformation system (GIS), information about sensor projections, and temporal changes of scene objects. The prior knowledge is represented explicitly by a semantic net. A common concept has been developed to distinguish within the knowledge base between the semantics of objects and their visual appearance in the different sensors, considering the physical principle of the sensor and the material and surface properties of the objects. This paper presents the basic structure of the system and its use for sensor fusion on different structural and functional levels. Results are shown for the extraction of roads from multisensor images. The approach for the analysis of multitemporal images is illustrated for the interpretation of an industrial fairground.
KURZFASSUNG 
To cope with the ever-growing amount of remote sensing imagery, efficient analysis methods are increasingly required. The core idea of the present work is to automate the interpretation of multisensor and multitemporal aerial images by using prior knowledge about the landscape objects. The presented system is able to use specific map knowledge from a geoinformation system, information about sensor projections, and information about temporal changes of the scene objects for the interpretation. The prior knowledge is stored explicitly in a semantic net. A general concept has been developed to distinguish within the knowledge base between object semantics and visual appearance in the different sensors, taking into account both the physical principle of the sensor and the material and surface properties of the objects. This contribution explains the basic structure of the system and its use for sensor fusion on different structural and functional levels. Results are presented for the extraction of roads from multisensor images as an example. Furthermore, an approach for the analysis of multitemporal images is presented and illustrated with the interpretation of an industrial fairground.
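To make the separation between object semantics and sensor-specific appearance described in the abstract more tangible, the following sketch shows one way such a knowledge base could be organized. It is a minimal illustration under assumed names (the Concept class, the relation labels, the example appearances); it is not the data structure of the system described in the paper.

# Minimal sketch of a semantic net that keeps object semantics separate from
# sensor-specific appearance. All names are illustrative assumptions, not the
# original knowledge base of the described system.

class Concept:
    def __init__(self, name, layer):
        self.name = name        # e.g. "Road"
        self.layer = layer      # "semantic" (what it is) or "sensor" (how it looks)
        self.relations = []     # list of (relation_label, target Concept)

    def add_relation(self, label, target):
        self.relations.append((label, target))


# Semantic layer: sensor-independent meaning of the object.
road = Concept("Road", layer="semantic")

# Sensor layer: appearance depends on the sensor's physical principle and on
# the object's material and surface properties (e.g. smooth asphalt reflects
# the radar signal away from the antenna, so roads tend to appear dark in SAR).
road_in_optical = Concept("ElongatedHomogeneousRegion", layer="sensor")
road_in_sar = Concept("DarkLine", layer="sensor")

road.add_relation("appears_in_optical_as", road_in_optical)
road.add_relation("appears_in_sar_as", road_in_sar)

# Only sensor-layer concepts are compared with segmentation results; the
# semantic layer stays common to all sensors.
for label, target in road.relations:
    print(f"{road.name} --{label}--> {target.name}")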
1. INTRODUCTION 
The automatic extraction of objects from aerial images for map updating and environmental monitoring represents a major topic of remote sensing. However, the results of low-level image processing algorithms like edge detectors are in general incomplete, fragmented, and erroneous. To overcome these problems, a scene interpretation is performed which assigns object semantics to the features segmented in the remote sensing image. Prior knowledge about the objects should be used to constrain the object parameters and to reduce the uncertainty of the interpretation. To increase or decrease the reliability of competing interpretations, structural relationships between the objects can be exploited.
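As a rough, hypothetical illustration of how prior knowledge about object parameters and structural relationships could raise or lower the reliability of competing interpretations, consider the sketch below; the numeric scores and rules are invented for the example and do not come from the paper.

# Hypothetical scoring of competing interpretations for one segmented feature.
# Prior knowledge constrains object parameters; structural relations to
# neighbouring features increase or decrease the reliability of a hypothesis.

def score_hypothesis(label, feature, neighbours):
    """Return a confidence in [0, 1] for interpreting `feature` as `label`."""
    score = 0.5  # neutral prior

    # Prior knowledge: a road has a limited width (values assumed for illustration).
    if label == "road" and not (2.0 <= feature["width_m"] <= 30.0):
        score -= 0.3

    # Structural relation: a road segment connected to other road segments
    # becomes a more reliable road hypothesis.
    if label == "road" and any(n["label"] == "road" for n in neighbours):
        score += 0.3

    return max(0.0, min(1.0, score))


feature = {"width_m": 8.0}
neighbours = [{"label": "road"}]
for label in ("road", "river"):
    print(label, score_hypothesis(label, feature, neighbours))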
For remote sensing, different sensors such as optical, thermal, and radar (SAR) have been developed which collect different image data of the observed scene. The wish to extract more information from the data than is possible with a single sensor system alone raises the question of sensor fusion. Several parameters influence the data fusion: the different platform locations, the different spectral bands (optical, thermal, or microwave), the sensing geometry (e.g. perspective projection or SAR geometry), the spatial resolution, and the season at image acquisition. State-of-the-art systems must be able to combine information from different sensors.
A partial interpretation already exists for most landscapes: the map corresponding to the observed scene. Due to the growing availability of geographic information systems (GIS), the map data can be accessed by computers directly and is therefore usable for the automatic interpretation of aerial images.
Especially for environmental monitoring, it is necessary to investigate images from different acquisition times to study the development of the observation area. The quality of a scene
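To make the use of GIS map data as a partial interpretation concrete, the sketch below shows one plausible way of seeding object hypotheses by matching segmented image features to nearby map objects. The function names, data layout, and distance threshold are assumptions for illustration only, not the method of the paper.

# Hypothetical sketch: seed object hypotheses for segmented image features
# from GIS map objects, treating the map as a partial interpretation that
# may be outdated for some features.

def distance(p, q):
    """Euclidean distance between two (x, y) points in map coordinates."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5


def seed_labels_from_map(features, map_objects, max_dist_m=15.0):
    """Attach the class of the nearest map object to each feature, if close enough."""
    seeded = []
    for feat in features:
        nearest = min(map_objects,
                      key=lambda obj: distance(feat["centroid"], obj["centroid"]),
                      default=None)
        if nearest and distance(feat["centroid"], nearest["centroid"]) <= max_dist_m:
            seeded.append({**feat, "tentative_label": nearest["class"]})
        else:
            seeded.append({**feat, "tentative_label": None})  # no map support
    return seeded


features = [{"centroid": (100.0, 200.0)}, {"centroid": (400.0, 120.0)}]
map_objects = [{"centroid": (105.0, 198.0), "class": "road"}]
print(seed_labels_from_map(features, map_objects))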
	        
