
Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects

Access restriction

There is no access restriction for this record.

Copyright

CC BY: Attribution 4.0 International.

Bibliographic data


Monograph

Persistent identifier:
856473650
Author:
Baltsavias, Emmanuel P.
Title:
Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects
Sub title:
Joint ISPRS/EARSeL Workshop ; 3 - 4 June 1999, Valladolid, Spain
Scope:
III, 209 Seiten
Year of publication:
1999
Place of publication:
Coventry
Publisher of the original:
RICS Books
Identifier (digital):
856473650
Illustration:
Illustrationen, Diagramme, Karten
Language:
English
Usage licence:
Attribution 4.0 International (CC BY 4.0)
Publisher of the digital copy:
Technische Informationsbibliothek Hannover
Place of publication of the digital copy:
Hannover
Year of publication of the digital copy:
2016
Document type:
Monograph
Collection:
Earth sciences

Chapter

Title:
TECHNICAL SESSION 6 INTEGRATION OF IMAGE ANALYSIS AND GIS
Document type:
Monograph
Structure type:
Chapter

Chapter

Title:
INVESTIGATION OF SYNERGY EFFECTS BETWEEN SATELLITE IMAGERY AND DIGITAL TOPOGRAPHIC DATABASES BY USING INTEGRATED KNOWLEDGE PROCESSING. Dietmar Kunz
Document type:
Monograph
Structure type:
Chapter

Contents

Table of contents

  • Fusion of sensor data, knowledge sources and algorithms for extraction and classification of topographic objects
  • Cover
  • ColorChart
  • Title page
  • CONTENTS
  • PREFACE
  • TECHNICAL SESSION 1 OVERVIEW OF IMAGE / DATA / INFORMATION FUSION AND INTEGRATION
  • DEFINITIONS AND TERMS OF REFERENCE IN DATA FUSION. L. Wald
  • TOOLS AND METHODS FOR FUSION OF IMAGES OF DIFFERENT SPATIAL RESOLUTION. C. Pohl
  • INTEGRATION OF IMAGE ANALYSIS AND GIS. Emmanuel Baltsavias, Michael Hahn,
  • TECHNICAL SESSION 2 PREREQUISITES FOR FUSION / INTEGRATION: IMAGE TO IMAGE / MAP REGISTRATION
  • GEOCODING AND COREGISTRATION OF MULTISENSOR AND MULTITEMPORAL REMOTE SENSING IMAGES. Hannes Raggam, Mathias Schardt and Heinz Gallaun
  • GEORIS : A TOOL TO OVERLAY PRECISELY DIGITAL IMAGERY. Ph.Garnesson, D.Bruckert
  • AUTOMATED PROCEDURES FOR MULTISENSOR REGISTRATION AND ORTHORECTIFICATION OF SATELLITE IMAGES. Ian Dowman and Paul Dare
  • TECHNICAL SESSION 3 OBJECT AND IMAGE CLASSIFICATION
  • LANDCOVER MAPPING BY INTERRELATED SEGMENTATION AND CLASSIFICATION OF SATELLITE IMAGES. W. Schneider, J. Steinwendner
  • INCLUSION OF MULTISPECTRAL DATA INTO OBJECT RECOGNITION. Bea Csathó, Toni Schenk, Dong-Cheon Lee and Sagi Filin
  • SCALE CHARACTERISTICS OF LOCAL AUTOCOVARIANCES FOR TEXTURE SEGMENTATION. Annett Faber, Wolfgang Förstner
  • BAYESIAN METHODS: APPLICATIONS IN INFORMATION AGGREGATION AND IMAGE DATA MINING. Mihai Datcu and Klaus Seidel
  • TECHNICAL SESSION 4 FUSION OF SENSOR-DERIVED PRODUCTS
  • AUTOMATIC CLASSIFICATION OF URBAN ENVIRONMENTS FOR DATABASE REVISION USING LIDAR AND COLOR AERIAL IMAGERY. N. Haala, V. Walter
  • STRATEGIES AND METHODS FOR THE FUSION OF DIGITAL ELEVATION MODELS FROM OPTICAL AND SAR DATA. M. Honikel
  • INTEGRATION OF DTMS USING WAVELETS. M. Hahn, F. Samadzadegan
  • ANISOTROPY INFORMATION FROM MOMS-02/PRIRODA STEREO DATASETS - AN ADDITIONAL PHYSICAL PARAMETER FOR LAND SURFACE CHARACTERISATION. Th. Schneider, I. Manakos, Peter Reinartz, R. Müller
  • TECHNICAL SESSION 5 FUSION OF VARIABLE SPATIAL / SPECTRAL RESOLUTION IMAGES
  • ADAPTIVE FUSION OF MULTISOURCE RASTER DATA APPLYING FILTER TECHNIQUES. K. Steinnocher
  • FUSION OF 18 m MOMS-2P AND 30 m LANDSAT TM MULTISPECTRAL DATA BY THE GENERALIZED LAPLACIAN PYRAMID. Bruno Aiazzi, Luciano Alparone, Stefano Baronti, Ivan Pippi
  • OPERATIONAL APPLICATIONS OF MULTI-SENSOR IMAGE FUSION. C. Pohl, H. Touron
  • TECHNICAL SESSION 6 INTEGRATION OF IMAGE ANALYSIS AND GIS
  • KNOWLEDGE BASED INTERPRETATION OF MULTISENSOR AND MULTITEMPORAL REMOTE SENSING IMAGES. Stefan Growe
  • AUTOMATIC RECONSTRUCTION OF ROOFS FROM MAPS AND ELEVATION DATA. U. Stilla, K. Jurkiewicz
  • INVESTIGATION OF SYNERGY EFFECTS BETWEEN SATELLITE IMAGERY AND DIGITAL TOPOGRAPHIC DATABASES BY USING INTEGRATED KNOWLEDGE PROCESSING. Dietmar Kunz
  • INTERACTIVE SESSION 1 IMAGE CLASSIFICATION
  • AN AUTOMATED APPROACH FOR TRAINING DATA SELECTION WITHIN AN INTEGRATED GIS AND REMOTE SENSING ENVIRONMENT FOR MONITORING TEMPORAL CHANGES. Ulrich Rhein
  • CLASSIFICATION OF SETTLEMENT STRUCTURES USING MORPHOLOGICAL AND SPECTRAL FEATURES IN FUSED HIGH RESOLUTION SATELLITE IMAGES (IRS-1C). Maik Netzband, Gotthard Meinel, Regin Lippold
  • ASSESSMENT OF NOISE VARIANCE AND INFORMATION CONTENT OF MULTI-/HYPER-SPECTRAL IMAGERY. Bruno Aiazzi, Luciano Alparone, Alessandro Barducci, Stefano Baronti, Ivan Pippi
  • COMBINING SPECTRAL AND TEXTURAL FEATURES FOR MULTISPECTRAL IMAGE CLASSIFICATION WITH ARTIFICIAL NEURAL NETWORKS. H. He, C. Collet
  • TECHNICAL SESSION 7 APPLICATIONS IN FORESTRY
  • SENSOR FUSED IMAGES FOR VISUAL INTERPRETATION OF FOREST STAND BORDERS. R. Fritz, I. Freeh, B. Koch, Chr. Ueffing
  • A LOCAL CORRELATION APPROACH FOR THE FUSION OF REMOTE SENSING DATA WITH DIFFERENT SPATIAL RESOLUTIONS IN FORESTRY APPLICATIONS. J. Hill, C. Diemer, O. Stöver, Th. Udelhoven
  • OBJECT-BASED CLASSIFICATION AND APPLICATIONS IN THE ALPINE FOREST ENVIRONMENT. R. de Kok, T. Schneider, U. Ammer
  • Author Index
  • Keyword Index
  • Cover

Full text

International Archives of Photogrammetry and Remote Sensing, Vol. 32, Part 7-4-3 W6, Valladolid, Spain, 3-4 June, 1999 
Therefore, a model-driven top-down approach can be integrated into the common data-driven bottom-up process of satellite image analysis. Figure 1 shows the flowchart of the analysis process. Common satellite image analysis techniques are restricted to pixel-based classification, use only one feature (the spectral signature) and rely on the interactive selection of training areas. With this new approach these restrictions are avoided.
Image and topographic databases describe the scene from their respective perspectives. Using the a priori semantic information of the topographic objects in the map, training areas are selected automatically by overlaying the image with the topographic objects. Geometric errors between the image and map objects are compensated by the large number of training areas and by a histogram analysis. Learning the typical features of the object classes is necessary for the later classification step.
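The automated selection of training areas by map overlay can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the DLM polygons have already been rasterized into a per-pixel class mask aligned with the geocoded image, and the function and variable names are invented for the example.

```python
import numpy as np

def collect_training_samples(band, class_mask):
    """For each object class, gather the reflectance values of all pixels
    covered by that class's rasterized topographic (DLM) polygons.

    band       -- 2-D array of reflectance values for one spectral band
    class_mask -- 2-D integer array of the same shape; each pixel holds the
                  class id of the overlaid map object (0 = no object)
    """
    samples = {}
    for cls in np.unique(class_mask):
        if cls == 0:                      # background: no DLM object here
            continue
        samples[int(cls)] = band[class_mask == cls]
    return samples

# toy 4x4 scene with two overlaid classes (e.g. 1 = water, 2 = settlement)
band = np.array([[10, 11, 50, 52],
                 [12, 10, 51, 49],
                 [10, 12, 48, 50],
                 [11, 10, 52, 51]])
mask = np.array([[1, 1, 2, 2],
                 [1, 1, 2, 2],
                 [1, 1, 2, 2],
                 [1, 1, 2, 2]])
samples = collect_training_samples(band, mask)
```

Because every map object of a class contributes pixels, the sample per class is far larger than interactively digitized training areas would be, which is what makes the robust estimation described below possible.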
In the following step, the semantic modelling of the topographic objects, both symbolic scene descriptions are linked and an unambiguous scene description with disjoint objects is built. Knowledge-based techniques are applied to classify the resulting geometrically disjoint objects. The result of the classification process is a complete semantic scene description.
This paper does not deal with the last step, updating the digital database by comparing it with the semantic scene description.
2. KNOWLEDGE BASED FEATURE EXTRACTION AND SEGMENTATION
The common features in satellite image analysis, i.e. the spectral signatures (mean values, standard deviations), have proven to be insufficient for high-quality results (Bähr and Vögtle, 1991; Vögtle and Schilling, 1995). These features therefore have to be extended by spectral as well as non-spectral parameters that improve the distinction between the defined object classes. Thus, geometrical and structural features are also taken into account (Table 1):
Spectral features:
  • Spectral Signature
  • Texture
Non-spectral features:
  • Structure
  • Size
  • Shape/Contour
  • Neighbourhood Relations
Table 1. Selected features for image analysis.
The automated extraction of the features defined above is based on the a priori knowledge represented in the topographic database ATKIS-DLM200, which offers both a (possibly not up-to-date) geometric and a semantic description of the landuse objects to be extracted from satellite images. In contrast to the commonly used method, where a human operator has to define some representative training areas interactively, based on experience and intuition, all DLM objects of the same class within the geocoded satellite image can now be used as training areas without human interaction. A very large sample is therefore available, and a robust estimation of the defined features is performed to exclude disturbances caused by errors in the topographic database or in the image information, e.g. the out-of-date status of some polygons (contour lines), digitizing errors, or errors in the geometric correction of the satellite image. For the robust estimation it is assumed that, for each class in the image, more than 50% of the underlying DLM object area belongs to the DLM class, a condition which is fulfilled in most cases.
The feature extraction process in this project follows a hierarchical concept. The spectral characteristics of the objects are still among the most important features in satellite image analysis. To obtain a robust estimation of the spectral signature of each object class, only the representative reflectance values for that class are extracted. For relatively homogeneous objects, like 'water' or 'forest', statistical methods have proven sufficient, e.g. histogram analysis (extraction of the standard deviation) or a median estimation. Inhomogeneous objects, like 'settlement areas', cannot be treated in this way. Typically, these areas contain a strong mixture of different (sub-)objects (man-made objects, meadows, gardens, trees, water areas etc.) and therefore a wide range of reflectance values. Nevertheless, the accumulation of vegetation-free pixels caused by man-made objects (e.g. buildings and traffic areas) can be seen as representative for 'settlement'. With this knowledge, vegetation-free pixels can be extracted by means of the NDVI (Normalized Difference Vegetation Index):
NDVI = (IR - R) / (IR + R)
where IR is the reflectance in the near infrared and R the reflectance in the visible red.
In Fig. 2, the NDVI for different topographic classes is shown.
Fig. 2. Normalized Difference Vegetation Index (NDVI).
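The NDVI computation and its use for flagging vegetation-free pixels can be sketched directly from the formula. The threshold value below is illustrative; the paper does not state one.

```python
import numpy as np

def ndvi(ir, r):
    """Normalized Difference Vegetation Index: (IR - R) / (IR + R) per pixel."""
    ir = np.asarray(ir, dtype=float)
    r = np.asarray(r, dtype=float)
    return (ir - r) / (ir + r)

# Vegetation reflects strongly in the near infrared (NDVI near +1);
# buildings and traffic areas give values near zero or below.
# The 0.1 threshold here is an assumption for the example.
vegetation_free = ndvi(np.array([0.4, 0.1]), np.array([0.3, 0.3])) < 0.1
```

Accumulations of such vegetation-free pixels are then taken as representative for the 'settlement' class, as described above.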

Citation recommendation

Baltsavias, Emmanuel P. Fusion of Sensor Data, Knowledge Sources and Algorithms for Extraction and Classification of Topographic Objects. RICS Books, 1999.
