A MODULAR NEURAL ARCHITECTURE FOR IMAGE CLASSIFICATION 
USING KOHONEN FEATURE EXTRACTION 
Marcio L. Gonçalves¹
Marcio L. de Andrade Netto²
Jurandir Zullo Júnior³
¹ DCA/FEE/UNICAMP - Cidade Universitária “Zeferino Vaz” C.P. 6101 13081-970 Campinas - SP - Brasil
Email: mleandro@dca.fee.unicamp.br
² DCA/FEE/UNICAMP Email: marcio@dca.fee.unicamp.br
³ CEPAGRI/UNICAMP Email: jurandir@cpa.unicamp.br
Commission II, Working Group 3 
KEY WORDS: Classification, Extraction, Algorithms, Neural, Performance, Image, Multispectral 
ABSTRACT 
This work presents an architecture for Remote Sensing (RS) multispectral image classification based on Artificial Neural Networks (ANN), aiming at two objectives: the search for techniques that improve performance in the classification task, and the exploitation of the advantages of unsupervised learning for feature extraction. The architecture is divided into two modules: feature extraction by the Kohonen Self-Organizing Map (SOM) and classification by a Multilayer Perceptron (MLP) network, trained by a learning algorithm that uses exactly calculated second-order information. To evaluate the efficiency of this classification scheme, a comparative analysis with the maximum likelihood algorithm, conventionally used for RS multispectral image classification, is carried out.
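As a rough illustration of the conventional baseline named above, the following is a minimal sketch of a Gaussian maximum likelihood classifier for multispectral pixels, written in Python/NumPy. It is not the implementation used in the paper; the band count, class labels and synthetic data are purely illustrative assumptions.

# Minimal Gaussian maximum likelihood classifier sketch (illustrative only).
import numpy as np

def fit_ml_classifier(X, y):
    """Estimate a per-class mean vector and covariance matrix from training pixels."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False))
    return stats

def predict_ml(X, stats):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    classes = sorted(stats)
    scores = []
    for c in classes:
        mu, cov = stats[c]
        diff = X - mu
        inv = np.linalg.inv(cov)
        # Log-likelihood up to a constant: -0.5 * (log|cov| + Mahalanobis distance)
        mahal = np.einsum("ij,jk,ik->i", diff, inv, diff)
        scores.append(-0.5 * (np.linalg.slogdet(cov)[1] + mahal))
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]

# Illustrative usage on synthetic 4-band pixels belonging to two classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (500, 4)), rng.normal(2, 1, (500, 4))])
y = np.repeat([0, 1], 500)
labels = predict_ml(X, fit_ml_classifier(X, y))
print("agreement with reference labels:", (labels == y).mean())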
KURZFASSUNG 
This work presents a scheme for the classification of multispectral remote sensing images based on artificial neural networks. The objectives were the following: the investigation of methods for increasing classification performance, and the exploitation of the advantages of unsupervised learning for feature extraction. The scheme is divided into two phases: feature extraction by Kohonen's self-organizing map and classification by a multilayer perceptron network with a learning algorithm that uses exactly calculated second-order information. To evaluate the efficiency of the proposed classification scheme, a comparison with the statistical maximum likelihood algorithm was carried out.
1. INTRODUCTION

Since the resurgence of interest in the mid-eighties, Artificial Neural Networks (ANN) have shown their efficacy and versatility in a wide range of applications. In particular, successful applications in Remote Sensing (RS) image classification have already been reported (Benediktsson et al. 1990, Hepner et al. 1990, Kanellopoulos et al. 1992, Schlünzen 1993), showing superior results to those of conventional statistical classification approaches.

Among the advantages of ANN over conventional statistical methods one can point out that no a priori knowledge of the probabilistic data model is required, since ANN have the ability to learn the data distribution properties during the training phase (Benediktsson et al. 1990), as well as the ability to generalize and to incorporate nonstatistical information and knowledge that may be potentially valuable.

The most relevant restrictions on a broader utilization of ANN refer basically to their present performance limitations, due to the slow convergence of the standard backpropagation training algorithm (Rumelhart et al. 1986) that is normally used, and to the amount of adjustment required for the training parameters (Key et al. 1989, Benediktsson et al. 1990, Hepner et al. 1990, Liu et al. 1991, Kanellopoulos et al. 1992, Schlünzen 1993).

To improve the performance of ANN techniques this work investigates the following approaches: parallel implementation of the training algorithms in a multiprocessing environment and utilization of an advanced training algorithm.

Besides improving the performance in the classification task by these approaches, another objective pursued in this work is to exploit the ANN potential for the task of unsupervised feature extraction.

In this way a modular neural architecture (fig. 1) was proposed, dividing the classification problem into two phases: a module for feature extraction from the RS image by Kohonen's Self-Organizing Map (SOM) and a module for classification, using a Multi-Layer Perceptron (MLP) network, where the above ideas were tested.

Figure 1: Neural architecture for classification (original image → Feature Extraction (SOM) → Classification (MLP) → classified image).
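The following is a minimal sketch of how the two modules of Figure 1 could be chained, assuming NumPy for a toy SOM and scikit-learn's MLPClassifier as a stand-in for the MLP; the quasi-Newton 'lbfgs' solver only approximates the exact second-order training algorithm used in the paper, and the synthetic data, grid size and layer sizes are illustrative assumptions rather than values from the paper.

# Sketch of the SOM (feature extraction) + MLP (classification) pipeline.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_som(pixels, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a rectangular SOM on pixel spectra (n_pixels x n_bands)."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    weights = rng.uniform(pixels.min(), pixels.max(), (n_units, pixels.shape[1]))
    # Grid coordinates of each map unit, used by the neighbourhood function.
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    n_steps = epochs * len(pixels)
    for t, idx in enumerate(rng.integers(0, len(pixels), n_steps)):
        x = pixels[idx]
        lr = lr0 * np.exp(-t / n_steps)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / n_steps)  # shrinking neighbourhood radius
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights, coords

def som_features(pixels, weights, coords):
    """Map each pixel to the grid coordinates of its best-matching unit."""
    bmu = np.argmin(((pixels[:, None, :] - weights[None, :, :]) ** 2).sum(-1), axis=1)
    return coords[bmu]

# Illustrative usage on synthetic 4-band "pixels" with synthetic labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(3000, 4))           # stand-in for multispectral pixel vectors
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # stand-in for training-site labels

W, C = train_som(X)
F = som_features(X, W, C)                # output of the feature extraction module

mlp = MLPClassifier(hidden_layer_sizes=(16,), solver="lbfgs", max_iter=500)
mlp.fit(F, y)                            # classification module
print("training accuracy:", mlp.score(F, y))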
 
	        