OBJECT DETECTION USING NEURAL SELF-ORGANIZATION 
Arpad Barsi 
Department of Photogrammetry and Geoinformatics 
Budapest University of Technology and Economics 
H-1111 Budapest, Muegyetem rkp. 3, Hungary 
barsi@eik.bme.hu 
Commission III, Working Group III/4 
KEY WORDS: Neural networks, Object detection, Modeling, Data structure 
ABSTRACT: 
The paper presents a novel artificial neural network type, which is based on the learning rule of the Kohonen-type SOM model. The
developed Self-Organizing Neuron Graph (SONG) has a flexible graph structure, in contrast to the fixed SOM neuron grid, together
with an appropriate training algorithm. The number and structure of the neurons express the preliminary human knowledge about the
object to be detected, which can be checked during the computations. The inputs of the neuron graph are the coordinates of image
pixels derived by different image processing operators, from segmentation to classification. The newly developed tool has been
applied in several types of image analysis tasks: from detecting building structures in high-resolution satellite images via template
matching to the extraction of road network segments in aerial imagery. The presented results prove that the developed neural network
algorithm is highly capable of analyzing photogrammetric and remotely sensed data.
1. INTRODUCTION 
Artificial neural networks have a quite long history. The story
started with the work of W. McCulloch and W. Pitts in 1943
(Rojas 1993). Their paper presented the first artificial
computing model after the discovery of the biological neuron
cell in the early years of the twentieth century. The McCulloch-
Pitts paper was followed by the publication of F. Rosenblatt
in 1958, in which he focused on the mathematics of the new
discipline (Rosenblatt 1958). His perceptron model was
extended by two famous scientists in 1969: M. Minsky and S.
Papert.
The year 1961 brought the description of competitive learning
and the learning matrix by K. Steinbuch (Carpenter 1989). He
published the "winner-takes-all" rule, which is also widely used
in modern systems. C. von der Malsburg wrote a paper about
biological self-organization with strong mathematical
connections (Malsburg 1973). The best-known scientist is T.
Kohonen, who published several books on the instar and
outstar learning methods, associative and correlation matrix
memories, and, of course, self-organizing (feature) maps
(SOFM or SOM) (Kohonen 1972; Kohonen 1984; Kohonen
2001). This neuron model has had a great impact on the whole
spectrum of informatics: from linguistic applications to
data mining.
Kohonen's neuron model is commonly used in different
classification applications, such as the unsupervised clustering
of remotely sensed images. The paper of H.C. Sim and R.I.
Damper demonstrates how the SOM model suits object
matching purposes with images of tools under translation-,
rotation- and scale-invariant circumstances (Sim 1997).
The goal of automatic road detection is very clear in the paper
of R. Ruskoné et al., who apply a two-level processing
technique combining road segment extraction and a
production net (Ruskoné 1997). A. Baumgartner et al. describe
a context-based automatic technique for road extraction
(Baumgartner 1997), while S. Hinz and his colleagues
developed a road extractor for urban areas (Hinz 2001). The
research of P. Doucette et al. focuses on simulated linear
features and the detection of paved roads in classified
HYDICE hyperspectral images with the use of Kohonen's SOM
method (Doucette 1999; Doucette 2001).
2. SELF-ORGANIZING NEURAL NETWORKS
The self-organizing feature map (SOFM) or self-organizing
map (SOM) model is based on the unsupervised learning of
neurons organized in a regular lattice structure. The topology of
the lattice is triangular, rectangular or hexagonal. The
competitive neurons have a position (weight) vector of
dimension n:
$\mathbf{m}_i = \left[ \mu_{i1}, \mu_{i2}, \dots, \mu_{in} \right]^T \in \mathbb{R}^n$ (1)
Furthermore, the input data points have a similar coordinate
vector:

$\mathbf{x} = \left[ \xi_1, \xi_2, \dots, \xi_n \right]^T \in \mathbb{R}^n$ (2)
The learning algorithm consists of two blocks: the first is the
rough weight modification, called ordering; the second is
the fine setting, called tuning. The iterative algorithm starts
with the neuron competition: a winner neuron must be found by
the evaluation of the following formula:

$c = \arg\min_i \left\{ \lVert \mathbf{x} - \mathbf{m}_i \rVert \right\}$ (3)

where $i = 1, \dots, q \in \mathbb{N}$, having $q$ neurons, and $c \in \mathbb{N}$.
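As an illustrative sketch only (not the author's code; the names `weights` and `find_winner` are assumed), the competition step of equation (3) can be written in Python as:

```python
import numpy as np

def find_winner(weights: np.ndarray, x: np.ndarray) -> int:
    """Return the index c of the winner neuron for input x.

    weights : (q, n) array, one n-dimensional position vector m_i per neuron
    x       : (n,) input coordinate vector
    """
    # Evaluate ||x - m_i|| for every neuron and take the argmin (Eq. 3).
    distances = np.linalg.norm(weights - x, axis=1)
    return int(np.argmin(distances))
```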
The sequential training is organized in epochs. In every step the
weight vectors are updated as

$\mathbf{m}_i(t+1) = \mathbf{m}_i(t) + h_{ci}(t) \left[ \mathbf{x}(t) - \mathbf{m}_i(t) \right]$ (4)

where the neighborhood function can be written as follows:

$h_{ci}(t) = \alpha(t) \cdot \exp\left( -\frac{\lVert \mathbf{r}_c - \mathbf{r}_i \rVert^2}{2 \sigma^2(t)} \right)$ (5)

This function has two widely used interpretations, namely the
Gaussian and the rectangular ("bubble") neighborhood; both are
acceptable in practice. The learning rate $\alpha(t)$ decreases
monotonically during the iterations, and $\lVert \mathbf{r}_c - \mathbf{r}_i \rVert$ is the distance
between the winner neuron $c$ and neuron $i$.
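To make the update rule concrete, here is a minimal Python sketch of one iteration with a Gaussian neighborhood (an illustration under the stated notation; `update_step` and its parameter names are assumptions, not code from the paper; the neuron-to-neuron distances are taken from a precomputed matrix so that the same step serves both a regular lattice and a graph):

```python
import numpy as np

def update_step(weights, x, graph_dist, c, alpha, sigma):
    """One sequential update m_i(t+1) = m_i(t) + h_ci(t) [x(t) - m_i(t)].

    weights    : (q, n) neuron position vectors
    x          : (n,) current input vector
    graph_dist : (q, q) neuron-to-neuron distances (lattice or graph)
    c          : index of the winner neuron
    alpha      : learning rate alpha(t), decreasing over the epochs
    sigma      : neighborhood radius sigma(t), decreasing over the epochs
    """
    # Gaussian neighborhood function h_ci(t) centered on the winner c (Eq. 5)
    h = alpha * np.exp(-graph_dist[c] ** 2 / (2.0 * sigma ** 2))
    # Move every neuron toward the input, weighted by its neighborhood value
    return weights + h[:, None] * (x - weights)
```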
As the SOM model works with a fixed neuron lattice, which was
considered quite limiting for object detection, the basic idea was
to integrate a flexible graph structure into the newly developed
model. The model elements ensure that the learning follows the
Kohonen-type rule, while the neighborhood is derived from the
graph.
The connections of the neurons are described by an adjacency
matrix. This representation stores the direct neighborhood
relations. The distances between the neurons, which are needed
in order to evaluate the graph neighborhood, are implemented
via the matrix A: zero elements mean that there is no direct edge
between two neurons, the other elements mark the directly
connected pairs. The complete distance matrix is then derived by
the Floyd algorithm. The graph distances obtained this way
control the learning of the neuron graph.
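A minimal sketch of this distance computation, assuming a binary adjacency matrix with unit edge lengths (the function name `graph_distances` is hypothetical):

```python
import numpy as np

def graph_distances(adjacency: np.ndarray) -> np.ndarray:
    """All-pairs shortest path lengths by the Floyd(-Warshall) algorithm.

    adjacency : (q, q) matrix; nonzero entries mark directly connected
                neurons, zeros mean no edge.
    """
    q = adjacency.shape[0]
    dist = np.where(adjacency > 0, 1.0, np.inf)  # one step per direct edge
    np.fill_diagonal(dist, 0.0)                  # zero distance to itself
    for k in range(q):                           # relax paths through node k
        dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])
    return dist
```

The resulting matrix can serve as the `graph_dist` argument of the update step sketched above, so the neighborhood of a winner neuron spreads along the graph edges rather than a regular grid.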