Proceedings of the Workshop on Mapping and Environmental Applications of GIS Data

  
discovered. While multisource spatial data can supply more information, they may also introduce noise, redundancy, and confusion. If artificial neural network paradigms with modular functions or competition mechanisms could be developed, the information processing for each data source could be decomposed and the contribution of each data set evaluated separately. The advantages of each data source could then be identified and employed efficiently in the classification. Reaching this goal requires further development of a modularized ANN (MANN). Based on the above discussion, the objectives of this research are to:
1. Develop a MANN to handle multispectral, multitemporal remote sensing data and spatial data from other sources;
2. Evaluate the capabilities of the MANN in multisource spatial data classification, and develop strategies for multisource spatial data analysis;
3. Apply the MANN to produce land cover classifications from multisource spatial data.
MODULAR ARTIFICIAL NEURAL NETWORK
In the mainstream of neural network research, artificial neural networks are viewed as black-box tools. Hrycej (1992) argued that this black-box state cannot be expected to persist as artificial neural networks grow and their applications become more complicated and difficult. A modularization of neural networks is therefore desirable. A modular ANN (MANN) decomposes a complex task into several relatively small and independent subtasks. A MANN consists of a group of local networks competing to learn different aspects of a problem; a gating network controls the competition and assigns different regions of the data space to different local networks. Each local network can be an individual backpropagation network. In the MANN architecture, illustrated in Figure 1, the gating network and the local networks are fully connected to the input layer. The number of output PEs of the gating network equals the number of local networks. The output values of the gating network are normalized to sum to 1.0, and each normalized gating output weights the output vector of the corresponding local network. The final output of the MANN is the sum of the weighted outputs of the local networks.
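The forward pass just described can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes one-hidden-layer local networks and a linear gating network with softmax normalization, and all names and sizes are illustrative.

```python
import numpy as np

def softmax(z):
    # Normalize the gating outputs so they sum to 1.0, as in the text.
    e = np.exp(z - z.max())
    return e / e.sum()

class LocalNet:
    """One local network: a small one-hidden-layer backpropagation network."""
    def __init__(self, n_in, n_hidden, n_out, rng):
        self.W1 = rng.standard_normal((n_hidden, n_in)) * 0.1
        self.W2 = rng.standard_normal((n_out, n_hidden)) * 0.1

    def forward(self, x):
        h = np.tanh(self.W1 @ x)   # hidden layer activation
        return self.W2 @ h         # output vector of this local network

class MANN:
    """Gating network weights and sums the local networks' outputs."""
    def __init__(self, n_in, n_hidden, n_out, n_local, rng):
        self.experts = [LocalNet(n_in, n_hidden, n_out, rng)
                        for _ in range(n_local)]
        # Gating net: one output PE per local network, fully connected
        # to the same input layer as the local networks.
        self.Wg = rng.standard_normal((n_local, n_in)) * 0.1

    def forward(self, x):
        g = softmax(self.Wg @ x)                           # gating weights
        ys = np.stack([m.forward(x) for m in self.experts])
        return g @ ys, g                                   # weighted sum

rng = np.random.default_rng(0)
net = MANN(n_in=4, n_hidden=8, n_out=3, n_local=2, rng=rng)
y, g = net.forward(np.array([0.2, -0.1, 0.5, 0.3]))
print(y.shape, g.sum())  # output vector shape and gating weights summing to 1
```

Note that the global output is a convex combination of the local networks' outputs, so when the gating network saturates toward a single local network, the MANN output reduces to that network's output.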
Local networks and the gating network are trained simultaneously. The learning rule is designed to encourage competition among the local networks: for a given input vector, the gating network chooses a single local network to handle the data. The input space is therefore partitioned into regions, with each local network taking responsibility for a different region. Training of the gating and local networks is achieved using backpropagation of error.

Figure 1. Architecture of MANN with two local networks.
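The competitive learning rule can be sketched as a single joint gradient step on the mixture-of-experts cost of Jacobs et al. (1991). This is an illustrative simplification, not the paper's implementation: the local networks here are linear for brevity (the text uses backpropagation networks), and `train_step`, `We`, and `Wg` are hypothetical names.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train_step(x, d, We, Wg, lr=0.1):
    """One gradient step on the competitive cost
    E = -log(sum_k g_k * exp(-0.5 * ||d - y_k||^2)).

    x  : input vector, d : target vector
    We : (n_local, n_out, n_in) weights of the linear local networks
    Wg : (n_local, n_in) weights of the linear gating network
    """
    g = softmax(Wg @ x)                     # gating weights, sum to 1
    ys = np.einsum('koi,i->ko', We, x)      # each local network's output
    err = d - ys                            # residual of each local network
    lik = np.exp(-0.5 * (err ** 2).sum(axis=1))
    h = g * lik
    h = h / h.sum()                         # posterior responsibility per network
    # Each local network moves toward the target in proportion to its
    # responsibility, so the best-matching network learns fastest (competition).
    We = We + lr * np.einsum('k,ko,i->koi', h, err, x)
    # The gating network learns to select networks that explain the data well.
    Wg = Wg + lr * np.outer(h - g, x)
    return We, Wg
```

Repeating this step drives the gating output toward the local network that fits a given region of the input space, which is how the input-space partition in the text emerges during training.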
general, any problem that can be solved with 
BPANN can be solved at least as well by a 
MANN (NeuralWare, 1993). However, since 
there are several networks learning 
simultaneously, MANN is computationally 
intensive. NeuralWare Inc. (1993) implemented a 
modular neural network paradigm based on the 
"Adaptive Mixtures of Local Experts" as 
proposed by Jacobs et al. (1991). According to NeuralWare (1993), […]

MANN […]

Study Area

The […]

Figure 2. […]

Multisource […]

The […] data from […] October 6, […]
	        