Figure 2: The Laws of perceptual grouping (proximity, similarity, continuity, closure, symmetry)
geometric relationships such as collinearity, parallelism, connectivity, and repetitive patterns in an otherwise randomly distributed set of image events.
WERTHEIMER (1923) proposed one of the earliest and perhaps most widely accepted sets of such laws, some of which can be roughly stated as follows (cf. Fig. 2):
• The Law of Proximity: the stimulus elements which are geometrically closer tend to be perceived as one entity.
• The Law of Similarity: the stimulus elements which have similar properties tend to be perceived as one entity.
• The Law of Good Continuity: the stimulus elements tend to form a group which minimizes a change or discontinuity.
• The Law of Closure: the stimulus elements tend to form complete figures which are a priori known.
• The Law of Symmetry: the stimulus elements tend to form complete figures which are symmetrical.
• The Law of Simplicity: the stimulus elements tend to form figures which require the least length for their description.
The laws of perceptual grouping provide a very important source of a priori knowledge for dealing with noisy, incomplete, and fragmentary image information, and they have therefore been widely used for a variety of vision tasks (MEDIONI et al., 1984; MOHAN et al., 1989; BOLDT et al., 1989; KHAN et al., 1992).
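As a rough illustration of how one of these laws might be operationalized (a sketch of ours, not a method from the cited works), the Law of Proximity can be approximated by single-linkage grouping of 2-D point stimuli under a distance threshold; the function name and threshold value below are hypothetical choices.

from math import hypot

def proximity_groups(points, max_dist=5.0):
    """Group 2-D points whose chained pairwise distance stays below max_dist."""
    groups = []
    for p in points:
        # Collect every existing group that already contains a point close to p.
        near = [g for g in groups
                if any(hypot(p[0] - q[0], p[1] - q[1]) <= max_dist for q in g)]
        if not near:
            groups.append([p])           # p starts a new group of its own
        else:
            merged = [p]                 # p links all reachable groups together
            for g in near:
                merged.extend(g)
                groups.remove(g)
            groups.append(merged)
    return groups

# Two point clusters separated by more than max_dist yield two groups.
pts = [(0, 0), (1, 1), (2, 0), (20, 20), (21, 19)]
print(proximity_groups(pts))

Points whose chained pairwise distance stays below the threshold fall into one group; in practice the threshold would have to be tied to image scale and resolution.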
Figure 3: The McCulloch-Pitts Neuron 
4 Neural Network Grouping 
Humans seem to integrate the laws of perceptual grouping for aggregating image data in order to discover significant image events and cues. The main question is how to implement this ability effectively and how to combine the results when different laws give different results. In this section we look at this issue based on neural network modeling.
A neural network is a computational model that is a directed graph composed of nodes (sometimes referred to as units or neurons) and connections between the nodes (cf. ZEIDENBERG, 1990). With each node is associated a number, referred to as the node's activation. Similarly, with each connection in the network a number is associated, called its weight. The three main issues in neural network research are network connection schemes, update rules, and learning rules. For different tasks one should use different network models.
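To make the definition concrete, the following is a minimal sketch of such a model under the simplest possible representation; the class and method names are illustrative assumptions, not a particular library's API.

class Node:
    """A network node carrying an activation value."""
    def __init__(self, name, activation=0.0):
        self.name = name
        self.activation = activation

class Network:
    """A directed graph of nodes with weighted connections."""
    def __init__(self):
        self.nodes = {}          # node name -> Node
        self.weights = {}        # (source name, target name) -> weight

    def add_node(self, name, activation=0.0):
        self.nodes[name] = Node(name, activation)

    def connect(self, src, dst, weight):
        self.weights[(src, dst)] = weight

    def weighted_input(self, dst):
        """Sum of incoming activations, each multiplied by its connection weight."""
        return sum(self.nodes[s].activation * w
                   for (s, d), w in self.weights.items() if d == dst)

net = Network()
for name in ("a", "b", "c"):
    net.add_node(name)
net.nodes["a"].activation = 1.0
net.nodes["b"].activation = 0.5
net.connect("a", "c", 0.8)
net.connect("b", "c", -0.4)
print(net.weighted_input("c"))   # 1.0 * 0.8 + 0.5 * (-0.4) = 0.6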
4.1 McCulloch-Pitts Neuron 
We begin with the McCulloch-Pitts neuron (cf. Fig. 3), which is a basic building element of many neural networks. As shown in Figure 3, the activity $x_j$ of a neuron is the sum of the inputs that arrive via weighted pathways. The input from a particular pathway is an incoming signal $S_i$ multiplied by the weight $w_{ij}$ of that pathway. These weighted inputs are summed independently:

$$x_j = \sum_i S_i w_{ij} + \mu_j = \mathbf{S}\,\mathbf{w}_j + \mu_j, \qquad (1)$$

where $\mu_j$ is a bias term, which is formally equivalent to the negative of a threshold of the outgoing signal function. The outgoing signal $S_j = f(x_j)$ is typically a nonlinear function (binary, sigmoid, or threshold-
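The following is a minimal sketch of equation (1), assuming a sigmoid signal function (one of the options named above); the function and parameter names are illustrative, not taken from the paper.

import math

def sigmoid(x):
    """One choice of nonlinear signal function."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(signals, weights, bias, f=sigmoid):
    """Return S_j = f(x_j) with x_j = sum_i S_i * w_ij + mu_j (equation (1))."""
    x_j = sum(s * w for s, w in zip(signals, weights)) + bias
    return f(x_j)

# Two incoming signals, their pathway weights, and a bias (negative threshold).
print(neuron_output([1.0, 0.0], [0.7, -0.3], bias=-0.2))   # sigmoid(0.5) ~ 0.62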
	        
Thank you.