Full text: XIXth congress (Part B3,2)

Juliang Shao 
  
actually a balancing procedure based on the disparity continuity constraint in local regions (Barnard and Thompson, 
1980), which has been adopted in numerous projects in computer vision. 
The constraints of segment matching in the case of two images are generally categorised into three: unitary, binary and 
'N-ary' (Jones, 1997). Binary constraints are the most commonly used in relaxation; such a relation can express 
neighbour compatibility, and it has been used in this way here. Fig. 6 shows the compatibility of the two segments 1 and 
2 with corresponding segments 1 and 2 having consistent disparities (i.e. d = d'). 
Fig. 6: Compatibility if d = d'. 
4.1 Initial Compatibility Measures 
Similarity measures are fundamental processes in image correspondence determination. Depending on requirements, 
which include geometric, radiometric and scalar invariance conditions, similarity measures may be formed from one or 
more of the following: the correlation coefficient, the absolute value of grey level differences, moments and structural 
information. In this paper, the correlation coefficient, which has radiometric and partial scale invariance, is employed. 
The geometric correction is carried out by using epipolar geometry. In the cross correlation, the correlation coefficient 
is a normalised covariance function, the value of which ranges from -1 to +1. This makes normalisation to a probability 
in the range 0 to 1 straightforward: 
P = (C + 1) / 2                                                                (2) 
Here, P is the initial probability and C the correlation value, which is the average of the two correlation coefficients 
derived from the two pairs of end points of the corresponding line segments. To pass the grey level constraint, the 
typical threshold requirement for the correlation coefficient is that the value is positive. 
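As a minimal sketch of this step, assuming the correlation is computed over grey-level windows around corresponding segment end points (the window contents and function name are illustrative, not from the paper), the mapping of Eq. 2 might look like:

```python
import numpy as np

def initial_probability(patch_a, patch_b):
    """Map a normalised correlation coefficient C in [-1, +1] to an
    initial matching probability P in [0, 1] via P = (C + 1) / 2 (Eq. 2).
    The patches are hypothetical grey-level windows around the end
    points of two candidate line segments."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a @ a) * (b @ b))
    c = float(a @ b) / denom if denom > 0 else 0.0
    return (c + 1.0) / 2.0
```

Identical windows give C = 1 and hence P = 1; perfectly anti-correlated windows give C = -1 and P = 0, so the positivity threshold on C corresponds to P > 0.5.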
4.2 Expressing the Compatibility 
The compatibility criterion in the adopted relaxation labelling is the local similarity disparity between matched features. 
However, the similarity disparity is usually based on the assumption that only limited rotation exists between the two 
images. This assumption can be validated via the rotation of a feature point according to epipolar geometry. 
As described above, a disparity difference between two neighbouring points is usually employed for a compatibility 
check. Any imaging scale variation between the images, however, can complicate the comparison. In this paper, scale 
compensation has been considered. By assuming a scale change of not more than, say, 25%, the improved local 
disparity consistency is expressed by 
|d' - (1 - s)d| < t                                                            (3) 
where d and d' are coordinate shifts of two pairs of corresponding points, as illustrated in Fig. 6; s is a disparity scale 
factor between the two images; and t is a threshold (typically 6 pixels) for the disparity difference. The coordinate shifts 
express the disparity for the two pairs; therefore d - d’ is the disparity difference in the horizontal direction. The 
measure can apply to any direction. The two neighbouring points are therefore compatible if Eq. 3 is satisfied. 
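A sketch of this compatibility test, with the default threshold of 6 pixels taken from the text but the function name and signature being illustrative assumptions:

```python
def disparities_compatible(d, d_prime, s=0.0, t=6.0):
    """Scale-compensated local disparity consistency check (Eq. 3):
    two neighbouring matches are compatible when |d' - (1 - s) * d| < t.
    d, d_prime: coordinate shifts of the two pairs of corresponding points;
    s: disparity scale factor between the images (0 means equal scale);
    t: threshold in pixels (6 by default, as suggested in the text)."""
    return abs(d_prime - (1.0 - s) * d) < t
```

With s = 0 this reduces to the plain disparity-difference test |d' - d| < t; a nonzero s absorbs a known scale change (up to the assumed 25%) before the comparison.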
Assuming a segment in the reference image and a candidate in the object image are pre-matched, the symbolic 
expression for the probability is P. As usual, the ambiguity is removed through local consistency checks in the 
neighbourhood. These might comprise similarity disparities via rotation and scale compensation as stated above. 
Locally mutual disparity consistency reinforces the probabilities from the compatible neighbouring points: 
Q = Σ P_l  (l ∈ R),  if point l satisfies the condition of local disparity consistency.     (4) 
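The reinforcement from compatible neighbours might be sketched as follows; the variable names and the list-based neighbourhood representation are assumptions, and the renormalisation across competing candidates used in relaxation labelling (Barnard and Thompson, 1980) is omitted for brevity:

```python
def support_from_neighbours(neighbour_probs, compatible):
    """Support term of the reinforcement step (Eq. 4): sum the
    probabilities P_l of the neighbouring matched points l in the
    neighbourhood R that satisfy the local disparity consistency
    condition of Eq. 3.
    neighbour_probs: probabilities P_l of the neighbouring matches;
    compatible: booleans, one per neighbour, from the Eq. 3 check."""
    return sum(p for p, ok in zip(neighbour_probs, compatible) if ok)
```

Matches with many consistent neighbours accumulate high support, while incompatible neighbours contribute nothing, so iterating this step drives the labelling towards a locally consistent disparity field.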
  
International Archives of Photogrammetry and Remote Sensing. Vol. XXXIII, Part B3. Amsterdam 2000. 841 