Figure 8. Comparison of endpoint-based and line-based matching: (b) results of endpoint matching; (c) results of line-based matching.
3.3 Comparison of Line-based and Edge-based Matching 
We also select a linear feature to compare the results of line-based and edge-based multiple matching. Line-based matching uses the whole line for matching, whereas edge-based matching divides a line into a set of points and matches each point individually. The red points in Figure 9(a) indicate the edge points, while the yellow line in Figure 9(b) is the line used for matching. The points and line extracted by edge-based matching are shown in Figure 9(c). A few outliers are caused by insufficient information in the matching window of edge-based matching; these outliers are removed after line fitting. Figure 9(d) shows the average NCC at different depths, and the highest average NCC reaches 0.9. Figure 9(e) overlays the two extracted 3D lines, and Table 2 compares their vertex coordinates. The maximum difference is about 4 cm. Finally, the 3D line is back-projected from object space to image space, as shown in Figure 9(f).
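For clarity, the depth search behind the AvgNCC curve in Figure 9(d) can be sketched as follows. This is only an illustrative outline, not the authors' implementation: images are plain NumPy arrays, and the `project` and `point_at_depth` callables (which would wrap the collinearity equations and the master-image viewing ray) are assumed placeholders.

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized cross-correlation (NCC) of two equally sized patches."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def avg_ncc(point_3d, master_img, slave_imgs, project, win=11):
    """Average NCC over all slave images for one object-space point.

    project(image, point_3d) is assumed to apply the collinearity
    equations with that image's orientation and return integer (row, col).
    """
    half = win // 2
    r0, c0 = project(master_img, point_3d)
    ref = master_img[r0 - half:r0 + half + 1, c0 - half:c0 + half + 1]
    scores = []
    for img in slave_imgs:
        r, c = project(img, point_3d)
        patch = img[r - half:r + half + 1, c - half:c + half + 1]
        scores.append(ncc(ref, patch))
    return float(np.mean(scores))

def best_depth(depths, point_at_depth, master_img, slave_imgs, project, win=11):
    """Sweep candidate depths along the master ray and keep the depth
    whose object-space point yields the highest average NCC."""
    scored = [(avg_ncc(point_at_depth(d), master_img, slave_imgs, project, win), d)
              for d in depths]
    return max(scored)  # (best AvgNCC, best depth)
```

Scanning the candidate depths and keeping the AvgNCC peak corresponds to the correlation curve plotted in Figure 9(d).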
Table 2. Vertices of the extracted 3D lines

                      X (m)        Y (m)       Z (m)
  Edge-based   P1   249570.52   2742218.16    101.33
               P2   249572.76   2742218.11    101.32
  Line-based   P1   249570.56   2742218.20    101.36
               P2   249572.80   2742218.15    101.32
  Difference   P1       -0.04        -0.04     -0.03
               P2       -0.04        -0.04      0.00
  
Figure 9. Comparison of edge-based and line-based matching: (a) edges; (b) line; (c) extracted points and line; (d) average NCC versus depth; (e) 3D lines extracted by edge-based (red) and line-based (green) matching; (f) back-projection of the extracted line.
  
4. CONCLUSIONS AND FUTURE WORK
In this research, we have proposed a feasible scheme for obtaining 3D linear features by object-based multiple-image matching. We have also demonstrated orientation modeling by SURF and bundle adjustment. A coarse building model is employed to correct the tilt displacement of the façade structure, which benefits the similarity measurement between images. Moreover, the multi-view images are considered simultaneously in the similarity measurement through the average NCC (AvgNCC). The AvgNCC is a useful index for locating the highest correlation among the master and slave images. The targets for line matching can be the endpoints of a line, the edge points, or the line itself. The experiment indicates that line-based matching outperforms point matching when a point is occluded by other objects.
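As a rough illustration of the tilt correction mentioned above, the façade plane of the coarse building model can be used to warp each image onto a fronto-parallel view before the NCC is computed. The sketch below is a simplified, hypothetical version of this step, not the authors' implementation; the corner ordering, output size, and the use of OpenCV are assumptions.

```python
import cv2
import numpy as np

def rectify_facade(image, facade_corners_px, out_size=(400, 300)):
    """Warp the image quadrilateral covering the facade (the coarse
    building model projected into the image) to a fronto-parallel
    rectangle, removing the tilt displacement before NCC comparison."""
    w, h = out_size
    src = np.float32(facade_corners_px)                    # 4 corners: TL, TR, BR, BL
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    H = cv2.getPerspectiveTransform(src, dst)              # plane-induced homography
    return cv2.warpPerspective(image, H, (w, h))
```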
Future work will focus on the processing of highly similar repeated textures. Because the tiles of a façade usually have high similarity, we will combine geometric, radiometric, and parametric constraints for multiple-image matching.
ACKNOWLEDGEMENTS 
This investigation was partially supported by the National Science Council of Taiwan under project number NSC 100-2221-E-009-133.