4. EXPERIMENTS 
In order to test our approach we conducted a series of experiments. In this setup we used only a single planar object in the image. We created a simulated dataset for 16 different sensor positions; for each position we have the coordinates in real space and the prescribed rotation angles. Figure 11 shows a top view of the setup and how the scene is viewed from station 7, while Table 1 gives the orientation values for each station in the local coordinate system. In this setup we only processed the facade of the building shown in bold.
  
1 
Led 
Station 1*7 
Station2 *^ 
Station 3 =~ 
Station 4 = 
Station 5 x 
Station 6 «^ 
Ci 
Station 7 
Stations ”  , 
Station 9 " 
Station 10 F ; 
Station 11 of l 
Station 12 i à | 
Station 13 Station 16 
Station 14 en 
Station 15 
Figure 11 Simulation dataset setup. 
Station      Xo (m)      Yo (m)      Zo (m)   ω (deg)   φ (deg)   κ (deg)
Station 1    -83.8507    -29.5928      4        -75         5        30
Station 2    -81.2074    -37.9129      5        -70        10        25
Station 3    -77.8689    -45.9727      6        -65        15        20
Station 4    -73.8407    -53.7109      7        -60        20        15
Station 5    -69.1533    -61.0685      8        -55        25        10
Station 6    -63.8426    -67.9896      6        -50        20         5
Station 7    -57.9488    -74.4216      5        -45        15         0
Station 8    -51.5169    -80.3153      4        -40        10        -5
Station 9    -44.5958    -85.6261      3        -35         5       -10
Station 10   -37.2381    -90.3134      2        -30         0       -15
Station 11   -29.5       -94.3417      1        -25        -5       -20
Station 12   -21.4402    -97.6801      0        -20       -10       -25
Station 13   -13.12     -100.3035      0        -15       -15       -30
Station 14    -4.603    -102.1916     -2        -10       -20        15
Station 15     4.013    -103.3303     -1         -5       -25        45
Station 16    12.7619   -103.7109      0          0         0         0

Table 1. Station orientation information.
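
To make the use of these exterior orientation values concrete, the short sketch below (not the authors' code) shows how a station from Table 1 could be represented as a projection centre plus a rotation matrix built from ω, φ, κ. The ω-φ-κ rotation order chosen here is one common photogrammetric convention and is an assumption, since the paper does not state which convention the simulation uses.

import numpy as np

def rotation_matrix(omega_deg, phi_deg, kappa_deg):
    # R = Rz(kappa) @ Ry(phi) @ Rx(omega); angles are given in degrees.
    w, p, k = np.radians([omega_deg, phi_deg, kappa_deg])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(w), -np.sin(w)],
                   [0.0, np.sin(w),  np.cos(w)]])
    Ry = np.array([[ np.cos(p), 0.0, np.sin(p)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(p), 0.0, np.cos(p)]])
    Rz = np.array([[np.cos(k), -np.sin(k), 0.0],
                   [np.sin(k),  np.cos(k), 0.0],
                   [0.0, 0.0, 1.0]])
    return Rz @ Ry @ Rx

# Station 1 from Table 1: projection centre and rotation angles.
Xo, Yo, Zo = -83.8507, -29.5928, 4.0
R1 = rotation_matrix(-75.0, 5.0, 30.0)

# Express an (arbitrary) object-space point in the camera frame of Station 1.
object_point = np.array([-60.0, -50.0, 10.0])
camera_point = R1.T @ (object_point - np.array([Xo, Yo, Zo]))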
In Table 2 we can see the results for the recovery of the rotation angles using the vanishing points approach. The angles ω and κ were recovered very accurately, while for the rotation φ the largest error was around 2.7 degrees, which is accurate enough for our applications. We also ran the full algorithm on a different dataset created using the same stations but with only the φ and κ rotation angles. The results are presented in Table 3: for a total traveled distance of 130 meters the errors are in the neighborhood of centimeters.
             Errors
Station      φ (deg)      ω (deg)      κ (deg)
Station 1    -0.054689     0.000071    -0.000001
Station 2    -0.283546    -0.000084    -0.000002
Station 3    -0.769276    -0.000021     0.000000
Station 4    -1.566688     0.000032     0.000005
Station 5    -2.689448    -0.000085     0.000001
Station 6    -1.763289    -0.000073     0.000003
Station 7    -0.992982    -0.000091    -0.000002
Station 8    -0.431297     0.000083    -0.000001
Station 9    -0.102587    -0.000026    -0.000003
Station 10   -0.000005    -0.000054     0.000000
Station 11   -0.083557    -0.000070    -0.000002
Station 12   -0.280196    -0.000008     0.000001
Station 13   -0.489229    -0.000007    -0.000002
Station 14   -0.591965    -0.000010     0.000002
Station 15   -0.466409    -0.000102    -0.000002
Station 16    0.000000     0.000000     0.000000
Std           0.731904     0.000055     0.000002

Table 2. Accuracy in rotation recovery.
Station      Error X (m)   Error Z (m)   Error Y (m)
Station 15    0.049373      0.000000     -0.000012
Station 14    0.095555      0.000000     -0.000401
Station 13    0.740550     -1.163700      0.067356
Station 12    0.080210     -0.000100     -0.014127
Station 11    0.106520     -0.000100     -0.016784
Station 10    0.123070     -0.000100     -0.021921
Station 9     0.129600     -0.000200     -0.027239
Station 8     0.126160     -0.000200     -0.029113
Station 7     0.113720     -0.000300     -0.030154
Station 6     0.093610     -0.000300     -0.030232
Station 5     0.067760     -0.000400     -0.029265
Station 4     0.038540     -0.000500     -0.027232
Station 3     0.008150     -0.000500     -0.024185
Station 2    -0.020780     -0.000600     -0.030226
Station 1    -0.046070     -0.000700     -0.036643
Std           0.175498      0.290207      0.024736

Table 3. Accuracy in position recovery.
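
As a rough illustration of the kind of computation behind the rotation results in Table 2, the sketch below estimates the rotation of a calibrated camera from the vanishing points of two orthogonal directions of the facade. This is a generic vanishing-point construction, not necessarily the exact algorithm used in the paper, and the focal length, principal point and vanishing-point coordinates are hypothetical values.

import numpy as np

def rotation_from_vanishing_points(vp_u, vp_v, f, cx, cy):
    # Back-project two vanishing points of orthogonal facade directions into
    # camera-frame unit vectors; the third axis is their cross product.
    # (The sign ambiguity of a vanishing direction is ignored here.)
    def direction(vp):
        d = np.array([vp[0] - cx, vp[1] - cy, f])
        return d / np.linalg.norm(d)
    r1 = direction(vp_u)
    r2 = direction(vp_v)
    r2 = r2 - np.dot(r1, r2) * r1          # enforce orthogonality numerically
    r2 = r2 / np.linalg.norm(r2)
    r3 = np.cross(r1, r2)
    return np.column_stack([r1, r2, r3])   # columns: facade axes in the camera frame

def angles_from_matrix(R):
    # Decompose R = Rz(kappa) @ Ry(phi) @ Rx(omega) into degrees (assumed convention).
    phi = np.arcsin(-R[2, 0])
    omega = np.arctan2(R[2, 1], R[2, 2])
    kappa = np.arctan2(R[1, 0], R[0, 0])
    return np.degrees([omega, phi, kappa])

# Hypothetical example: focal length 1500 px, principal point (640, 512),
# vanishing points measured for the horizontal and vertical facade edges.
R_est = rotation_from_vanishing_points(vp_u=(2300.0, 480.0), vp_v=(610.0, -1800.0),
                                       f=1500.0, cx=640.0, cy=512.0)
print(angles_from_matrix(R_est))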
5. FUTURE WORK 
In this paper we presented a method for the recovery of the orientation between two consecutive frames that first determines the rotation angles and then proceeds to determine the translation between the two frames. Using the presented approach we achieved very good results for the recovery of the rotation angles, but we used very accurate measurements of the image points. We also achieved very good results in position recovery, but we took into account only two of the three rotation angles in the creation of our dataset. Another aspect of the created dataset is that we used only one object (a building facade) in our approach. We are planning to further examine the behavior of the algorithm by creating datasets with different types of pathways. We would also like to examine how the algorithm works when noise is added to the image measurements, so we will introduce noisy images, different kinds of lens distortion, and errors in the interior orientation parameters into our model (a small sketch of such a perturbation is given below). Finally, we would like to explore the behavior of the algorithm when multiple objects are present in the scene and how their combination affects the accuracy of the results.
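
A minimal sketch of the kind of perturbation we plan to introduce, assuming Gaussian pixel noise and a single radial distortion coefficient k1; the noise level, distortion value and image points below are illustrative assumptions, not values used in this work.

import numpy as np

def perturb_image_points(points_px, principal_point, sigma_px=0.5, k1=1.0e-8, rng=None):
    # Apply a simple radial distortion about the principal point, then add
    # zero-mean Gaussian noise to the pixel coordinates.
    rng = np.random.default_rng() if rng is None else rng
    centered = points_px - principal_point
    r2 = np.sum(centered ** 2, axis=1, keepdims=True)
    distorted = principal_point + centered * (1.0 + k1 * r2)
    return distorted + rng.normal(scale=sigma_px, size=points_px.shape)

# Hypothetical corner measurements of the facade in one frame (pixels).
pts = np.array([[410.5, 300.2], [880.3, 295.7], [875.9, 760.4], [415.1, 765.0]])
noisy_pts = perturb_image_points(pts, principal_point=np.array([640.0, 512.0]))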