The differences between the autostereoscopic (Auto3D) measurements and the stereoscopic reference measurements are examined with a paired t-test on the x-left, x-right, and y image coordinates. The test statistic is

t = \bar{d}_{diff} / (s_{diff} / \sqrt{n})    (6)

where \bar{d}_{diff} is the mean of the paired coordinate differences, s_{diff} is their standard deviation, and n is the number of measured points; the statistic follows a t distribution with (n-1) degrees of freedom. A small p-value indicates that, at the chosen confidence level, the two systems yield significantly different coordinates. The test is applied pairwise to the Auto3D, Socet Set, and Photoshop measurements, and Table 1 summarizes the resulting p-values together with the minimum and maximum differences, in pixels, for both the 25 µm and 50 µm images.
Table 1. Statistics of the paired t-test in pixels [probability (p-value) / (min difference, max difference)]

                 | Auto3D - Socet Set       | Auto3D - Photoshop       | Socet Set - Photoshop
x-left  (25 µm)  | 3.39E-05 (-0.735, 3.623) | 0.002 (-1.252, 2.931)    | 0.177 (-3, 2.952)
x-right (25 µm)  | 0.895 (-1.565, 1.308)    | 0.242 (-2.762, 1.971)    | 0.233 (-0.667, 2.333)
y       (25 µm)  | 0.001 (-1.657, 1.632)    | 0.709 (-1.943, 1.429)    | 0.004 (-2, 1.667)
x-left  (50 µm)  | 4.52E-10 (-0.738, 2.149) | 0.033 (-1.750, 1.503)    | 0.001 (-2.524, 1.096)
x-right (50 µm)  | 0.008 (-1.085, 1.129)    | 0.063 (-1.553, 1.443)    | 0.661 (-1.904, 1.096)
y       (50 µm)  | 3.79E-05 (-0.654, 1.273) | 0.086 (-0.935, 1.411)    | 0.083 (-1.238, 1.286)
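As a minimal sketch of how the statistics in Table 1 can be reproduced, the snippet below applies a paired t-test to the coordinates of the same points measured with two systems. The coordinate arrays are hypothetical placeholders, and the use of SciPy is an implementation choice, not the tool used in the paper.

    # Paired t-test on per-point coordinate differences between two measuring
    # systems (e.g. Auto3D vs. Socet Set). The arrays are hypothetical
    # placeholders, not the data behind Table 1.
    import numpy as np
    from scipy import stats

    auto3d = np.array([512.3, 233.1, 871.6, 402.8, 650.2])   # x-left, pixels
    socet  = np.array([512.9, 232.4, 871.1, 403.5, 649.8])   # same points

    diff = auto3d - socet
    t_stat, p_value = stats.ttest_rel(auto3d, socet)          # paired t-test

    print(f"p-value: {p_value:.3f}")
    print(f"(min difference, max difference): ({diff.min():.3f}, {diff.max():.3f})")

A p-value below the chosen significance level indicates that the mean difference between the two systems is significantly different from zero; the (min, max) pair corresponds to the bracketed ranges reported in Table 1.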
Figure 9. Distribution of differences (Auto3D - Socet Set) in pixels (25 µm images). (a) x-left. (b) x-right. (c) y.

Figure 10. Distribution of differences (Auto3D - Socet Set) in pixels (50 µm images). (a) x-left. (b) x-right. (c) y.
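The percentage labels on the pie charts of Figures 9 and 10 follow from binning the per-point differences into one-pixel intervals; the sketch below illustrates this, with a hypothetical difference array standing in for the measured data.

    # Bin signed coordinate differences (in pixels) into one-pixel intervals
    # and report the share of points per bin, as annotated in Figures 9 and 10.
    # The 'diff' array is a hypothetical placeholder.
    import numpy as np

    diff = np.array([0.2, -0.7, 1.3, 0.4, -1.8, 0.9, 2.4, -0.1])

    edges = np.arange(-3, 4)                   # bins: -3--2, ..., 0-1, 1-2, 2-3
    counts, _ = np.histogram(diff, bins=edges)
    for lo, hi, c in zip(edges[:-1], edges[1:], counts):
        if c:
            print(f"{lo} to {hi} px: {100.0 * c / diff.size:.0f}%")

Shares computed this way correspond to the percentages shown on the pie charts.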
This range suggests not only that there is no practically significant difference between autostereoscopic and stereoscopic observations, but also that the maximum point-identification errors of the two systems are approximately the same. The results demonstrate that the autostereoscope can be a reliable measurement system compared with traditional eyeglasses-based systems, especially for handling a large volume of data with limited resources and with no significant loss in photogrammetric accuracy to be expected. Finally, note that the 3D perception reported by all seven participating observers degrades slightly because some points are difficult to digitize, owing to low color contrast or to ambiguity in defining and interpreting their precise locations. Nevertheless, the measurement errors are consistent and follow the same or very similar distributions.
6. CONCLUSION 
In this paper, we investigate the potential and performance of autostereoscopic measurement as a technical alternative in photogrammetric practice. For this purpose, we first analyze the general 3D geometry of an autostereoscopic system. The analysis addresses the parameters of the viewing zone, including its geometric shape, its size, and the corresponding movement boundary of the operator during photogrammetric work. Because the operator's movement must remain within the optimum viewing zone, we also estimate the perceived depth, which directly affects the accuracy of autostereoscopic measurement. The analysis indicates that a longer perceived depth gives the observer a sharper 3D impression. Furthermore, to demonstrate the performance of autostereoscopic measurement, we carry out several photogrammetric tests that compare both stereoscopic and autostereoscopic measurements with a standard measurement. We first introduce the measuring systems and our software, Auto3D, designed for the DTI 3D monitor based on the parallax-barrier principle. Finally, we present statistical analyses of the comparison. Our results show that, overall, more than 62% of the autostereoscopic measurements are less than one pixel away from the measurements of the popular stereoscopic systems. The consistency of autostereoscopic measurements among different operators is better than 1 pixel for at least 60% of the measurements.
7. REFERENCES 
Hattori, T. (1991). Electro optical autostereoscopic displays using large cylindrical lenses. Proceedings of SPIE, Vol. 1457, Stereoscopic Displays and Applications II (J. O. Merritt and S. S. Fisher, Eds.), pp. 283-289, 25-27 February, San Jose, California.

Jones, G., Lee, D., Holliman, N. and Ezra, D. (2001). Controlling perceived depth in stereoscopic images. Proceedings of SPIE, Vol. 4297, Stereoscopic Displays and Virtual Reality Systems VIII.

Motoki, T., Isono, H. and Yuyama, I. (1995). Present status of three-dimensional television research. Proceedings of the IEEE, Vol. 83, pp. 1009-1021.

Okoshi, T. (1976). Three Dimensional Imaging Techniques. Academic Press, New York.

Okoshi, T. (1980). Three dimensional displays. Proceedings of the IEEE, Vol. 68, pp. 548-564.

Pastoor, S. and Wöpking, M. (1997). 3D displays: a review of current technologies. Displays, Vol. 17, pp. 100-110.

Petrie, G. (2001). 3D stereo-viewing of digital imagery: is auto-stereoscopy the future for 3D? GeoInformatics, Vol. 4, No. 10, pp. 24-29.

Sexton, I. (1992). Parallax barrier display systems. IEE Colloquium on Stereoscopic Television, Digest 173, pp. 5/1-5/5.

Shan, J., Fu, C., Li, B., Bethel, J., Kretsch, J. and Mikhail, E. (2004). Autostereoscopic measurement: principles and implementation. ASPRS Annual Conference, Denver, Colorado.

Son, J. Y., Saveljev, V. V., Choi, Y. J., Bahn, J. E., Kim, S. K. and Choi, H. (2003). Parameters for designing autostereoscopic imaging systems based on lenticular, parallax barrier, and integral photography plates. Optical Engineering, Vol. 42, pp. 3326-3333.
ACKNOWLEDGEMENT 
This work is sponsored by the National Geospatial-Intelligence 
Agency.
	        