3.2 Spatial discrimination 
3.2.1 Laser beam propagation & resolution 
The resolution of optical laser scanners is limited by the diffraction of
the laser light. Calculating the maximum possible spatial
resolution requires an arbitrary definition of what is meant by
resolving two distinct features. The Rayleigh criterion assumes
that two point sources can be considered separate
(resolved) when the centre of the imaged Airy disc of one
overlaps the first dark ring in the diffraction pattern of the
second. Even in the best emitting conditions (single mode), the
laser light does not maintain collimation with distance (e.g.
check the beam divergence on scanner specification sheets). In
fact, the smaller the laser beam, the larger the divergence
produced by diffraction. For most laser scanning imaging
devices, the 3D sampling properties can be estimated using the
Gaussian beam propagation formula and the Rayleigh criterion.
This is computed at a particular operating distance, wavelength
and desired spot size within the volume. Figure 4 illustrates that
constraint (λ = 0.633 µm). The solid line shows the relationship
between the X and Y axes (directions perpendicular to the laser
projection) and the physical dimensions of the object to be
scanned. A detailed analysis of this propagation property as 
applied to 3D scanners can be found in (Rioux et al., 1987; 
Beraldin et al., 1994). A number of scanner manufacturers use 
laser re-focusing techniques to achieve better resolutions at a 
cost of slowing down the effective acquisition data rate. 
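As a rough illustration of the Gaussian beam propagation argument above, the short sketch below computes the beam radius at a few operating distances for an assumed waist size. The 0.633 µm wavelength matches Figure 4, but the 50 µm waist and the distances are illustrative values, not figures from the paper.

```python
import math

def gaussian_beam_radius(w0, z, wavelength):
    """Beam radius w(z) at distance z from a waist of radius w0 (TEM00 Gaussian beam)."""
    z_r = math.pi * w0 ** 2 / wavelength   # Rayleigh range
    return w0 * math.sqrt(1.0 + (z / z_r) ** 2)

def far_field_divergence(w0, wavelength):
    """Half-angle divergence of a Gaussian beam: the smaller the waist, the larger the divergence."""
    return wavelength / (math.pi * w0)

# Illustrative numbers: the HeNe wavelength used in Figure 4 and an assumed 50 um waist.
wavelength = 0.633e-6           # m
w0 = 50e-6                      # m, beam waist radius (assumed value)
for z in (0.1, 0.5, 1.0, 5.0):  # operating distances in metres
    print(f"z = {z:4.1f} m  ->  spot radius ~ {gaussian_beam_radius(w0, z, wavelength) * 1e3:.2f} mm")
print(f"far-field divergence ~ {far_field_divergence(w0, wavelength) * 1e3:.2f} mrad")
```

The output makes the trade-off explicit: a small waist gives a fine spot close to the scanner but a large far-field divergence, which is why re-focusing is needed to keep a fine spot over a large volume.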
[Figure 4: log-log plot of spot size / uncertainty (1 µm to 10 mm) versus size of the measured volume (1 cm to 10 m), with curves labelled "X-Y diffraction limited" and "Z speckle limited".]
Figure 4. Physical limits of 3D laser scanners as a function of the
volume measured. Solid line: X-Y spatial resolution limited by
diffraction; dashed line: Z uncertainty for triangulation-based
systems limited by speckle (from Rioux, 1994).
For 2D cameras used in photogrammetry and texture mapping 
applications (see Section 4.3), one must match the sensor pixel 
size to how well an image can be resolved within an adequate 
depth of field (DOF). In these imaging applications, spatial 
resolution can be limited by diffraction. The smallest resolvable 
feature, d, for a circular aperture is given by 
d ≈ 1.22 λ f_n    (6)

where    d = smallest resolvable feature
         λ = light wavelength (e.g. 0.55 µm)
         f_n = lens f-number (e.g. f/22, f/4, etc.)
For example, at 0.55 µm and for f/8, the smallest resolvable
feature is about 5.4 µm (close to typical pixel sizes). Another
example of interest (for display systems): the human eye, with a
pupil diameter of about 2 mm (bright room), can resolve about
1 arc-min or, for f = 20 mm, about 6.7 µm (which matches the
spacing of the eye's receptors).
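A quick numeric check of eq. (6), reproducing the two worked examples above (a sketch; the function name is just for illustration):

```python
def smallest_resolvable_feature(wavelength_um, f_number):
    """Diffraction-limited resolvable feature d ~ 1.22 * lambda * f_n (eq. 6), in micrometres."""
    return 1.22 * wavelength_um * f_number

# 0.55 um light at f/8: ~5.4 um, close to typical pixel sizes.
print(smallest_resolvable_feature(0.55, 8))
# Human eye: f = 20 mm with a ~2 mm pupil is roughly f/10, giving ~6.7 um at the retina.
print(smallest_resolvable_feature(0.55, 10))
```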
Finally, the DOF for an imaging system is approximately given by
DOF ≈ Z² · Blur / (f · Φ)    (7)

where    Z = lens-to-object distance
         Blur = blur spot (circle of least confusion)
         Φ = aperture diameter
         f = lens focal length
For example, at Z = 2.5 m, f = 25 mm, a blur spot of 5.5 µm and
Φ = 1 mm (f/22), the depth of field is about 1.4 m. Some
camera systems use the Scheimpflug condition to extend the
system's DOF (see Beraldin et al., 1994 for the laser scanner case).
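The same check for eq. (7) as reconstructed above, using the numbers of the example (a sketch; lengths are converted to metres):

```python
def depth_of_field(z_m, f_m, blur_m, aperture_m):
    """Approximate depth of field, DOF ~ Z**2 * Blur / (f * aperture), eq. (7); all lengths in metres."""
    return z_m ** 2 * blur_m / (f_m * aperture_m)

# Z = 2.5 m, f = 25 mm, blur spot = 5.5 um, aperture = 1 mm  ->  ~1.4 m
print(depth_of_field(2.5, 25e-3, 5.5e-6, 1e-3))
```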
3.2.2 Measurement uncertainty 
As described above, diffraction limits impose a constraint on 
the resolving power along the X and Y-axes. For laser 
triangulation systems, along the range axis (Z), one could 
expect a continuous improvement as the amount of laser power 
is increased. Unfortunately, this is not the case; indeed the 
coherence of the laser light produces damaging interference 
effects known as speckle noise which limits the resolving power 
of the laser spot sensing (see Section 2.1.1). When the
uncertainty due to speckle (δp) is projected back into the scene
(δz; see eqn. (1)), it often amounts to hundreds of micrometres in
triangulation-based systems (dashed line in Figure 4).
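Eqn. (1) is not reproduced in this section, so the sketch below assumes the usual triangulation error propagation δz ≈ Z² / (f · D) · δp, with D the triangulation baseline; the numbers are illustrative, not taken from the paper.

```python
def range_uncertainty(z_m, f_m, baseline_m, delta_p_m):
    """Assumed triangulation relation: delta_z ~ Z**2 / (f * D) * delta_p,
    where delta_p is the speckle-limited spot-position uncertainty on the detector."""
    return z_m ** 2 / (f_m * baseline_m) * delta_p_m

# Illustrative values: 1 m standoff, 35 mm effective focal length,
# 20 cm baseline, ~1 um speckle-limited spot uncertainty  ->  ~0.14 mm.
print(range_uncertainty(1.0, 35e-3, 0.2, 1e-6))
```

Because δz grows with Z², a δp that is negligible at close range quickly reaches the hundreds of micrometres quoted above as the standoff distance increases.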
Figure 5. Wave (undulations) phenomenon created by the motion of the 3D camera with respect to the scene.
We discussed uncertainty, which represents the random part of
the total system error. The other part is the systematic error.
All 3D systems exhibit this type of error to different degrees
and for different reasons (e.g. poor calibration). Waves in the
raw 3D images are produced when the camera or the object
being scanned moves. This is shown in Figure 5. The waves can
be removed by a proper choice of sensor (a faster scanner), by
reducing the motion, or by filtering the raw 3D images.
Unfortunately, filtering can alter the spatial resolution.
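The trade-off mentioned above can be seen with a toy example (not from the paper): a simple moving-average filter applied to a synthetic 1D range profile suppresses a slow wave artefact, but spreads a genuine depth step over the width of the filter window.

```python
import numpy as np

def moving_average(profile, window):
    """Box filter: wider windows remove more of the wave but blur real detail."""
    kernel = np.ones(window) / window
    return np.convolve(profile, kernel, mode="same")

x = np.linspace(0.0, 1.0, 500)
step = np.where(x > 0.5, 1.0, 0.0)          # genuine depth discontinuity on the object
wave = 0.05 * np.sin(2 * np.pi * 8 * x)     # slow "wave" artefact caused by sensor motion
profile = step + wave

smoothed = moving_average(profile, window=63)   # window ~ one wave period
print("wave before filtering (peak-to-peak):", np.ptp(profile[70:210]).round(4))
print("wave after filtering  (peak-to-peak):", np.ptp(smoothed[70:210]).round(4))
print("depth step now spread over roughly 63 samples instead of 1")
```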
3.3 Object material and surface texture effects
It is said that with structured light (active) approaches, minimal
operator assistance is required to generate a large quantity of
3D coordinates, and that the 3D information becomes relatively
insensitive to background illumination and surface texture. The
first comment is indeed true if one compares with methods based
on contact probes or photogrammetry. But one must be aware
that not all the 3D information is reliable (Soucy et al., 1990;
Paakkari, 1992; Hebert et al., 1992; El-Hakim, 1994, 1995;
Boehler et al., 2003). The latter comment about surface texture
 
	        