$$
\begin{cases}
x_{ij} = \rho_{ij}\cos\beta_j\cos\alpha_i \\
y_{ij} = \rho_{ij}\sin\beta_j\cos\alpha_i \\
z_{ij} = \rho_{ij}\sin\alpha_i
\end{cases}
\qquad (3)
$$
where $\alpha_i$ is the step angle of the rotating mirror of the 2D laser, $\beta_j$ is the step rotation angle of the rotating platform, and $i$ and $j$ are the respective step numbers.
Actually, the portable 3D laser scanner has systematic errors: (1) Installation error $l$: there is a translational offset $l$ between the center of the rotation axis and the center of the mirror wheel of the laser scanner. (2) Range error $\Delta\rho$, which results from the object's surface features, air humidity, the time gauges built into the equipment, the reflected energy, etc. (3) Scan angle error $\varphi$: when the rotating platform begins to move, it takes some time to reach constant speed, so the actual reference direction and the coordinate axis do not coincide. Formula (3) should therefore be modified as follows:
$$
\begin{cases}
x_{ij} = (\rho_{ij} + \Delta\rho)\cos\alpha_i\cos(\beta_j + \varphi) + l\cos(\beta_j + \varphi) \\
y_{ij} = (\rho_{ij} + \Delta\rho)\cos\alpha_i\sin(\beta_j + \varphi) + l\sin(\beta_j + \varphi) \\
z_{ij} = (\rho_{ij} + \Delta\rho)\sin\alpha_i
\end{cases}
\qquad (4)
$$
Here $\rho_{ij}$, $\alpha_i$ and $\beta_j$ are known, while $\Delta\rho$, $l$ and $\varphi$ are unknown, so these parameters need to be further calibrated and corrected.
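As an illustration, the following Python sketch (the function name and argument layout are ours, not the paper's) evaluates formula (4) for one range sample; with all corrections set to zero it reduces to the ideal model of formula (3).

```python
import numpy as np

def corrected_point(rho, alpha, beta, d_rho=0.0, l=0.0, phi=0.0):
    """Map one range sample to Cartesian coordinates with formula (4).

    rho   : measured range of the 2D laser (m)
    alpha : step angle of the rotating mirror (rad)
    beta  : step rotation angle of the platform (rad)
    d_rho, l, phi : range error, axis offset and scan-angle error;
                    all zero gives the ideal model of formula (3).
    """
    r = rho + d_rho
    x = r * np.cos(alpha) * np.cos(beta + phi) + l * np.cos(beta + phi)
    y = r * np.cos(alpha) * np.sin(beta + phi) + l * np.sin(beta + phi)
    z = r * np.sin(alpha)
    return np.array([x, y, z])
```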
In order to calibrate the values of $\Delta\rho$, $l$ and $\varphi$, we adopt a tablet calibration approach. We made five tablets, placed two of them perpendicular to the x-axis (+, -) of the actual reference coordinate system (tablet 1 and tablet 2), another two perpendicular to the y-axis (+, -) of the actual reference coordinate system (tablet 3 and tablet 4), and the last one perpendicular to the z-axis (+) (tablet 5). Then the x-coordinate of the laser point on tablet 1 is known accurately, X = L1. Similarly, the x-coordinate of the laser point on tablet 2 is -L2, the y-coordinate of the laser point on tablet 3 is L3, the y-coordinate of the laser point on tablet 4 is -L4, and the z-coordinate of the laser point on tablet 5 is L5. L1 to L5 are the perpendicular distances between the tablets and the origin of the actual reference coordinate system. During calibration we moved the tablets to change the values of Ln, thereby acquiring a set of equations, and then performed a compensating computation (adjustment) with additional parameters. After that, we applied a significance test to verify the significance of the parameters and solved for the correction parameters.
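A minimal sketch of this adjustment, assuming the unknowns ($\Delta\rho$, $l$, $\varphi$) are estimated by least squares from the tablet observations with known coordinates Ln; the data layout, file name and use of scipy are our assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(params, obs):
    """One residual per laser point observed on a tablet.
    obs rows: (rho, alpha, beta, axis, L) -- axis selects x/y/z (0/1/2),
    L is the known tablet coordinate along that axis (hypothetical layout)."""
    d_rho, l, phi = params
    rho, alpha, beta, axis, L = obs.T
    r = rho + d_rho
    x = r * np.cos(alpha) * np.cos(beta + phi) + l * np.cos(beta + phi)
    y = r * np.cos(alpha) * np.sin(beta + phi) + l * np.sin(beta + phi)
    z = r * np.sin(alpha)
    xyz = np.vstack([x, y, z])
    pred = xyz[axis.astype(int), np.arange(obs.shape[0])]
    return pred - L

# obs = np.loadtxt("tablet_observations.txt")          # hypothetical file
# fit = least_squares(residuals, x0=np.zeros(3), args=(obs,))
# d_rho, l, phi = fit.x
```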
4. FAST MAPPING IMAGES ONTO POINT CLOUDS
In practice, the scanned data are not continuous, although the scene contains continuous colour information. Mapping 2D images onto 3D points is satisfactory for some applications. The traditional methods rigidly attach a camera to the range scanner, thereby fixing the relative position and orientation of the two sensors with respect to each other [Früh C., 2003, Sequeira V., 2002, Zhao H., 2003]. Fixing the relative position between the 3D range and 2D image sensors, however, sacrifices the flexibility of 2D image capture. In fact, because of occlusions and self-occlusions, the methods described above are not suitable for large-scale scenes. We use a hand-held digital camera to take images from different angles, at different times and with different focal lengths. Integrating images from freely moving cameras with 3D models or 3D point clouds is a technical challenge. Related work has been done by [Stamos I., 2008, Zhao W., 2005]. I. Stamos's method assumes the existence of at least two vanishing points in the scene and registers individual 2D images onto a 3D model. W. Zhao's method aligns a point cloud computed from video onto the point cloud obtained directly from a 3D sensor. We use W. Zhao's method to map images onto point clouds:
(1) Recover multi-view relations from an image sequence by structure from motion.
(2) Compute a dense depth map using multi-view stereo.
(3) Determine the camera poses by aligning the 3D point cloud from the camera with that from the 3D sensor using ICP (Iterative Closest Point), as sketched below.
Figure 4 shows a result of the texture mapping.
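As an illustration of step (3), a basic point-to-point ICP iteration can be written as below; this is our own sketch using scipy's nearest-neighbour search, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(src, dst, iters=30):
    """Rigidly align src (Nx3, from the image-based reconstruction) to
    dst (Mx3, from the laser scanner) with point-to-point ICP; returns R, t."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(dst)
    cur = src.copy()
    for _ in range(iters):
        _, idx = tree.query(cur)                  # closest laser point for each source point
        matched = dst[idx]
        mu_s, mu_d = cur.mean(0), matched.mean(0)
        H = (cur - mu_s).T @ (matched - mu_d)     # cross-covariance of centred points
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:             # avoid reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_d - R_step @ mu_s
        cur = cur @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step    # accumulate the total transform
    return R, t
```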
Figure 4. 3D image of the gate of the university. Top: 3D reflectance image; bottom: 3D colour image.
5. ANALYSIS OF EXPERIMENTAL RESULTS
Generally, the quality and accuracy of the recorded 3D points are the main parameters that determine the application fields of a laser scanner. The technical data of our laser scanner are as follows.
Measurement range: 0.5 m to 80 m
Accuracy: 6 mm
Measurement rate: up to 8000/sec
Laser wavelength: near infrared
Vertical (line) scanning range: 0° to 180°
Horizontal (frame) scanning range: 0° to 360°
Weight: 6 kg
We carried out a series of experiments to test the portable 3D laser scanner, covering data quality, optimal measurement range, influence of surface reflectivity, and environmental conditions.