However, there are some disadvantages to this method:
• high-accuracy mechanics are very expensive, and will remain so,
• the chain of mechanical connectors and interfaces has a complex error propagation that is difficult to calibrate,
• it yields the mechanical coordinate system, not the optical coordinate system, so that a hand-eye calibration is needed, and
• the principle is very sensitive to angular errors (illustrated below).
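As a rough illustration of the last point, a small angular error translates into a position error that grows with the lever arm between the rotation axis and the measuring point. The numbers below are invented for this example only:

    import math

    # Assumed example values: 0.01 degree angular error, 1 m lever arm between
    # the rotation axis of the positioning system and the measuring point.
    angular_error = math.radians(0.01)
    lever_arm_m = 1.0

    # Small-angle approximation: lateral error ~ lever arm * angular error.
    position_error_mm = lever_arm_m * angular_error * 1e3
    print(round(position_error_mm, 3))  # ~0.175 mm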
3.3.2 Orientation with External High-Accuracy Navigation Instruments: Theodolites or laser interferometers can be used to measure the absolute positions of sensor and work piece. However, similar problems arise as with mechanical positioning systems. In addition, external navigation instruments with sufficient accuracy in all 6 degrees of freedom are very expensive.
Figure 23: External navigation system to measure the
sensor orientation.
3.3.3 Feature-Based Registration of Point Clouds ("Navigation of Data"): Some objects exhibit "geometric contrast" at edges and other discontinuities, from which features can be extracted. Features extracted in several point clouds can be used to fit these point clouds to each other. This method should be used with care: each point cloud can be incomplete with respect to the other views, and the resulting wrong or missing correspondences can lead to matching errors.
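The core of such a registration can be sketched as follows; this is a minimal example of the standard SVD-based rigid fit, not the specific implementation discussed here, and it assumes that corresponding feature points have already been extracted and matched:

    import numpy as np

    def rigid_fit(src, dst):
        """Least-squares rotation R and translation t such that dst ~ R @ src + t.

        src, dst: (N, 3) arrays of corresponding feature points from two views.
        """
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)              # 3x3 cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
        R = Vt.T @ D @ U.T
        t = dst_c - R @ src_c
        return R, t

Wrong or missing correspondences caused by incomplete point clouds corrupt this fit directly, which is why robust matching and outlier rejection are needed in practice.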
3.3.4 Autonomous Orientation of 3-D Sensors ("Free-Flying" Sensors): A basic principle in photogrammetry is the orientation of cameras by differential measurements of several image objects. The measured (2-D) image coordinates (e.g. of retro-reflecting target points with ring codes) correspond to their (3-D) world coordinates, which are either known a priori or constrained by other measurements in a bundle adjustment. The dependence on the visibility of more than three reference points can become a problem on large surfaces. This can be solved by different means, for example by applying additional targets (of any form), by feature-based methods or by simple mechanical positioning systems.
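For a single sensor position with known reference coordinates, this orientation step amounts to a space resection. The following minimal sketch uses OpenCV; the target coordinates, image measurements and camera matrix are invented placeholders, and a complete system would additionally refine all stations in the bundle adjustment mentioned above:

    import numpy as np
    import cv2

    # Known 3-D coordinates of four coded reference targets (placeholder values, metres).
    world_pts = np.array([[0.0, 0.0, 0.0],
                          [0.5, 0.0, 0.0],
                          [0.0, 0.5, 0.0],
                          [0.5, 0.5, 0.0]])

    # Measured 2-D image coordinates of the same targets (placeholder values, pixels).
    image_pts = np.array([[322.4, 241.1],
                          [612.7, 239.8],
                          [318.9, 498.2],
                          [609.3, 501.5]])

    K = np.array([[1200.0,    0.0, 320.0],   # assumed interior orientation (camera matrix)
                  [   0.0, 1200.0, 240.0],
                  [   0.0,    0.0,   1.0]])
    dist = np.zeros(5)                       # assume lens distortion already corrected

    # Exterior orientation of this sensor position: rotation (as a Rodrigues
    # vector) and translation of the camera relative to the reference targets.
    ok, rvec, tvec = cv2.solvePnP(world_pts, image_pts, K, dist)
    R, _ = cv2.Rodrigues(rvec)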
We have developed a new orientation principle based on photogrammetric methods with extensions for active optical 3-D sensors. The primary information for every sensor position is derived from image sequences acquired under coded stripe illumination and retro-target illumination.
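The stripe-code evaluation is not detailed in this section; as an illustration only, assuming a Gray-code stripe sequence (a common choice for such sensors) and a simple global threshold, decoding the captured image stack into per-pixel stripe indices could look like this:

    import numpy as np

    def decode_gray_stripes(images, threshold=128):
        """Decode a stack of binary stripe images into per-pixel stripe indices.

        images: (n_patterns, H, W) array of captured stripe images, coarsest pattern first.
        Returns an (H, W) integer array of stripe indices (Gray-coded patterns assumed).
        """
        bits = (np.asarray(images) > threshold).astype(np.uint32)
        # Gray code to binary: b[0] = g[0], b[i] = b[i-1] XOR g[i]
        binary = np.zeros_like(bits)
        binary[0] = bits[0]
        for i in range(1, bits.shape[0]):
            binary[i] = binary[i - 1] ^ bits[i]
        # Pack the decoded bit planes into one integer stripe index per pixel.
        index = np.zeros(bits.shape[1:], dtype=np.uint32)
        for plane in binary:
            index = (index << 1) | plane
        return index

In practice, a per-pixel threshold derived from additional bright and dark reference images would replace the fixed one, and the retro-target images provide the point measurements used for the photogrammetric orientation.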
Figure 24: Autonomous sensor orientation, based on natural features and additional targets (3-D sensor, reference spheres, reference targets).
3.4 Implemented sensor system
The flexible software system we developed for the "free-
flying" sensor integrates all standard software functions.
For non-expert users, complex operations are reduced to a few commands or buttons.
The software can drive many types of cameras and projectors. Figure 25 shows one of the first setups of a "free-flying" 3-D sensor in our laboratory: a tripod-mounted system with two standard CCD cameras and a calibrated programmable LCD projector.
Further development is needed in the fields of automated sensor optimization, signal evaluation and error checking.
Figure 25: One of the first setups of a "free-flying" 3-D sensor in our laboratory.
3.5 Ap…
For a … specular … develope… and repr…tions [IM…
Figure 2…: … views w… was sui… rms-noise … 0.025 %
Figure …cle…