XVIIIth Congress (Part B5)

The only assumption of this calibration approach is that the sensor 
has a mapping function X_out = F(x, y, c)_image that is 
unique and smooth. This means that the higher order 
frequency terms of the deviations from the orthogonal 
reference coordinate system are negligible, and the map- 
ping function F can be sampled and approximated by a 
rather coarse three-dimensional grid of distinct points. 
For example, the measured 3-D target positions can be 
fitted by polynomial functions to the well-known coordinates 
of a moved plate (Figure 22). 
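As an illustrative sketch of such a fit (the function names, the degree-2 polynomial basis, and the toy distortion are assumptions, not taken from the paper), the sampled mapping can be approximated per output coordinate by least squares:

```python
import numpy as np

def poly_terms(p):
    """Degree-2 monomials of a 3-D point p = (x, y, z) (assumed basis)."""
    x, y, z = p
    return np.array([1.0, x, y, z, x*y, x*z, y*z, x*x, y*y, z*z])

def fit_mapping(sensor_pts, object_pts):
    """Least-squares fit: object_pts ~ A(sensor_pts) @ coeffs,
    one coefficient column per output coordinate."""
    A = np.array([poly_terms(p) for p in sensor_pts])
    coeffs, *_ = np.linalg.lstsq(A, object_pts, rcond=None)
    return coeffs

def apply_mapping(coeffs, p):
    return poly_terms(p) @ coeffs

# Synthetic grid of sensor readings: plate targets at three
# normal-direction positions (a toy stand-in for Figure 21).
sensor = np.array([[x, y, z] for z in (0.0, 10.0, 20.0)
                             for y in range(0, 40, 10)
                             for x in range(0, 40, 10)], float)
# Assumed ground truth: object coordinates deviate smoothly
# (quadratically) from the sensor's nominal coordinates.
obj = sensor + 0.001 * sensor**2
coeffs = fit_mapping(sensor, obj)
pred = np.array([apply_mapping(coeffs, p) for p in sensor])
print(round(float(np.max(np.abs(pred - obj))), 6))  # → 0.0
```

Because the toy deviation is itself quadratic, the fit here is exact; a real sensor mapping would leave a small residual that bounds the calibration accuracy.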
Figure 21: Representation of a 3-D grid 
by moving a 2-D grid (a calibration plate with targets) in normal direction. 
This calibration concept has certain limitations: 
• it needs accurate, long-term-stable three-dimensional 
reference systems (e.g. a flat calibration plate with 
targets and a mechanical positioning system of 
high accuracy [Bre95]), 
• data reconstruction requires grid interpolation, which 
is a time-consuming numerical operation, and 
• the achievable sub-pixel precision is sufficient for 3-D measurement, 
but not for photogrammetric applications. 
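The grid-interpolation step can be sketched as trilinear interpolation in the coarse calibration grid (a generic sketch; the paper does not specify the interpolation scheme used):

```python
import numpy as np

def trilinear(grid, p):
    """Trilinear interpolation of a sampled mapping.
    grid[i, j, k] holds the calibrated 3-D output at integer node
    (i, j, k); p is a fractional position inside the grid."""
    i0 = np.floor(p).astype(int)
    f = p - i0                      # fractional part along each axis
    out = np.zeros(grid.shape[-1])
    # Blend the 8 surrounding grid nodes with their corner weights.
    for di in (0, 1):
        for dj in (0, 1):
            for dk in (0, 1):
                w = ((f[0] if di else 1 - f[0]) *
                     (f[1] if dj else 1 - f[1]) *
                     (f[2] if dk else 1 - f[2]))
                out += w * grid[i0[0]+di, i0[1]+dj, i0[2]+dk]
    return out

# Toy grid: the stored mapping is linear, so interpolation is exact.
I, J, K = np.meshgrid(np.arange(4), np.arange(4), np.arange(4),
                      indexing="ij")
grid = np.stack([2.0 * I, 3.0 * J, 1.0 * (I + K)], axis=-1)
print(trilinear(grid, np.array([1.5, 0.25, 2.0])).tolist())
# → [3.0, 0.75, 3.5]
```

The eight memory lookups and weight products per query point illustrate why this reconstruction step was a noticeable numerical cost on 1990s hardware.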
3.2.2 Estimation of Matrix Camera and Matrix Projec- 
tor Model Parameters: Photogrammetric algorithms for 
camera calibration are based on physical camera models. 
To find the model parameters, the image coordinates of 
targets (output) are measured with very high sub-pixel 
precision and compared to the actual object coordinates 
(input). 
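A minimal sketch of such a physical camera model is the pinhole projection with a radial distortion term; the parameter names and the single-coefficient distortion below are generic assumptions, not the paper's specific model:

```python
import numpy as np

def project(X, R, t, f, c, k1):
    """Pinhole projection of an object point X with one radial
    distortion coefficient (a generic physical camera model)."""
    Xc = R @ X + t                      # world -> camera coordinates
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2] # normalized image coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2                   # radial distortion factor
    return np.array([f * d * x + c[0], f * d * y + c[1]])

# Calibration compares predictions like this with target image
# coordinates measured at high sub-pixel precision, then adjusts
# the model parameters R, t, f, c, k1 (all values here are toy inputs).
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
uv = project(np.array([0.1, -0.2, 0.0]), R, t, f=1000.0,
             c=np.array([320.0, 240.0]), k1=-0.05)
print(uv.tolist())
```

The residuals between predicted and measured image coordinates drive the parameter estimation (typically a bundle adjustment).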
A light projector can be seen as an inverse camera. To 
calibrate the projector, the object coordinates (output) of 
projected targets (e.g. crossed stripes) are measured 
with other cameras and compared to the known projector 
coordinates (input, i.e. the stripe numbers in x and y) 
[Str93]. Unfortunately, this concept is not suitable for 
high-resolution matrix/projector 3-D sensors: presently, 
high-resolution projectors cannot project cross-type patterns 
(e.g. orthogonal stripes) that can be measured by 
photogrammetry. 
3.2.3 Estimation of Matrix Camera and Linear Projec- 
tor Model Parameters: Basically, a matrix camera needs 
only a stripe pattern for triangulation. In addition, most 
projector technologies achieve higher resolution and signal 
quality when only one stripe direction is necessary. 
Therefore it was necessary to develop a new self-calibra- 
tion technique. This new calibration method makes use 
of a simple hand-positioned calibration plate to simulta- 
neously calibrate all parts of the measurement system. 
The light projector is treated as an inverse camera with 
"long" pixels (stripes). In terms of photogrammetry, ma- 
trix/projector sensors can be seen as stereo camera 
pairs in which one camera is an inverse camera. 
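With one stripe direction, each camera pixel defines a viewing ray and each projector stripe defines a plane, so the 3-D point follows from a ray-plane intersection. The geometry below is an assumed toy setup for illustration only:

```python
import numpy as np

def intersect_ray_plane(o, d, n, p0):
    """Intersect the camera ray o + s*d with the projector stripe
    plane through p0 with normal n (assumes the ray is not
    parallel to the plane)."""
    s = np.dot(n, p0 - o) / np.dot(n, d)
    return o + s * d

# Toy geometry (assumed values):
cam_origin = np.array([0.0, 0.0, 0.0])
ray_dir = np.array([0.0, 0.0, 1.0])        # viewing ray of one pixel
plane_normal = np.array([1.0, 0.0, 1.0])   # one stripe plane
plane_point = np.array([0.0, 0.0, 10.0])
P = intersect_ray_plane(cam_origin, ray_dir, plane_normal, plane_point)
print(P.tolist())  # → [0.0, 0.0, 10.0]
```

In the calibrated system the ray comes from the camera's inner and outer orientation and the plane from the projector's, so the same photogrammetric parameter set serves both devices.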
The new principle combines several advantages: 
• the calibration delivers inner and outer orientations of 
camera(s) and projector, 
• all calibration tasks can be done in a few minutes, 
• the calibration plate may be hand-positioned, 
• a calibration check can be done during measurement, 
• the sensor system is suitable for photogrammetric 
tasks such as autonomous sensor orientation, and 
• 3-D coordinates can be calculated fast in either a 
sensor or a world coordinate system. 
3.3 Orientation and Navigation of 3-D Sensors 
In general, 3-D objects should be measured from several 
views to obtain a complete representation. To find the sen- 
sor's orientation relative to the measured object in every 
single view, several principles may be used. This part 
discusses different methods for determining sensor ori- 
entation and then shows how information from these 
sources can be combined to extract the sensor orientation 
for the new sensor system. 
3.3.1 Mechanical Positioning Systems of High Accu- 
racy: In principle, object and sensor can be "coupled" to 
each other by a mechanical positioning system that de- 
fines six degrees of freedom. 
Figure 22: Mechanical positioning system used to "define" the six 
parameters of sensor orientation. 
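The six parameters fixed by such a system can be written as a rigid-body transform; the sketch below composes one from three rotation angles and a translation (the Z-Y-X Euler convention is an assumption, not taken from the paper):

```python
import numpy as np

def pose_matrix(rx, ry, rz, tx, ty, tz):
    """4x4 homogeneous transform from six pose parameters
    (Z-Y-X Euler angles in radians plus a translation)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx          # combined rotation
    T[:3, 3] = [tx, ty, tz]           # translation
    return T

# Transform a sensor-frame point into the object/world frame
# (toy pose: 90 degrees about z, translation (1, 2, 3)).
T = pose_matrix(0.0, 0.0, np.pi / 2, 1.0, 2.0, 3.0)
p_world = T @ np.array([1.0, 0.0, 0.0, 1.0])
print(np.round(p_world, 6).tolist())  # → [1.0, 3.0, 3.0, 1.0]
```

Chaining such transforms from view to view is what the mechanical positioning system provides directly, and what the alternative orientation methods below must estimate instead.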
International Archives of Photogrammetry and Remote Sensing. Vol. XXXI, Part B5. Vienna 1996 