Proceedings, XXI International Congress for Photogrammetry and Remote Sensing (Part B5-2)

SELF-CALIBRATION OF A 3D RANGE CAMERA 
Derek D. Lichti 
Department of Geomatics Engineering, Schulich School of Engineering, The University of Calgary, 2500 University 
Dr NW, Calgary AB T2N 1N4, Canada - ddlichti@ucalgary.ca 
Commission V, WG V/3 
KEY WORDS: Range camera, calibration, modelling, error, performance analysis, bundle adjustment 
ABSTRACT: 
This paper proposes a new, integrated method for the self-calibration of 3D laser range cameras (LRCs) and corresponding 
systematic error models. Unlike other recently-proposed methods that consider independent sub-system calibration, this method 
allows simultaneous calibration of the camera-lens and rangefinder systems. The basis of the modelling is the collinearity and range 
observation equations augmented with systematic error correction terms that are estimated in a free-network, self-calibrating bundle 
adjustment. Several experiments designed to test the effectiveness of different combinations of systematic error model parameters on 
a SwissRanger SR-3000 LRC are described: a highly-redundant self-calibration network; an accuracy assessment test in which 
independently-surveyed target co-ordinates are compared with those from the LRC; and measurement of a planar surface. The 
former two tests reveal that an 11-parameter physical model is needed to correct all significant systematic errors. The latter 
experiment demonstrates the need for two additional empirical error terms for correcting residual rangefinder errors. Colour-dependent 
biases in the rangefinder measurements were found to cause the range observation residuals to be undesirably inflated. 
1. INTRODUCTION 
Laser range cameras (LRCs) or range imaging cameras can 
simultaneously capture a full 3D point cloud with an array 
sensor at video rates by time-of-flight rangefinding within a 
narrow field of view. They offer great potential for real-time 
measurement of static and, perhaps more importantly, dynamic 
scenes. Their principal advantage over laser scanners is the absence 
of a scanning mechanism; over digital cameras, it is that only 
one sensor is needed for 3D data capture. There are already 
numerous applications of this technology that include face 
detection (Hansen et al., 2007), mobile robot search and rescue 
(Ellekilde et al., 2007), gesture recognition for human-computer 
interaction (Holte et al., 2007; Breuer et al., 2007), 
manufacturing, automated vehicle guidance, guidance for the 
blind and wheelchair assistance (Bostelman et al., 2006). Others 
include video gaming, real-time foot mapping for podiatry, 
pedestrian sensing for automobile collision avoidance and 
person tracking for airport security. 
The full metric potential of LRCs cannot be realised, though, 
without a complete systematic error model and an associated 
calibration procedure to estimate all model coefficients. The 
Some recent research efforts have focused on the application 
of standard camera calibration procedures for the camera-lens 
system (Reulke, 2006; Santrac et al., 2006). Others have 
considered independent calibration of the camera-lens and 
rangefinder systems (Kahlmann et al., 2007; Lindner and Kolb, 
2006) where the latter is calibrated using a combination of 
baseline and surface fitting methods. The challenge of a 
complete system calibration has been stated by Breuer et al. 
(2007): “Comprehensive calibration turned out to be very 
difficult”. A new, integrated calibration approach that addresses 
this challenge is presented herein. Unlike the methods of others, 
the approach taken here is simultaneous calibration of both the 
rangefinder and the camera-lens systems. 
This paper is structured as follows. First, the mathematical 
models are presented. This includes the observation equations, 
the systematic error models and the calibration solution method. 
Following a description of the LRC used, three experiments are 
described: one in which the LRC is calibrated and two in which 
the efficacy of the calibration is independently assessed. Results 
from these experiments are analysed in detail with particular 
attention paid to model efficacy, solution strength as measured 
by parameter correlation and the accuracy improvement 
resulting from the calibration. 
2. MATHEMATICAL MODELS 
2.1 Observation Equations 
The basic observation equations logically stem from the fact 
that a LRC delivers radiometric intensity and 3D co-ordinates at 
each pixel location. Thus for any point i appearing in the focal 
plane of image j two collinearity equations 
$$x_{ij} = x_p - c\,\frac{U_{ij}}{W_{ij}} + \Delta x \qquad (1)$$

$$y_{ij} = y_p - c\,\frac{V_{ij}}{W_{ij}} + \Delta y \qquad (2)$$

and one range equation

$$\rho_{ij} = \sqrt{\left(X_i - X_j^c\right)^2 + \left(Y_i - Y_j^c\right)^2 + \left(Z_i - Z_j^c\right)^2} + \Delta\rho \qquad (3)$$

can be written, where x_ij and y_ij are the observed image co-ordinates of point i in image j; x_p and y_p are the principal point co-ordinates; c is the principal distance; U_ij, V_ij and W_ij are the object-space co-ordinates of point i transformed into the co-ordinate system of image j; rho_ij is the observed range; (X_i, Y_i, Z_i) are the object-space co-ordinates of point i; (X_j^c, Y_j^c, Z_j^c) are the co-ordinates of the perspective centre of image j; and Delta x, Delta y and Delta rho are the systematic error correction terms. 
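As an illustration only, and not the author's implementation, the following Python sketch evaluates observation equations (1)-(3) for a single point. The rotation matrix R_j from object space into the camera frame of image j, the placeholder numerical values and the zero-valued correction terms Delta x, Delta y and Delta rho are assumptions; the actual correction models are estimated in the self-calibration described in this paper.

```python
# Minimal sketch of evaluating observation equations (1)-(3) for one point i
# in one image j. R_j (object space -> camera frame) and all numeric values
# below are placeholders, not values from the paper.
import numpy as np

def observation_equations(X_i, Xc_j, R_j, xp, yp, c, dx=0.0, dy=0.0, drho=0.0):
    """Return the modelled image co-ordinates (x_ij, y_ij) and range rho_ij."""
    # Transform the object point into the co-ordinate system of image j.
    U, V, W = R_j @ (X_i - Xc_j)

    # Collinearity equations (1) and (2) with correction terms dx, dy.
    x_ij = xp - c * U / W + dx
    y_ij = yp - c * V / W + dy

    # Range observation equation (3): Euclidean distance plus correction drho.
    rho_ij = np.linalg.norm(X_i - Xc_j) + drho
    return x_ij, y_ij, rho_ij

# Example call with illustrative values (identity rotation, 8 mm principal distance).
x, y, rho = observation_equations(
    X_i=np.array([1.0, 0.5, 3.0]),    # object point co-ordinates (m)
    Xc_j=np.array([0.0, 0.0, 0.0]),   # perspective centre of image j (m)
    R_j=np.eye(3), xp=0.0, yp=0.0, c=0.008)
```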