If the object is far from the camera, the CCD is placed in the focal plane of the optics at x' = c (the focal length) on the x'-axis behind the optics (lower left coordinate system). To form an image, the camera is rotated around the origin of the (x, y) coordinate system.
To derive the relation between an object point X and a pixel in an image, the collinearity equation can be applied:
\[ X - X_0 = \lambda \cdot (x' - x_0') \qquad (1) \]
x' is the image coordinate; X₀ and x₀' are the projection centres for the object and the image space. To see this object point with a pixel of the CCD line on the focal plate, the camera has to be rotated by an angle κ around the z-axis. For the simplest case (y₀' = 0) the result is
\[ (X - X_0) = \lambda \cdot R^{T} (x' - x_0') = \lambda \begin{pmatrix} \cos\kappa & -\sin\kappa & 0 \\ \sin\kappa & \cos\kappa & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} -c \\ 0 \\ z' - z_0' \end{pmatrix} = \lambda \begin{pmatrix} -c\cos\kappa \\ -c\sin\kappa \\ z' - z_0' \end{pmatrix} \qquad (2) \]
To derive some important parameters of the camera, a simplified approach is used. The unknown scale factor can be calculated from the square of the x-y components of this equation:
\[ \lambda = \frac{\sqrt{(X - X_0)^2 + (Y - Y_0)^2}}{c} = \frac{r_{xy}}{c} \qquad (3) \]
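The step from (2) to (3) can be made explicit by squaring and adding the first two components of the reconstructed form of (2):
\[ (X - X_0)^2 + (Y - Y_0)^2 = \lambda^2 c^2 (\cos^2\kappa + \sin^2\kappa) = \lambda^2 c^2 \quad\Rightarrow\quad \lambda = \frac{\sqrt{(X - X_0)^2 + (Y - Y_0)^2}}{c} = \frac{r_{xy}}{c} . \]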
The meaning of $r_{xy}$ can easily be seen in Figure 4. This result is
a consequence of the rotational symmetry. By dividing the first 
two equations and using the scale factor for the third, the 
following equations deliver an obvious result, which can be 
geometrically derived from Figure 4. 
\[ \tan\kappa = \frac{\Delta Y}{\Delta X} \qquad \text{and} \qquad \Delta z' = c \cdot \frac{\Delta Z}{r_{xy}} \qquad (4) \]
The image or pixel coordinates (i, j) are related to the angle κ
and the z-value. Because of the limited image field for this 
investigation, only linear effects (with respect to the rotation 
and image distortions) should be taken into account: 
\[ i = \frac{1}{\delta\kappa} \arctan\frac{\Delta Y}{\Delta X} + i_0 \qquad\qquad j = \frac{c}{\delta z} \cdot \frac{\Delta Z}{r_{xy}} + j_0 \qquad (5) \]
δz: pixel distance
δκ: angle of one rotation step
c: focal length
The unknown or not exactly known parameters δκ, i₀, c and j₀ can be derived from known marks in the image field.
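To make the mapping of equations (2) to (5) concrete, the following Python sketch projects an object point into panoramic pixel coordinates (i, j). It is an illustration only, not the authors' calibration code: the function name and all numerical values (focal length, rotation step, pixel distance, offsets i₀ and j₀) are assumptions.

```python
import numpy as np

def object_to_pixel(X, X0, c, delta_kappa, delta_z, i0, j0):
    """Project an object point X onto panoramic pixel coordinates (i, j).

    X, X0       : object point and projection centre, (X, Y, Z)
    c           : focal length
    delta_kappa : rotation angle of one column step (rad)
    delta_z     : pixel distance on the CCD line
    i0, j0      : column/row offsets (assumed principal point)
    """
    dX, dY, dZ = np.asarray(X, float) - np.asarray(X0, float)
    kappa = np.arctan2(dY, dX)        # rotation angle kappa, cf. eq. (4)
    r_xy = np.hypot(dX, dY)           # horizontal distance r_xy, cf. eq. (3)
    dz_prime = c * dZ / r_xy          # coordinate along the CCD line, cf. eq. (4)
    i = kappa / delta_kappa + i0      # column index from the rotation, eq. (5)
    j = dz_prime / delta_z + j0       # row index on the CCD line, eq. (5)
    return i, j

# Example call with assumed calibration values (not from the paper)
i, j = object_to_pixel(X=(4.0, 1.5, 0.8), X0=(0.0, 0.0, 0.0),
                       c=0.035, delta_kappa=np.deg2rad(0.01),
                       delta_z=7e-6, i0=0.0, j0=2600.0)
```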
For calibration we used signalized points randomly distributed and at different distances from the camera. The analysis of the resulting errors in object space shows that the approach of (4) and (5) must be extended. The following effects should be incorporated:
- Rotation of the CCD (around the x-axis)
- Tilt of the camera (rotation around the y-axis)
These effects can be incorporated into equation (2). The variations of the angles φ and ω are assumed to be small (sin φ ≈ φ, cos φ ≈ 1 and sin ω ≈ ω, cos ω ≈ 1):
\[ (x' - x_0') = \frac{1}{\lambda} \cdot R_{\omega\varphi\kappa} \cdot (X - X_0) = \frac{1}{\lambda} \begin{pmatrix} \cos\kappa & \sin\kappa & \omega\sin\kappa - \varphi\cos\kappa \\ -\sin\kappa & \cos\kappa & \omega\cos\kappa + \varphi\sin\kappa \\ \varphi & -\omega & 1 \end{pmatrix} \begin{pmatrix} X - X_0 \\ Y - Y_0 \\ Z - Z_0 \end{pmatrix} \qquad (6) \]
For this special application the projection centre of the camera is (X₀, Y₀, Z₀) = (0, 0, 0). With a spatial resection approach based on equation (6), the unknown parameters of the exterior orientation can be derived.
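As a sketch of how equation (6) can be used numerically, the following Python snippet builds the linearized rotation matrix for small tilt angles φ and ω. The element ordering follows the standard photogrammetric R(ω, φ, κ) linearization; since the printed equation is not fully legible, the exact sign convention is an assumption.

```python
import numpy as np

def linearized_rotation(kappa, phi, omega):
    """Rotation matrix of eq. (6), linearized for small phi and omega
    (sin a ~ a, cos a ~ 1); kappa is kept exact."""
    ck, sk = np.cos(kappa), np.sin(kappa)
    return np.array([
        [ ck,   sk,   omega * sk - phi * ck],
        [-sk,   ck,   omega * ck + phi * sk],
        [ phi, -omega, 1.0],
    ])

# In a spatial resection, phi, omega and kappa (together with the interior
# parameters) would be estimated by iteratively minimizing the image
# residuals of the signalized points; this linear-in-(phi, omega) form keeps
# the normal equations simple.
```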
Despite the limited number of signalized points and the small field of view of the scene (30° × 30°), the accuracy of the panorama camera model is about 3 image pixels. Using an improved model and the program from Schneider (TU Dresden), an accuracy of better than one pixel can be achieved.
3.3 Fusion of Panoramic and Laser Scanner Data 
Before the data of M2 and 3D-LS can be fused, the calibration of the 3D-LS must be checked. The test field shown in Figure 4 was used for this purpose. The 3D-LS delivers 3D point clouds; the mean distance between points at the wall is about 2-3 mm. As the depth and image data do not fit a regular grid, they cannot be compared with rasterized image data of a photogrammetric survey without additional processing.
  
Figure 5. Laser image data 
First the 3D-LS data are triangulated, then the rasterized data are computed by interpolation on a regular grid (see Figure 5). This procedure was carried out with the program ENVI. Now the 3D-LS data can be compared with data from a matrix camera.
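The rasterization was performed with ENVI; as an illustration of the same processing step, the Python sketch below interpolates the scattered 3D-LS points onto a regular grid. The file name, the choice of wall axes and the 1 mm grid spacing are assumptions for the example.

```python
import numpy as np
from scipy.interpolate import griddata

# Scattered laser points: columns X, Y, Z (wall assumed roughly in the X-Z plane)
points = np.loadtxt("laser_points.xyz")   # hypothetical input file
xy = points[:, [0, 2]]                    # position on the wall (horizontal, vertical)
depth = points[:, 1]                      # depth value to be rasterized

# Regular 1 mm grid covering the scanned area
gx, gz = np.meshgrid(np.arange(xy[:, 0].min(), xy[:, 0].max(), 0.001),
                     np.arange(xy[:, 1].min(), xy[:, 1].max(), 0.001))

# Linear interpolation on the Delaunay triangulation of the input points
raster = griddata(xy, depth, (gx, gz), method="linear")
```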
Applying a bundle block adjustment to the image data of the matrix camera delivers the interior orientation, and the absolute coordinate system is established by an additional 2 m reference. In order to compare the object data, the following coordinate transformation is required:
\[ \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = \begin{pmatrix} X_0 \\ Y_0 \\ Z_0 \end{pmatrix} + \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} \qquad (7) \]
$x_i$ are the points in the laser coordinate system and $X_i$ the points in the camera system. $X_0$ and $r_{ij}$ are the unknown transformation parameters, which can be derived by a least-squares fit. After the transformation, and with the photogrammetric survey regarded as the reference, the accuracy of the 3D-LS is determined as 0.5 mm (½ pixel) in the horizontal direction and 1 mm (1 pixel) in the vertical direction. A tendency for outliers cannot be observed.
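The paper does not state which algorithm was used for the least-squares fit of equation (7); the following Python sketch shows one common choice, an SVD-based (Procrustes/Kabsch) solution for the rotation $r_{ij}$ and the translation $X_0$.

```python
import numpy as np

def fit_rigid_transform(x_laser, X_camera):
    """Least-squares fit of eq. (7): return R (3x3) and t (3,) such that
    X_camera ~ t + R @ x_laser for corresponding points (rows)."""
    cl, cc = x_laser.mean(axis=0), X_camera.mean(axis=0)
    H = (x_laser - cl).T @ (X_camera - cc)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                               # proper rotation, det(R) = +1
    t = cc - R @ cl
    return R, t
```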
In a further processing step, the laser data and the panoramic data can be merged. As both data sets are in different coordinate systems, a transformation between the two coordinate systems must first be determined.
	        