New perspectives to save cultural heritage

CIPA 2003 XIXth International Symposium, 30 September - 04 October 2003, Antalya, Turkey
distance between the perspective centre and the photographed
object point is measured, the spatial position of each pixel
can be computed in the ξ, η, ζ image reference system (see
Fig. 1 and equations (1) and (2)):
θ = arctan(ξ / c)
α = arctan(η · cos θ / c)    (1)

ξ₀ = d · cos α · sin θ
η₀ = d · sin α    (2)
ζ₀ = d · cos α · cos θ
where ξ and η are the image coordinates of the current pixel of
the image, θ and α are the two angles that define the direction
in space, d is the distance between the centre of the
perspective and the object point, and ξ₀, η₀ and ζ₀ are the 3D
coordinates of the object point.
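Equations (1) and (2) can be sketched directly in code. The following is a minimal illustration (the function name and argument order are ours, not the paper's):

```python
import numpy as np

def pixel_to_3d(xi, eta, c, d):
    """Map an image point (xi, eta) with measured distance d to 3D
    coordinates in the xi-eta-zeta image reference system, following
    equations (1) and (2). c is the principal distance of the camera."""
    theta = np.arctan(xi / c)                   # first direction angle, eq. (1)
    alpha = np.arctan(eta * np.cos(theta) / c)  # second direction angle
    xi0   = d * np.cos(alpha) * np.sin(theta)   # eq. (2)
    eta0  = d * np.sin(alpha)
    zeta0 = d * np.cos(alpha) * np.cos(theta)
    return xi0, eta0, zeta0
```

Note that the resulting point always lies at distance d from the perspective centre, since the three components of (2) have unit norm before scaling by d.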
The distance values are stored in an additional matrix that has 
the same size as the RGB ones (in terms of rows and 
columns). Therefore, a “solid image” consists of a four-level
matrix: RGB and distance d (see Fig. 2).
Distances of the object points are obtained from a dense 3D
model (DDEM), easily acquired by a laser scanner. To
calculate these distances, the laser scan and the photograph
should be taken from two points close to each other, in order to
reduce the number of pixels that are not visible from the
scanner (hidden areas) and therefore cannot be determined in
their 3D position.
Figure 2. Structure of the solid image 
If the external orientation parameters (X₀, Y₀, Z₀, ω, φ, κ) are
also known, it is easy to transform the 3D image coordinates
into the absolute XYZ system by a simple roto-translation.
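The roto-translation can be sketched as follows. The ω-φ-κ rotation order shown here is one common photogrammetric convention and is an assumption; it must match the convention used in the bundle adjustment:

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Build the 3x3 rotation matrix from (omega, phi, kappa).
    R = Rx(omega) @ Ry(phi) @ Rz(kappa) is assumed here."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def image_to_absolute(p_img, R, X0):
    """Roto-translation of a 3D image-system point into the absolute
    XYZ system: X = X0 + R @ p_img."""
    return np.asarray(X0) + R @ np.asarray(p_img)
```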
2.2 Image calibration 
In order to fill the matrix "d" with a correct distance value
for every object point, it is necessary to calibrate the
image. The calibration process consists of estimating the
internal orientation parameters of the camera. In
architectural surveys, the cameras used are often non-metric;
in this case, the lens distortion parameters also have to be
determined.
This can be achieved by measuring the image coordinates for 
a sufficient number of points, the object coordinates of which 
are known (control points). The procedure can be completely 
automatic if a laser scanner is used for this purpose. 
Some reflecting targets are placed on the object. The laser
scanner is able to measure, in addition to the 3D point
positions, the reflectivity of the object. Special reflecting
targets (markers) almost totally reflect the laser pulse,
while natural surfaces do not. If the marker size and mean
reflectivity are known, it is easy to determine the marker
positions in the laser DDEM and in the image in a completely
automatic way. Once the marker positions are defined, the
calibration parameters of the camera and its external
orientation can be estimated by using a classic bundle
solution.
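The automatic marker search described above can be sketched as a reflectivity threshold followed by clustering. The threshold and cluster radius below are illustrative values, not figures from the paper:

```python
import numpy as np

def find_markers(points, reflectivity, refl_threshold=0.9, radius=0.05):
    """Detect retro-reflective markers in a scan: keep points whose
    reflectivity exceeds refl_threshold, greedily group them by 3D
    distance, and return one centroid per group (the marker centre)."""
    pts = np.asarray(points, dtype=float)[np.asarray(reflectivity) > refl_threshold]
    centres = []
    unassigned = list(range(len(pts)))
    while unassigned:
        seed = unassigned.pop(0)
        cluster = [seed]
        # gather every remaining bright point near the seed
        for i in unassigned[:]:
            if np.linalg.norm(pts[i] - pts[seed]) < radius:
                cluster.append(i)
                unassigned.remove(i)
        centres.append(pts[cluster].mean(axis=0))
    return centres
```

A production version would also check each cluster against the known marker size, as the text suggests, before accepting it as a target.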
2.3 Projection of the cloud of points 
If the internal and external image orientation parameters are 
known, it is possible to project the DDEM (“cloud of points”) 
onto the digital image. The mathematical model used for this 
operation is the central perspective model. The radial
distortion components Δξ and Δη are added to the
collinearity equations:
ξ = ξ₀ + Δξ − c · [r₁₁(X − X₀) + r₂₁(Y − Y₀) + r₃₁(Z − Z₀)] / [r₁₃(X − X₀) + r₂₃(Y − Y₀) + r₃₃(Z − Z₀)]

η = η₀ + Δη − c · [r₁₂(X − X₀) + r₂₂(Y − Y₀) + r₃₂(Z − Z₀)] / [r₁₃(X − X₀) + r₂₃(Y − Y₀) + r₃₃(Z − Z₀)]    (3)
The distortion components are modelled by: 
Δξ = (ξ − ξ₀) · (k₁ · ρ² + k₂ · ρ⁴ + k₃ · ρ⁶)
Δη = (η − η₀) · (k₁ · ρ² + k₂ · ρ⁴ + k₃ · ρ⁶)    (4)

where ρ is the distance from the centre of the image (radius)
and k₁, k₂ and k₃ are the radial distortion coefficients. Other
types of distortion have not been considered.
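A compact sketch of the projection of one object point with equations (3) and (4) follows. The distortion is applied to the undistorted image coordinates, a common first-order treatment; the paper does not state whether it iterates:

```python
import numpy as np

def project_point(P, X0, R, c, xi0, eta0, k=(0.0, 0.0, 0.0)):
    """Project object point P = (X, Y, Z) onto the image with the
    collinearity model of eq. (3), adding the radial distortion of
    eq. (4). R is the 3x3 external-orientation rotation matrix,
    c the principal distance, (xi0, eta0) the principal point,
    k the coefficients (k1, k2, k3)."""
    dX = np.asarray(P, dtype=float) - np.asarray(X0, dtype=float)
    den = R[0, 2] * dX[0] + R[1, 2] * dX[1] + R[2, 2] * dX[2]
    # undistorted coordinates relative to the principal point
    xi_u  = -c * (R[0, 0] * dX[0] + R[1, 0] * dX[1] + R[2, 0] * dX[2]) / den
    eta_u = -c * (R[0, 1] * dX[0] + R[1, 1] * dX[1] + R[2, 1] * dX[2]) / den
    rho2 = xi_u ** 2 + eta_u ** 2
    scale = k[0] * rho2 + k[1] * rho2 ** 2 + k[2] * rho2 ** 3  # eq. (4)
    return xi0 + xi_u * (1 + scale), eta0 + eta_u * (1 + scale)
```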
2.4 Interpolation of the distance matrix 
The density of the pixels in the digital image is usually 
greater than the density of the cloud of points obtained by the 
laser scanner device. For this reason, when the laser points
are projected onto the digital image, the distance matrix is not
completely filled: distance values are associated only with
some pixels.
In order to fill the distance matrix it is necessary to integrate 
the missing values with an interpolation procedure. 
Figure 3. Interpolation of the distance matrix 
The “weighted average method” has been used: the four
nearest pixels whose distance value is known are
considered (see Fig. 3).
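One plausible reading of this step is an inverse-distance weighted average over the four nearest known pixels; the paper does not specify the exact weight function, so the weights below are an assumption:

```python
import numpy as np

def fill_pixel(known_rc, known_d, r, c):
    """Estimate the distance at pixel (r, c) as the inverse-distance
    weighted average of the four nearest pixels with a known distance.
    known_rc: list of (row, col) of filled pixels; known_d: their distances."""
    known_rc = np.asarray(known_rc, dtype=float)
    known_d = np.asarray(known_d, dtype=float)
    sep = np.hypot(known_rc[:, 0] - r, known_rc[:, 1] - c)
    nearest = np.argsort(sep)[:4]            # four nearest known pixels
    if np.any(sep[nearest] == 0):            # (r, c) already filled
        return float(known_d[nearest][sep[nearest] == 0][0])
    w = 1.0 / sep[nearest]                   # assumed inverse-distance weights
    return float(np.sum(w * known_d[nearest]) / np.sum(w))
```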
	        
Thank you.