2 DATA SETS
On the one side, we are dealing with a point cloud coming from one station scan. The scan is performed by constant-angle ray tilting in a vertical plane, followed by a rotation around the vertical axis. Range measurements present noise that can be reduced by multiple shots. Point coordinates are defined in the scanner reference system.
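To make the scan geometry concrete, here is a minimal sketch of the conversion from one angular measurement to Cartesian scanner coordinates. It assumes the scan stores (tilt, azimuth, range) triplets and a z-up axis convention; the function names and these conventions are ours, not taken from the text.

```python
import numpy as np

def scanner_polar_to_cartesian(tilt, azimuth, rng):
    """Convert one range measurement to Cartesian scanner coordinates.

    tilt    : elevation of the ray in the vertical scanning plane (rad)
    azimuth : rotation of that plane around the vertical axis (rad)
    rng     : measured range
    """
    d = rng * np.cos(tilt)                 # horizontal distance from the vertical axis
    return np.array([d * np.cos(azimuth),  # assumed x toward azimuth = 0
                     d * np.sin(azimuth),
                     rng * np.sin(tilt)])  # assumed z along the vertical axis

def average_shots(ranges):
    """Reduce range noise by averaging several shots fired on the same ray."""
    return float(np.mean(ranges))
```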
On the other side, we have a digital image and its optical model, in an image coordinate system centered on the camera point of view. It is a conic projection whose distortion has been corrected. The camera device records a 5 Mpixel color image, coded on 12 bits.
Figure 1: Point cloud, digital image (details).
3 SYSTEM FORMALIZATION
In this section, we develop the explicit system in which the orientation and location unknowns are considered. It is a bundle adjustment framework, in which we look for the translation and rotation between a terrain reference system (the scanner one) and an image system.
3.1 Distance between segments
Let us project the ends of a 3D segment into the image plane. p_1 and p_2 are the image projections of the segment ends. Expressed in polar coordinates, in the image coordinate system, the straight line passing through these points is defined by:
x \cos\theta + y \sin\theta = \rho \qquad (1)

with:

\cos\theta = \frac{(p_2 - p_1)_y}{\|p_1 p_2\|} \qquad (2)

\sin\theta = \frac{(p_1 - p_2)_x}{\|p_1 p_2\|} \qquad (3)

\rho = \frac{p_1 \wedge p_2}{\|p_1 p_2\|} \qquad (4)
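As a small illustration of relations (1)-(4), the following sketch computes the polar parameters (θ, ρ) of the image line through two projected ends; the function name line_polar_params is our own choice.

```python
import numpy as np

def line_polar_params(p1, p2):
    """Polar parameters (theta, rho) of the image line through p1 and p2,
    so that x*cos(theta) + y*sin(theta) = rho, following (1)-(4)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = np.linalg.norm(p2 - p1)                   # |p1 p2|
    cos_t = (p2[1] - p1[1]) / d                   # relation (2)
    sin_t = (p1[0] - p2[0]) / d                   # relation (3)
    rho = (p1[0] * p2[1] - p1[1] * p2[0]) / d     # relation (4), 2D cross product
    return np.arctan2(sin_t, cos_t), rho
```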
The projection of a 3D point P into the image is given by:

p_x = -f \, \frac{[R(P - T)]_x}{[R(P - T)]_z}, \qquad p_y = -f \, \frac{[R(P - T)]_y}{[R(P - T)]_z} \qquad (5)

where:

• R is the rotation between the world system and the image system,
• T is the translation between the world system and the image system,
• f is the focal length.
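As a hedged illustration of equation (5), here is a direct implementation of the projection; project_point and its argument order are our own naming, not the authors'.

```python
import numpy as np

def project_point(P, R, T, f):
    """Project a world point P into the image plane following equation (5).

    R : 3x3 rotation from the world system to the image system
    T : translation between the two systems (3-vector)
    f : focal length
    """
    X = R @ (np.asarray(P, float) - np.asarray(T, float))   # R(P - T)
    return np.array([-f * X[0] / X[2], -f * X[1] / X[2]])   # (p_x, p_y)
```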
Figure 2: Distance between segments (3D segment, point cloud, 2D matched segment, distances to minimize).

Let P_1 and P_2 denote the 3D segment ends in world coordinates. Consider the vector ñ normal to the plane which contains the 3D segment and the image projection center, such that:
\tilde{n} = R(P_1 - T) \wedge R(P_2 - T) \qquad (6)
Substituting equation (5) into relations (2), (3) and (4) gives:
\cos\theta = \frac{\tilde{n}_x}{\|\tilde{n}\|}; \qquad \sin\theta = \frac{\tilde{n}_y}{\|\tilde{n}\|}; \qquad \rho = -f \, \frac{\tilde{n}_z}{\|\tilde{n}\|} \qquad (7)
A 2D segment end, denoted p = (l_x, l_y, f), lies on the projected line; this yields:

\frac{p \cdot \tilde{n}}{\|\tilde{n}\|} = 0 \qquad (8)

The left term in equation (8) is the distance between the ends of the segment detected in the image and the line supported by the 3D segment projection. This distance is used as the residual for each segment.
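The per-segment residual can be sketched as follows, reusing the pose conventions of equation (5); segment_residual is our own name and the code is only an illustration of equations (6) and (8).

```python
import numpy as np

def segment_residual(p_end, P1, P2, R, T, f):
    """Residual of equation (8) for one detected 2D segment end.

    p_end  : (x, y) image coordinates of the detected segment end
    P1, P2 : ends of the matched 3D segment, world coordinates
    R, T, f: rotation, translation and focal length of equation (5)
    """
    n = np.cross(R @ (np.asarray(P1, float) - np.asarray(T, float)),
                 R @ (np.asarray(P2, float) - np.asarray(T, float)))  # equation (6)
    p = np.array([p_end[0], p_end[1], f])            # end lifted to (l_x, l_y, f)
    return float(p @ n / np.linalg.norm(n))          # equation (8)
```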
3.2 Distance between points
Following the same scheme, one can define the distance between two matched points: reaching the least-squares distance between the image point and the 3D point projection amounts to minimizing the sum of squared distances along the X-axis and along the Y-axis.
Cancelling the expected distance along the X-axis gives:

p_x + f \, \frac{[R(P - T)]_x}{[R(P - T)]_z} = 0 \qquad (9)
Along the Y-axis:

p_y + f \, \frac{[R(P - T)]_y}{[R(P - T)]_z} = 0 \qquad (10)
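A corresponding sketch of the two point residuals, under the same assumptions as the previous snippets (our own function name, the pose conventions of equation (5)):

```python
import numpy as np

def point_residuals(p_img, P, R, T, f):
    """X- and Y-axis residuals of equations (9) and (10) for one matched point.

    p_img : (x, y) observed image point
    P     : matched 3D point, world coordinates
    """
    X = R @ (np.asarray(P, float) - np.asarray(T, float))   # R(P - T)
    rx = p_img[0] + f * X[0] / X[2]                          # equation (9)
    ry = p_img[1] + f * X[1] / X[2]                          # equation (10)
    return rx, ry
```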
3.3 Global energy function
For segments, the energy function is derived from (8):

E_s = \frac{1}{\sigma_s^2} \sum_{i=1}^{N_s} \left( \frac{p_i \cdot \tilde{n}_i}{\|\tilde{n}_i\|} \right)^2 \qquad (11)
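As a closing illustration, here is a hedged sketch of a global least-squares energy combining the segment residuals of (8) with the point residuals of (9)-(10), reusing the segment_residual and point_residuals snippets above; the way the two terms are weighted (sigma_s, sigma_p) is an assumption, not the authors' exact formulation.

```python
def global_energy(seg_obs, pt_obs, R, T, f, sigma_s=1.0, sigma_p=1.0):
    """Sum of squared, variance-weighted residuals over all matched features.

    seg_obs : iterable of (p_end, P1, P2) segment matches
    pt_obs  : iterable of (p_img, P) point matches
    """
    E = 0.0
    for p_end, P1, P2 in seg_obs:
        E += (segment_residual(p_end, P1, P2, R, T, f) / sigma_s) ** 2
    for p_img, P in pt_obs:
        rx, ry = point_residuals(p_img, P, R, T, f)
        E += (rx / sigma_p) ** 2 + (ry / sigma_p) ** 2
    return E
```

Minimizing such an energy over the rotation R and translation T is what the bundle adjustment framework of this section sets up.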