2. EXPRESSION OF REGISTRATION PRIMITIVES

2.1 The Selection of Registration Primitives
The registration of imagery and LiDAR data can be decomposed into the selection of registration primitives, the similarity measurement, the transformation model and the strategy for matching. Point features have a wide distribution in remote sensing data and are easy to extract, but they are sensitive to different sensor data's noise, scene changes and so on. The point features used for registration include the centroids of closed regions such as forest, lake and so on (Goshtasby, 1986) and the intersection points of lines or curves with large radian (Ali), including ground objects' corners and intersection lines. As registration primitives, however, point features have the drawback of discreteness, which limits their use. Moreover, linear features have the property of scalability: any two points on a line can represent the whole linear feature, and in the transformation model the constrained conditions provided by linear features are various, since a line can be expressed by any two points on it or be expressed by line parameters. Extracting exact feature points from LiDAR data is unreliable, since LiDAR data are made up of discrete return echoes rather than continuous image values. Therefore, it is difficult to find exactly corresponding points between LiDAR data and images to serve as registration primitives. Though planar features could also be used as registration primitives,
the extraction of them is not an easy task, either from imagery or from point clouds. Straight lines are therefore chosen as the registration primitives in our algorithm, especially in urban areas.
2.2 The Extraction of Linear Primitives
In our procedure, we use accurate lines as the registration primitives in the transformation function. Bearing this in mind, and considering the line extraction strategies mentioned above, we propose a new algorithm based on the fact that many sidewalls of buildings can reflect laser pulses: since the data acquisition scanning angle is not zero degrees most of the time, many laser echoes are returned from the sidewalls of buildings (Fig. 1). Building on this observation, we propose the so-called Differential Volume Statistics Method (DVSM) to extract precise linear features, which makes use of the property that the density of the point cloud increases dramatically in the vicinity of edge lines (Fig. 2) (Ma, 2010).
Figure 1: Extraction of linear features. The red lines are the ideal linear features.
The details of DVSM are as follows (a code sketch of the procedure is given after the list):
(1) Firstly, use the progressive TIN filtering method to obtain the ground points and the grid DEM;
(2) Secondly, use a gradient operator to detect edges as the initial linear features in the depth image, which is generated from the LiDAR points by height;
(3) Scan the initial linear features and project the current line onto the DEM; a feature plane is thereby formed whose normal vector is $\vec{n}$, as shown in Figure 3.
Figure 2: The side wall of the building

Figure 3: The formation of the feature voxel
(4) Move the feature plane a small distance $dn$ along the direction of the normal vector $\vec{n}$; this sweep generates the feature voxel $dv$.
(5) Count the number of points $N$ inside $dv$; if $N > T$ (a point-count threshold), go to step (6), otherwise return to step (3);
(6) Sort the laser points in $dv$ by their $Z$ values and define $P$ as the point with the median $Z$ value, then search for a point $Q$ that meets the following conditions: (a) $\lvert Z_P - Z_Q \rvert < h$, to make sure that point $Q$ also lies in the wall area; (b) $\sqrt{(X_P - X_Q)^2 + (Y_P - Y_Q)^2} > D$, where $h$ is the elevation difference threshold and $D$ is the distance threshold in the x-y plane, to make sure that points $P$ and $Q$ are not too close to each other; (c) the $Z$ values of points $P$ and $Q$ can be interpolated using the corresponding original linear feature. Then return to step (3) unless all the original linear features have been analysed;
(7) Output all the linear features, which can be used as registration primitives.
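The following Python sketch illustrates the core of steps (3)-(6) for a single initial edge line, assuming the initial lines have already been detected in the depth image; the function name, the default thresholds and the construction of a vertical feature plane through the projected line are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def dvsm_refine_line(points, line_start, line_end, dn=0.5,
                     count_threshold=50, h=2.0, min_xy_dist=1.0):
    """One DVSM check for a single initial linear feature (steps (3)-(6)).

    points               : (N, 3) array of LiDAR points (X, Y, Z)
    line_start, line_end : 3D endpoints of the edge line projected onto the DEM
    dn                   : thickness of the feature voxel along the plane normal
    count_threshold      : threshold T of step (5)
    h, min_xy_dist       : thresholds h and D of step (6)
    """
    points = np.asarray(points, dtype=float)
    line_start = np.asarray(line_start, dtype=float)
    line_end = np.asarray(line_end, dtype=float)

    # The feature plane contains the projected line and the vertical direction,
    # so its normal n is the horizontal unit vector perpendicular to the line.
    d = line_end - line_start
    d_xy = np.array([d[0], d[1], 0.0])
    d_xy /= np.linalg.norm(d_xy)
    n = np.array([-d_xy[1], d_xy[0], 0.0])

    # Signed offset of each point from the plane, and its position along the line.
    rel = points - line_start
    dist_n = rel @ n
    t = (rel @ d_xy) / np.linalg.norm(d[:2])

    # Steps (4)-(5): collect points inside the thin feature voxel dv.
    in_voxel = (dist_n >= 0.0) & (dist_n <= dn) & (t >= 0.0) & (t <= 1.0)
    voxel_pts = points[in_voxel]
    if len(voxel_pts) <= count_threshold:      # N <= T: no wall evidence, reject
        return None

    # Step (6): P is the point with the median Z value inside dv.
    voxel_pts = voxel_pts[np.argsort(voxel_pts[:, 2])]
    P = voxel_pts[len(voxel_pts) // 2]

    # Q lies at a similar height (|Z_P - Z_Q| < h) but far from P in the x-y plane.
    dz = np.abs(voxel_pts[:, 2] - P[2])
    dxy = np.linalg.norm(voxel_pts[:, :2] - P[:2], axis=1)
    candidates = voxel_pts[(dz < h) & (dxy > min_xy_dist)]
    if len(candidates) == 0:
        return None
    Q = candidates[np.argmax(np.linalg.norm(candidates[:, :2] - P[:2], axis=1))]
    return P, Q
```

The returned pair (P, Q) for each accepted line can then be used, as in step (6c), to interpolate the Z values of the corresponding original linear feature.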
2.3 Registration Primitives’ Expression
Figure 4. The line AB on the LiDAR point data and the
corresponding points in the image space
Firstly, take the line AB (Figure 4) extracted from the LiDAR point data as a line in object space. As the point data are three-dimensional, AB is a three-dimensional line. P, a point on AB, has a corresponding image point P' in the image. An unknown parameter $\lambda$ is introduced, so the coordinates of P can be expressed by the coordinates of A and B, as well as $\lambda$, as $[X_P \;\; Y_P \;\; Z_P]^{T}$, as shown by formula (1):
$$
\begin{bmatrix} X_P \\ Y_P \\ Z_P \end{bmatrix}
=
\begin{bmatrix} X_A \\ Y_A \\ Z_A \end{bmatrix}
+ \lambda
\begin{bmatrix} X_B - X_A \\ Y_B - Y_A \\ Z_B - Z_A \end{bmatrix}
\qquad (1)
$$
where
$(X_A, Y_A, Z_A)$ is the three-dimensional coordinate of point A on the line in LiDAR space;
$(X_B, Y_B, Z_B)$ is the three-dimensional coordinate of point B on the line in LiDAR space.
Since this point P has a corresponding point P' in image space (assumed not to lie in an occluded area), $(X_P, Y_P, Z_P)$ can be considered as the correspondence of P' in LiDAR point space. Different values of the parameter $\lambda$ correspond to a series of such corresponding points. It can be seen that P can be expressed by the line AB and the parameter $\lambda$ without seeking corresponding points in the LiDAR data; this therefore overcomes the difficulty of selecting tie points in LiDAR point data.
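As a small numerical illustration of formula (1), the snippet below (the function name and coordinate values are purely illustrative) generates candidate object-space points P along an extracted line AB for a series of $\lambda$ values; in the adjustment, $\lambda$ would presumably be carried as an additional unknown per correspondence.

```python
import numpy as np

def point_on_line(A, B, lam):
    """Formula (1): object-space point P on the line AB, parameterised by lambda."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    return A + lam * (B - A)

# Endpoints of an extracted 3D edge line in LiDAR space (example coordinates).
A = np.array([3521.4, 8750.2, 46.8])
B = np.array([3542.9, 8761.7, 47.1])

# Each value of lambda yields one candidate correspondence for an image point P'.
for lam in np.linspace(0.0, 1.0, 5):
    print(f"lambda = {lam:.2f} -> P = {point_on_line(A, B, lam)}")
```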
3. THE REGISTRATION MODEL BASED ON THE COLLINEARITY EQUATION

The LiDAR points are taken as the images' object space. According to the collinearity equation (Wang Zhizhuo, 1979), the perspective center of the photograph, the image point and the object space point lie on the same straight line, which is expressed as the mathematical equation shown in formula (2):
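For reference, the collinearity equations commonly take the following standard form (the notation here follows the usual photogrammetric convention and may differ from that used in formula (2)):

$$
x - x_0 = -f\,\frac{a_1(X - X_S) + b_1(Y - Y_S) + c_1(Z - Z_S)}{a_3(X - X_S) + b_3(Y - Y_S) + c_3(Z - Z_S)}, \qquad
y - y_0 = -f\,\frac{a_2(X - X_S) + b_2(Y - Y_S) + c_2(Z - Z_S)}{a_3(X - X_S) + b_3(Y - Y_S) + c_3(Z - Z_S)}
$$

where $(x_0, y_0, f)$ are the interior orientation elements, $(X_S, Y_S, Z_S)$ is the perspective center, $(X, Y, Z)$ is the object space point (here a LiDAR point expressed through formula (1)), and $a_i, b_i, c_i$ are the elements of the rotation matrix built from the exterior orientation angles.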