The answer to the first question has already been established in
section 3.2, where it was verified through a simulation
procedure that conjugate points in overlapping strips are related
to each other through a transformation function involving
constant shifts and a rotation angle across the flight direction.
Therefore, a six-parameter rigid-body transformation (three
shifts and three rotation angles) can be used as the
transformation function relating overlapping strips in the
presence of the bore-sighting spatial and angular biases. The
answers to the remaining questions depend on the nature of the
utilized primitives. The following subsections present the
answers to the above questions as they pertain to the selected
primitives.
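For reference, a general form of such a six-parameter rigid-body transformation, relating a point X_A in one strip to its conjugate X_B in the overlapping strip, is given below; the Euler-angle parameterization of the rotation matrix is an assumption shown for illustration rather than a statement of the paper's exact convention:

\[ \mathbf{X}_B = \mathbf{R}(\omega,\phi,\kappa)\,\mathbf{X}_A + \mathbf{T}, \qquad \mathbf{T} = \begin{bmatrix} X_T & Y_T & Z_T \end{bmatrix}^{T} \]

where \(\mathbf{R}\) is the orthogonal rotation matrix built from the three rotation angles and \(\mathbf{T}\) collects the three shifts.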
4.1 Primitives Extraction and Matching
Since the LiDAR footprints are irregularly distributed, no point-
to-point correspondence can be assumed between overlapping
strips. In this regard, other primitives must be investigated. In
this work, the use of linear features derived from the
intersection of neighbouring planar patches is proposed. LiDAR
point clouds provide high redundancy over planar surfaces;
therefore, the plane parameters can be derived with high accuracy
through an adjustment procedure (e.g., least-squares plane fitting).
The larger the planar surface, the more the effect of the point
cloud noise is reduced. Consequently, high-accuracy linear features
can be extracted by intersecting neighbouring fitted planes. To do
so, an environment
for the extraction and matching of linear features in overlapping
strips was developed. The process starts by displaying the
LiDAR intensity images of the overlapping strips, in which the
operator selects an area where linear features are likely to exist
(e.g., a roof ridge line). The user clicks on the centre of the area after
defining the radius of a circle, within which the original LiDAR
footprints will be extracted. It should be noted that the LiDAR
intensity images are only used for visualization purposes. The
user only needs to establish the area of interest in one of the
strips; the corresponding areas in the other strips are then defined
automatically. Figure 2a shows the specified area in one of the strips
as well as the original LiDAR footprints in that area. Then a
segmentation technique (Kim et al., 2007) is used to identify
planar patches in the point cloud within the selected area. This
segmentation procedure is independently run on the point cloud
for all the overlapping strips. The outcome from such
segmentation is aggregated sets of points representing planar
patches in the selected area (bottom right portion in Figure 2b).
For the extraction of linear features, neighbouring planar patches
are identified and their plane parameters are determined. Then, the
neighbouring planes are intersected to produce an infinite
straight line. Using the segmented patches, the infinite line, and
a given buffer, the end points of the intersected line segment can
be defined (top left portion in Figure 2b). This procedure is
repeated for several areas within the overlapping portion of the
involved strips.
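As an illustration of this extraction step, the following minimal sketch (in Python with NumPy, using illustrative function names that are not part of the described software environment) shows how a plane can be fitted to a segmented patch, how two neighbouring planes can be intersected into an infinite line, and how the segment end points can be clipped using the patch points within a given buffer:

import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (unit normal n, centroid c) such that
    the plane satisfies n . (x - c) = 0. The normal is the direction of least
    variance of the centred point set (smallest singular vector)."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    n = vt[-1]
    return n / np.linalg.norm(n), c

def intersect_planes(n1, c1, n2, c2):
    """Intersect two planes into an infinite 3D line (point p0, unit direction d)."""
    d = np.cross(n1, n2)
    if np.linalg.norm(d) < 1e-8:        # nearly parallel planes: no stable intersection
        return None
    d = d / np.linalg.norm(d)
    # Solve for a point on both planes; the third equation pins the component
    # along the line direction to zero (any value would do).
    A = np.vstack([n1, n2, d])
    b = np.array([n1 @ c1, n2 @ c2, 0.0])
    p0 = np.linalg.solve(A, b)
    return p0, d

def clip_to_segment(p0, d, patch_points, buffer):
    """Keep patch points within `buffer` of the infinite line, project them onto
    the line, and take the extreme projections as the segment end points."""
    v = patch_points - p0
    t = v @ d
    dist = np.linalg.norm(v - np.outer(t, d), axis=1)
    t = t[dist < buffer]
    return p0 + t.min() * d, p0 + t.max() * d

The interpretation of the buffer as a maximum point-to-line distance is an assumption made for this sketch.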
The outcome of the extraction procedure is a set of linear
features in overlapping strips. Due to the nature of the LiDAR
data acquisition (e.g., scan angle, surface normal, surface
reflectivity, occlusions), there is no guarantee of a one-to-one
correspondence between the primitives extracted from
overlapping strips. To solve the correspondence problem, one
has to utilize the attributes of the extracted primitives.
Conjugate linear features can be automatically matched using
the normal distance, parallelism, and the percentage of overlap
between candidate lines in overlapping strips (Figure 3). A
graphic visualization of matched linear features is presented to
the user for final confirmation of the validity of the matched
primitives.
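A minimal sketch of such an attribute-based matching test is given below; the thresholds and the exact formulation of the three cues are assumptions chosen for illustration, not values taken from the described implementation:

import numpy as np

def is_conjugate(seg_a, seg_b, max_angle_deg=3.0, max_dist=0.5, min_overlap=0.5):
    """Test whether two line segments (each a pair of 3D end points) are likely
    conjugates using the three cues named in the text: parallelism, normal
    distance, and percentage of overlap. Thresholds are illustrative."""
    a0, a1 = np.asarray(seg_a[0], float), np.asarray(seg_a[1], float)
    b0, b1 = np.asarray(seg_b[0], float), np.asarray(seg_b[1], float)
    da = (a1 - a0) / np.linalg.norm(a1 - a0)
    db = (b1 - b0) / np.linalg.norm(b1 - b0)

    # Parallelism: angle between direction vectors (sign-insensitive).
    angle = np.degrees(np.arccos(np.clip(abs(da @ db), -1.0, 1.0)))

    # Normal distance from the midpoint of segment B to the infinite line of A.
    v = 0.5 * (b0 + b1) - a0
    dist = np.linalg.norm(v - (v @ da) * da)

    # Overlap percentage: projection of segment B onto segment A's extent.
    t0, t1 = sorted([(b0 - a0) @ da, (b1 - a0) @ da])
    len_a = np.linalg.norm(a1 - a0)
    overlap = max(0.0, min(t1, len_a) - max(t0, 0.0)) / len_a

    return angle < max_angle_deg and dist < max_dist and overlap > min_overlap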
Figure 2. Area of interest selection and LiDAR point cloud
extraction (a), and extracted linear features by intersection of
segmented planar patches in the area of interest (b)
Figure 3. Matching of conjugate linear features in overlapping
strips
4.2 Similarity Measure
This section introduces the similarity measure, which incorporates
the matched primitives together with the established transformation
function to mathematically describe their correspondence.
Conjugate lines will be represented by their end points.
Figure 4. Underlying concept for the incorporation of linear
features in a line-based approach for the determination of the
transformation parameters.
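Although the full derivation of the similarity measure follows beyond this excerpt, a common line-based formulation, shown here only as an assumed illustration and not as the paper's exact measure, treats the perpendicular distance between each transformed end point of a line in one strip and the conjugate infinite line in the other strip as the residual to be minimized when estimating the six transformation parameters:

import numpy as np
from scipy.spatial.transform import Rotation

def line_residuals(params, end_points_a, line_b):
    """Residuals for one matched line pair under a six-parameter rigid-body
    transformation params = (tx, ty, tz, omega, phi, kappa). Each transformed
    end point of the line from strip A should fall on the conjugate line from
    strip B; the residual is its perpendicular distance to that line.
    This is a generic line-based formulation used for illustration only."""
    t = np.asarray(params[:3])
    R = Rotation.from_euler("xyz", params[3:], degrees=False).as_matrix()
    p_b, d_b = line_b                      # point on line B and its unit direction
    res = []
    for p in end_points_a:
        q = R @ np.asarray(p) + t          # transform end point into strip B frame
        v = q - p_b
        res.append(np.linalg.norm(v - (v @ d_b) * d_b))
    return res

Stacking such residuals over all matched line pairs and minimizing them with a nonlinear least-squares solver would yield estimates of the three shifts and three rotation angles.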