$$
(\mathbf{R}\,\mathbf{c}_D + \mathbf{t} - \mathbf{c}_M)\cdot\mathbf{n}_M = 0,
\qquad
(\mathbf{R}\,\mathbf{n}_D)\cdot\mathbf{n}_M = 1,
\qquad
\mathbf{R} \approx
\begin{pmatrix}
1 & -\alpha_3 & \alpha_2 \\
\alpha_3 & 1 & -\alpha_1 \\
-\alpha_2 & \alpha_1 & 1
\end{pmatrix}
\tag{3.3}
$$
The rotation angles (α_1, α_2, α_3) and the translation components (t_1, t_2, t_3) are the six unknowns to be determined. Each corresponding pair of planes E_D, E_M yields two linear equations (3.3); therefore, at least three pairs have to be identified in the data to compute the rigid transformation (R, t). In urban areas, many more correspondences can generally be found. The resulting overdetermined system is solved approximately in a least-squares sense via the normal equations. In addition, the area of the planar patches can be used as a weighting factor. Finally, the corrected position of the sensor in the model coordinate system is given as R·p_GPS + t, and the orientation is corrected to R·R_IMU.
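A minimal numerical sketch of this estimation step is given below. It assumes the reconstruction of (3.3) above; the function name, the (centroid, normal) patch representation, and the use of NumPy are illustrative choices and not part of the original system.

```python
import numpy as np

def estimate_rigid_correction(data_planes, model_planes, areas):
    """Weighted least-squares solution of the linearized system (3.3).

    data_planes / model_planes: sequences of (centroid, unit normal) pairs
    (c_D, n_D) and (c_M, n_M) for corresponding planar patches.
    areas: patch areas, used as weights as suggested in the text.
    Unknowns x = (a1, a2, a3, t1, t2, t3), with R ~ I + [a]_x (small angles).
    """
    rows, rhs, weights = [], [], []
    for (c_d, n_d), (c_m, n_m), area in zip(data_planes, model_planes, areas):
        c_d, n_d = np.asarray(c_d, float), np.asarray(n_d, float)
        c_m, n_m = np.asarray(c_m, float), np.asarray(n_m, float)
        # (R c_D + t - c_M) . n_M = 0  ->  (a x c_D) . n_M + t . n_M = (c_M - c_D) . n_M
        rows.append(np.concatenate([np.cross(c_d, n_m), n_m]))
        rhs.append(np.dot(c_m - c_d, n_m))
        weights.append(area)
        # (R n_D) . n_M = 1           ->  (a x n_D) . n_M = 1 - n_D . n_M
        rows.append(np.concatenate([np.cross(n_d, n_m), np.zeros(3)]))
        rhs.append(1.0 - np.dot(n_d, n_m))
        weights.append(area)
    A, b, W = np.asarray(rows), np.asarray(rhs), np.diag(weights)
    # Normal equations of the weighted overdetermined system: (A^T W A) x = A^T W b
    x = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
    a, t = x[:3], x[3:]
    R = np.eye(3) + np.array([[0.0, -a[2], a[1]],
                              [a[2], 0.0, -a[0]],
                              [-a[1], a[0], 0.0]])
    return R, t
```

Note that at least three pairs with sufficiently different normal directions are required; otherwise the normal-equation matrix becomes singular.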
4. EXPERIMENTS
We tested the proposed methods on real sensor data recorded 300 m above the old town of Kiel, Germany. Data from four flights over this urban terrain led to the database shown in Figure 4. Two additional flights were used to demonstrate the concept of terrain-based navigation (Figure 8). For this purpose, 1000 randomly chosen displacement vectors with magnitudes in the range [5 m, 20 m] were added to the exact sensor positions, and we checked whether these offsets were corrected automatically. Figure 10 shows the average displacement between the calculated and the exact sensor position plotted against the number of matching pairs of planes. With our data, the average offset in sensor position was reduced to 1.5 m (standard deviation: 0.5 m) whenever at least 25 pairs of associated surfaces could be found. These numbers most likely depend on additional conditions, e.g. aircraft altitude, aircraft speed, and the number and orientation of facades and rooftops.
Figure 10. Average displacement against number of planes.
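The perturbation experiment described above can be sketched as follows. This is a minimal illustration; `register_to_model` is a hypothetical stand-in for the complete plane-matching correction of Section 3, and the function and parameter names are not part of the original system.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_offset(low=5.0, high=20.0):
    """Random 3-D displacement with a magnitude drawn from [low, high] metres."""
    direction = rng.normal(size=3)
    direction /= np.linalg.norm(direction)
    return rng.uniform(low, high) * direction

def evaluate_position_recovery(true_position, register_to_model, trials=1000):
    """Perturb the exact sensor position `trials` times, re-register, and
    report mean and standard deviation of the residual offset in metres.

    `register_to_model` is a hypothetical callable that maps a perturbed
    position to a corrected one using the plane-based registration.
    """
    true_position = np.asarray(true_position, float)
    residuals = []
    for _ in range(trials):
        perturbed = true_position + random_offset()
        corrected = np.asarray(register_to_model(perturbed), float)
        residuals.append(np.linalg.norm(corrected - true_position))
    residuals = np.asarray(residuals)
    return residuals.mean(), residuals.std()
```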
5. CONCLUSION AND FUTURE WORK
The examples presented in this paper were obtained with an experimental sensor system, for which data analysis can currently only be done offline to show the feasibility of the proposed approach. Nevertheless, we expect that all computations can be accomplished in real time with an efficient implementation and appropriate hardware. In our experiments, we were able to align the model and the ALS data such that matching objects show an average distance of 8 cm after registration. This absolute accuracy is not necessarily transferable to the sensor position (see Section 4). With a larger distance between the helicopter and the terrain, imprecision in the sensor orientation has a considerably higher impact on the overall displacement. For example, an angular error of 0.1° would lead to a shift of 1 m at a distance of 600 m. The absolute accuracy of the estimated sensor position improves significantly when considering larger areas and/or shorter ranges, e.g. when approaching the terrain at low altitude. In future work, we will analyze these influences in more detail, and we will focus on on-line change detection.
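A quick small-angle check confirms the order of magnitude of the quoted figure:

$$
\Delta \approx r \cdot \theta = 600\,\mathrm{m} \cdot 0.1^{\circ} \cdot \frac{\pi}{180^{\circ}} \approx 1.05\,\mathrm{m}.
$$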