… from different viewpoints is available. In order to connect point clouds from different stations, tie point information is frequently provided using reflecting targets or spheres of defined diameter, which are distributed in the object area. After measurement, the operator manually specifies the approximate positions of these targets. Based on this information, each target can be localised precisely and automatically in the laser data and used as tie point information in the subsequent registration process.
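The registration step itself is not formalised in the paper; as a minimal sketch, once the target or sphere centres have been localised in two scans, the rigid transformation between the stations can be estimated from the matched centres with a standard SVD-based least-squares fit. The function name and array conventions below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rigid_transform_from_tie_points(src, dst):
    """Estimate R, t such that dst ~ R @ src + t from matched tie points.

    src, dst: (N, 3) arrays of corresponding tie-point coordinates,
    e.g. sphere centres observed from two scanner stations.
    Standard SVD-based least-squares fit (Kabsch/Horn, no scale).
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    # Guard against a reflection in the least-squares solution
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```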
As demonstrated in Figure 10, this coregistration can alternatively be performed based on an ICP (iterative closest point) algorithm. By these means, a coregistration of the laser data to the coarse three-dimensional model of the building is feasible. Thus, similar to the orientation of the collected images, control point information is provided from the existing building model in order to speed up the data processing. In principle, this technique can also be used for the georeferencing of LIDAR data collected from a moving platform (Früh & Zakhor 2003).
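The ICP variant used is not spelled out; a minimal point-to-point ICP loop, assuming the coarse building model has been sampled into a reference point cloud and a rough initial alignment exists, could look as follows. The KD-tree correspondence search, iteration limit and convergence test are illustrative choices, and the helper rigid_transform_from_tie_points() is the sketch given above.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_to_model(scan, model_pts, max_iter=50, tol=1e-6):
    """Point-to-point ICP aligning a laser scan (N, 3) to model_pts (M, 3).

    model_pts would be points sampled from the coarse building model;
    a rough initial alignment (e.g. from the georeferenced model) is assumed.
    Returns the accumulated rotation, translation and the moved scan.
    """
    tree = cKDTree(model_pts)
    R_total, t_total = np.eye(3), np.zeros(3)
    moved = scan.copy()
    prev_err = np.inf
    for _ in range(max_iter):
        # Closest-point correspondences between the moved scan and the model
        dist, idx = tree.query(moved)
        R, t = rigid_transform_from_tie_points(moved, model_pts[idx])
        moved = moved @ R.T + t
        # Accumulate the incremental transformation
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total, moved
```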
5. CONCLUSION 
If existing urban models, which are frequently available from airborne data collection, are used for applications like realistic visualisations of urban scenes from terrestrial viewpoints, a refinement using terrestrial data collection is required. The evaluation of the terrestrial data can be simplified considerably by integrating these existing building models into the respective processing steps. Within this paper, the combined processing was mainly demonstrated for georeferencing the terrestrial data sets by coregistration to the given models in order to provide image texture. However, this combined processing is even more important if terrestrial data sets are used for geometric improvement of the given models.
6. REFERENCES 
Baltsavias, E., Grün, A. & van Gool, L. [2001]. Automatic Extraction of Man-Made Objects From Aerial and Space Images (III). A.A. Balkema Publishers.
Bosse, M., De Couto, D. & Teller, S. [2000]. Eyes of Argus: Georeferenced Imagery in Urban Environments. GPS World 11(4), pp.20-30.
Böhm, J. [2004]. Multi-image fusion for occlusion-free facade texturing. To be published in IAPRS Vol. 35, Part B5.
Früh, C. & Zakhor, A. [2003]. Constructing 3D City Models by Merging Ground-Based and Airborne Views. IEEE Computer Graphics and Applications, Special Issue Nov/Dec.
Haala, N. & Böhm, J. [2003]. A multi-sensor system for posi- 
tioning in urban environments. ISPRS Journal of Photogram- 
metry and Remote Sensing 58(1-2), pp.31-42. 
Hoff, B. & Azuma, R. [2000]. Autocalibration of an Electronic 
Compass in an Outdoor Augmented Reality System. Proceed- 
ings of International Symposium on Augmented Reality, 
pp.159-164. 
Pollefeys, M., Koch, R., Vergauwen, M. & Van Gool, L. [2000]. Automated reconstruction of 3D scenes from sequences of images. ISPRS Journal of Photogrammetry and Remote Sensing 55(4), pp.251-267.
Schneider, D. & Maas, H.-G. [2003]. Geometric Modelling and 
Calibration of a High Resolution Panoramic Camera. Optical 3- 
D Measurement Techniques VI. Vol. II, pp.122-129. 
Stuttgart University [2003]. Nexus World Models for Mobile 
Context-Based Systems. http://www.nexus.uni-stuttgart.de/. 
Wolf, M. [1999]. Photogrammetric Data Capture and Calculation for 3D City Models. Photogrammetric Week '99, pp.305-312.