The matching subregions are 3, 4, 5, 6, 7, 8, 9, and 10 for the top panorama, and 1, 2, 3, 4, 9, and 10 for the second panorama.
Section   1    2    3    4    5    6    7    8    9    10
Top       4    4    5    7    6    7    9    11   10   5
Bottom    9    11   9    5    1    3    3    4    5    8

Figure 4: Number of objects in each section of the two panorama images in Figure 5.
When comparing the first image to the top panorama we have to consider 488 different configurations (of 7 objects each), and 410 different configurations for the second panorama image. After applying the distance and orientation similarity metrics with weights of 0.75 and 0.25 respectively, the algorithm correctly identified the matching configuration, highlighted in the corresponding panorama. The orientation angle was recovered with an accuracy of 1 degree.
The total time required to process the distance similarity metric
for the first image against both panoramic images was 56
seconds. An additional 0.64 seconds were required for the
computation of the orientation metric for the 35 highest
candidates that survived the distance comparison. Thus, the total
time required to recover the orientation of image 1 was
approximately 57 seconds.
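The two-stage matching described above (a cheap distance filter over all candidate configurations, followed by the orientation metric for the 35 best survivors, combined with weights 0.75 and 0.25) can be sketched as follows. The bodies of `distance_similarity` and `orientation_similarity` are hypothetical placeholders so the sketch runs, not the metrics actually defined in the paper:

```python
# Hedged sketch of the two-stage ranking strategy used in the experiments.
# The two similarity functions below are illustrative stand-ins only.

def distance_similarity(query, candidate):
    # Placeholder: inverse of the summed absolute difference of
    # inter-object distances (higher = more similar).
    return 1.0 / (1.0 + sum(abs(q - c) for q, c in zip(query, candidate)))

def orientation_similarity(query, candidate):
    # Placeholder with the same shape as distance_similarity.
    return 1.0 / (1.0 + abs(sum(query) - sum(candidate)))

def rank_configurations(query, candidates, keep=35, w_dist=0.75, w_orient=0.25):
    # Stage 1: cheap distance metric over every candidate configuration;
    # keep only the `keep` highest-scoring survivors.
    survivors = sorted(candidates,
                       key=lambda c: distance_similarity(query, c),
                       reverse=True)[:keep]
    # Stage 2: costlier orientation metric for the survivors only,
    # combined with the distance score using the 0.75 / 0.25 weights.
    scored = [(w_dist * distance_similarity(query, c) +
               w_orient * orientation_similarity(query, c), c)
              for c in survivors]
    scored.sort(reverse=True)
    return scored  # best-scoring configuration first

query = [1.0, 2.0, 3.0]
candidates = [[1.1, 2.0, 2.9], [5.0, 1.0, 0.5], [1.0, 2.0, 3.0]]
best_score, best = rank_configurations(query, candidates)[0]
print(best)  # the exact match ranks first: [1.0, 2.0, 3.0]
```

Running the expensive metric only on the survivors of the cheap one is what keeps the total time (56 s + 0.64 s above) dominated by the first stage.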
For the second image there are 890 different configurations (of 5 objects each) for the first panorama and 772 for the second one. In this case the algorithm successfully retrieves the correct configuration when compared to the second panorama image, while the correct configuration was ranked 5th among the candidate configurations of the first panorama image. We can see that a scene comprising fewer objects runs the risk of numerous potential matches, increasing the chances that random configurations will closely resemble our scene. However, configurations of fewer points produce substantial gains in processing time. For this experiment the computation of the distance similarity metric for all possible combinations in both synthetic panoramas took 44 seconds, and an additional 0.1 second was required for the computation of the orientation metric for the 35 highest candidates. The accuracy of azimuth recovery was approximately 2.5 degrees for this set-up.
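The ambiguity/speed trade-off noted here has a simple combinatorial illustration. Assuming, purely for illustration, that candidate configurations are k-element subsets of the n objects in a region (an assumption, not the paper's exact enumeration rule), fewer objects per configuration means more candidates to examine, and hence more chances for accidental matches:

```python
from math import comb

# Illustrative only: with n = 10 objects in a region, 5-object
# configurations outnumber 7-object ones, so the weaker 5-point
# constraint admits more candidate (and more random) matches.
n = 10
for k in (7, 5):
    print(k, comb(n, k))  # 7 -> 120 subsets, 5 -> 252 subsets
```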
5. COMMENTS
In this paper we presented a novel hierarchical approach to rapidly recover the orientation of motion imagery in urban areas. We proceed by comparing object configurations depicted in this imagery to potential configurations that can be formed using the virtual database. By considering abstract properties such as distances between objects and their relative positions, we produce a fast algorithm that supports the rapid recovery of image azimuths with accuracies on the order of 1-2 degrees. Considering that the associated computational time requirements remain below 1 minute per incoming frame, the presented approach is well suited to the dynamic nature of motion imagery applications. This is an essential step in supporting the use of motion imagery for the detection of changes and the subsequent updating of virtual models of large urban scenes.
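As a rough sketch of the kind of abstract, orientation-independent properties mentioned here, a configuration of objects can be reduced to its sorted pairwise distances, normalized by the largest one. This hypothetical signature (not the paper's actual metric) is invariant to translation, rotation, and scale:

```python
from itertools import combinations
from math import hypot

def distance_signature(points):
    # Sorted pairwise distances, normalized by the largest one, so the
    # signature is invariant to translation, rotation, and scale.
    d = sorted(hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in combinations(points, 2))
    return [x / d[-1] for x in d]

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
bigger_rotated_square = [(0, 0), (0, 2), (-2, 2), (-2, 0)]
# The same shape at a different scale and orientation yields
# (numerically) the same signature.
s1 = distance_signature(square)
s2 = distance_signature(bigger_rotated_square)
print(all(abs(a - b) < 1e-9 for a, b in zip(s1, s2)))  # True
```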