corresponding to discrete values of the scale parameter. The
number of these discrete levels is user-defined, and its
selection depends on the amount of information conveyed
by the profile and the relevant storage and computational
requirements. Figure 4 (top) shows such a profile scale space
image, with the top row containing the original profile gray
values, and rows underneath that containing progressively
smoother versions of the original signal, corresponding to
gradually larger values of the scale parameter s. Thus, such
an image will have a coordinate system (p, s), with p being
the distance along the profile direction and s being the
scale parameter.
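For illustration, such a profile scale space image could be built along the following lines. This is only a minimal sketch, assuming Gaussian smoothing via SciPy's gaussian_filter1d; the function name profile_scale_space and the chosen scale levels are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def profile_scale_space(profile, sigmas):
    """Stack increasingly smoothed versions of a 1-D gray value profile.

    Row 0 holds the original profile; row k holds the profile smoothed with
    a Gaussian of width sigmas[k-1], so the result can be stored and processed
    as an ordinary 2-D image with coordinates (p, s).
    """
    rows = [np.asarray(profile, dtype=float)]
    for sigma in sigmas:
        rows.append(gaussian_filter1d(rows[0], sigma=sigma))
    return np.vstack(rows)

# Example: a 512-pixel profile smoothed at 8 user-defined scale levels
profile = np.random.default_rng(0).integers(0, 256, size=512)
scales = np.linspace(0.5, 8.0, 8)
pss = profile_scale_space(profile, scales)   # shape (9, 512): rows index s, columns index p
```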
The use of digital image files (rather than simple signal
values as is the case in typical scale analysis applications)
to express the scale behavior of signals has great
advantages, as it permits us to employ digital image
analysis algorithms. To identify scale differences among
conjugate features, we can match their corresponding profile
scale space images. Matching proceeds similarly to least
squares matching, but this time a shift in the s direction
denotes a difference in scale between conjugate profiles. A
shift in the profile direction p corresponds to a refinement of
the initially available conjugate locations. By performing
this matching process along the two directions (which for
practical reasons are the base direction and its
perpendicular), we can identify the exact correspondence in
the stereopyramid
$(x_p, y_p) \rightarrow (x_q, y_q;\, s_x, s_y)$          Eq. 5
for comparing a specific feature.
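As a rough illustration of this matching step, the sketch below locates a patch of one profile scale space image inside another by an exhaustive SSD search over shifts in p and s; the paper instead uses a linearized least squares solution, and all names and search ranges here are illustrative.

```python
import numpy as np

def match_in_scale_space(patch, pss, p0, s0, max_dp=10, max_ds=4):
    """Brute-force search for a template patch inside a profile scale space image.

    pss has rows indexed by scale level and columns by profile position. A shift
    dp along the profile axis refines the approximate conjugate position p0,
    while a shift ds along the scale axis expresses the scale difference between
    the conjugate profiles. Returns the (dp, ds) pair minimizing the SSD.
    """
    hs, hp = patch.shape
    best_dp, best_ds, best_ssd = 0, 0, np.inf
    for ds in range(-max_ds, max_ds + 1):
        for dp in range(-max_dp, max_dp + 1):
            s, p = s0 + ds, p0 + dp
            if s < 0 or p < 0 or s + hs > pss.shape[0] or p + hp > pss.shape[1]:
                continue  # candidate window falls outside the scale space image
            window = pss[s:s + hs, p:p + hp]
            ssd = float(np.sum((window - patch) ** 2))
            if ssd < best_ssd:
                best_dp, best_ds, best_ssd = dp, ds, ssd
    return best_dp, best_ds
```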
This procedure can be enhanced when combined with edge
detection. Figure 4 (bottom) shows the edges in a profile
scale space image, which actually show how the various
objects intersected by this profile (variations in top row
gray values) behave in scale space. The extracted feature
outlines describe not only the behavior of a single feature,
but also its interaction with its surroundings. Robust
features remain evident throughout the profile's scale
space, while ephemeral ones disappear quickly. The feature to
which a given approximation belongs is the one whose outline
surrounds that approximation. We can thus easily
examine whether the given approximations lie on a robust
or an ephemeral feature. Points on robust features are better
matching candidates. Furthermore, we can examine whether
the given approximations lie on the same feature by
comparing the major radiometric characteristics (absolute
gray values, gradients) of the features to which the
approximate points belong. This check can help us avoid
gross matching errors which are associated with erroneous
approximations.
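A simple way to quantify such robustness, assuming an edge map of the profile scale space image has already been computed (by any per-row edge detector), is to count over how many scale levels an edge persists near a given approximation; the function and tolerance below are illustrative, not part of the paper.

```python
import numpy as np

def feature_persistence(edge_map, p, tol=2):
    """Count over how many scale levels an edge near profile position p survives.

    edge_map is a boolean array over the profile scale space (rows = scale
    levels, columns = profile positions). Robust features keep an edge near p
    over many scale levels; ephemeral ones vanish after a few.
    """
    edge_map = np.asarray(edge_map, dtype=bool)
    count = 0
    for row in edge_map:
        lo, hi = max(0, p - tol), min(row.size, p + tol + 1)
        if row[lo:hi].any():
            count += 1   # the edge is still present (allowing a small drift)
        else:
            break        # stop at the first scale level where it disappears
    return count
```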
Once scale space correspondences are established, assigning
to a feature at a specific scale level in one stereomate its proper
conjugate in the other stereomate's scale space, we can
proceed with subsequent precise matching. Radiometric
parameters can be introduced into it to fully express the
remaining radiometric differences between conjugate
patches. By taking advantage of the diffusion equation of the
Gaussian function, according to which
$$\frac{\partial g(x, s)}{\partial s} = \frac{1}{2}\,\frac{\partial^2 g(x, s)}{\partial x^2}$$
the derivative with respect to the scale parameter is
proportional to the second derivative with respect to the
spatial coordinate, thus allowing us to introduce it directly
into the linearized least squares matching observation
equations (with the second derivatives of gray values as
corresponding coefficients in the Jacobian matrix)
[Stefanidis, 1993].
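A minimal single-iteration sketch of this idea is given below, restricted to two unknowns (a profile shift and a scale shift, with the scale parameter taken as sigma squared) and omitting the geometric and radiometric parameters of the full formulation in [Stefanidis, 1993]; function and variable names are illustrative.

```python
import numpy as np

def lsm_shift_and_scale_step(g_ref, g_cur):
    """Single linearized least squares step for a profile shift dp and a scale
    shift ds between a reference profile patch g_ref and the current patch g_cur.

    The Jacobian column for dp is the first spatial derivative of g_cur; the
    column for ds is, via the Gaussian diffusion equation, one half of its
    second spatial derivative.
    """
    g_ref = np.asarray(g_ref, dtype=float)
    g_cur = np.asarray(g_cur, dtype=float)
    gx = np.gradient(g_cur)            # dg/dp
    gxx = np.gradient(gx)              # d^2 g / dp^2
    gs = 0.5 * gxx                     # dg/ds from the diffusion equation
    A = np.column_stack([gx, gs])      # Jacobian of the observation equations
    dl = g_ref - g_cur                 # observed gray value differences
    (dp, ds), *_ = np.linalg.lstsq(A, dl, rcond=None)
    return dp, ds
```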
6. EXPERIMENTS
The mathematical models and matching procedure presented
in the previous sections were tested in several experiments
using both synthetic and real images. Synthetic data were
generated by creating a DEM with substantial local
inclinations (ramps, tall buildings, etc.), assigning
radiometric values to it, and projecting it back to fictitious
exposure stations. By varying these inclinations,
variations in scale differences among conjugate features were
generated. The criteria by which the performance of the
technique was judged were pull-in range in scale differences
and positional accuracy of the obtained matching results.
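As a toy stand-in for this data generation step, conjugate profiles of a ramp differing by a prescribed amount in scale can be simulated by simple resampling (the projection through fictitious exposure stations used in the actual experiments is not reproduced here); all numbers below are arbitrary.

```python
import numpy as np

def ramp_profile(n=256, ramp_start=96, ramp_end=160, low=60.0, high=200.0):
    """Gray value profile across a simple ramp structure."""
    p = np.full(n, low)
    p[ramp_end:] = high
    p[ramp_start:ramp_end] = np.linspace(low, high, ramp_end - ramp_start)
    return p

def rescale_profile(profile, factor):
    """Resample a profile to mimic the conjugate profile seen at a different scale."""
    n = profile.size
    x_new = np.linspace(0.0, n - 1.0, int(round(n * factor)))
    return np.interp(x_new, np.arange(n), profile)

left = ramp_profile()
right = rescale_profile(left, 1.5)   # conjugate profiles differing by 50% in scale
```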
For ramp structures (like the one in Fig. 2) it was found that,
even with excellent approximations, typical least squares
matching failed when the scale differences exceeded 20-30%,
the exact limit depending on the local radiometric content.
Using the method described above, we
managed to match images of the ramp which differed by
arbitrary amounts in scale. The only limitation was the
identification of sufficient initial correspondences between
features, a task which indeed becomes less trivial as scale
differences increase. When certain features were significantly different
(in gray values) from their surroundings, we were even able
to identify cases of occlusions and tag them as such. In terms
of positional accuracy, our results were comparable to
typical least squares matching results (on the order of 0.1
pixel). This is quite notable considering that these
matching accuracies refer to cases
where typical matching methods failed to produce any
results. The reader is referred to [Stefanidis, 1993] for a more
detailed description and evaluation of experiments.
7. COMMENTS
The presented technique addresses the problem of matching
in the presence of extreme scale variations. The technique
proceeds by identifying and taking into account such
variations, and subsequently performing precise matching.
Considering the automation potential of matching, this
technique is envisioned as a module within a general
matching strategy, complementing matching results in areas
in which regular matching has failed. Of course, it can
function as a stand-alone matching module, but it would be
computationally cumbersome to perform a detailed scale
space analysis for every single patch to be matched. The
developed concept of profile scale space images opens a new
direction for scale space analysis. Not only do these images
offer great visualization potential, allowing an operator to
check the process, but they also have the great advantage of
being, by design, digital images themselves, directly amenable
to standard digital image processing and analysis algorithms.
This makes their complete integration within a digital matching
strategy very straightforward, and suggests extensions toward
edge detection, feature tracking, and image understanding.
REFERENCES

Agouris P., ...: ..., International Archives of Photogrammetry and Remote Sensing, Vol. XXXI, Part B...

Alvertos N., ...: Camera Geometries for ... Vision, IEEE Transactions on Pattern Analysis and Machine Intelligence, ...

Babaud J., ...: Uniqueness of the Gaussian Kernel for Scale-Space Filtering, IEEE Transactions on Pattern Analysis and Machine Intelligence, ...

Bergholm F., ...: ..., IEEE Transactions on Pattern Analysis and Machine Intelligence, ..., pp. 726-741.

Burt P.J., ...: ... Processing, Computer Graphics and Image Processing, ..., pp. 20-51.

Burt P.J., ...: ... Computation, in 'Multiresolution Image Processing and Analysis' (A. Rosenfeld, ed.), ..., NY, pp. 6-35.

Chin F., A. Choi, ...: ... Kernels for Image Pyramids ..., IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 14, No. ...

Gruen A., ...: Automatic Extraction ... Space Images, ...

Hahn M., ...: ... Spread Function ..., International Archives of Photogrammetry and Remote Sensing, ..., Part 3/2, pp. 246-...

Horn B., ...: ..., ..., Cambridge, MA.

Lindeberg T., ...: ..., IEEE Transactions on Pattern Analysis and Machine Intelligence, ...

Lindeberg T., ...: ... Analyzing Structures ..., ... Statistics, Vol. ...

Lindeberg T., ...: Scale-Space Theory in Computer Vision, Kluwer Academic Publishers, The Netherlands.