$$G(x,y) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{r^2}{2\sigma^2}\right)$$

$$\nabla^2 G(x,y) = \left(\frac{r^2 - 2\sigma^2}{2\pi\sigma^6}\right)\exp\left(-\frac{r^2}{2\sigma^2}\right)$$

$$f'(x,y) = \nabla^2 G(x,y) * f(x,y)$$
where G(x, y) is a Gaussian filter, ∇²G is the Laplacian of
a Gaussian (called LoG), and f(x, y) is the image gray level
function. Convolution is denoted by *, and r = (x² + y²)^{1/2}
implies that the operator is rotationally symmetric. The ad-
vantage of the LoG operator is that it combines smoothing
and differentiating into one operator. Moreover, it is local-
ized in space and frequency domains. The filtered image
f'(x, y) is divided into positive and negative regions with
an average frequency of √2/σ. The boundaries of these regions
are the zero-crossings. Zero-crossings occur wherever the
gray levels change sharply. The degree of change can be de-
scribed by the first-derivative of the gray level function, or
the gradient of the gray levels. Zero-crossings are separated
by an average distance equal to the window size of the
LoG operator, i.e. the diameter of the positive central region
of the LoG curve, w = 2√2σ. The larger the window size, the
larger the delocalization of detected zero-crossings from the
real boundaries.
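As a minimal sketch (not part of the original text), the LoG filtering and zero-crossing detection described above can be written as follows, assuming a grayscale image stored in a NumPy array; scipy.ndimage.gaussian_laplace applies the ∇²G convolution, and a sign-change test between neighbouring pixels marks the zero-crossings.

```python
# Minimal sketch of LoG filtering and zero-crossing detection (illustrative,
# assuming a 2-D grayscale NumPy array as input).
import numpy as np
from scipy import ndimage

def log_zero_crossings(image, sigma):
    """Return a boolean zero-crossing mask and the LoG-filtered image."""
    # f'(x, y) = (nabla^2 G) * f(x, y); sigma sets the window size
    # w = 2 * sqrt(2) * sigma of the central positive region.
    filtered = ndimage.gaussian_laplace(image.astype(float), sigma)

    # A pixel is marked where the sign of f' changes between neighbours.
    sign = filtered > 0
    zc = np.zeros(filtered.shape, dtype=bool)
    zc[:-1, :] |= sign[:-1, :] != sign[1:, :]   # vertical sign changes
    zc[:, :-1] |= sign[:, :-1] != sign[:, 1:]   # horizontal sign changes
    return zc, filtered
```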
Edges in aerial images represent object boundaries or mark-
ings (e.g. shadows). Many object boundaries correspond to
surface breaklines. The LoG operator is applied to both the
left and right images to obtain the zero-crossings. Several param-
eters are chosen to control feature detection. The window
size w of LoG operator is selected according to the quality
and the scale of the images to ensure surface feature detec-
tion. In order to supress noise or less important features,
a threshold value t is chosen according to the distinctness
of the zero-crossing. The result of applying LoG operator
are two binary images. Zero-crossings as feature entities are
obtained in the left images as following:
• The location of zero-crossings is obtained by an edge
following algorithm. The connected zero-crossing
points form the zero-crossing curve as a feature entity.
• Then each zero-crossing curve is segmented using local
curvature maxima as the end points of each segment.
As a result, edges are detected as individual zero-crossing
curves, each consisting of one or more segments.
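As a hedged sketch building on the previous one, thresholded zero-crossings could be grouped into curve entities as below. Measuring distinctness by the slope of the LoG response and replacing explicit edge following with connected-component labelling are illustrative choices, not details fixed by the text, and the segmentation at curvature maxima is omitted.

```python
# Illustrative grouping of thresholded zero-crossings into curve entities,
# using zc and filtered from the previous sketch.
import numpy as np
from scipy import ndimage

def zero_crossing_curves(zc, filtered, t):
    """Label connected zero-crossing pixels whose distinctness exceeds t."""
    # Distinctness measured here as the slope of the LoG response (assumed).
    gy, gx = np.gradient(filtered)
    strength = np.hypot(gx, gy)

    # Suppress noisy or less important zero-crossings below the threshold t.
    strong = zc & (strength > t)

    # Connected zero-crossing pixels (8-connectivity) stand in for the edge
    # following step; each label corresponds to one zero-crossing curve.
    labels, n_curves = ndimage.label(strong, structure=np.ones((3, 3)))
    return labels, n_curves
```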
3. CORRELATION MATCHING
The flow chart of the matching scheme is shown in Fig. 1.
Like most area-based matching algorithms, epipolar geome-
try is employed to constrain the searching to one dimension
[Cho et al. 1992]. At each level of the image pyramid, the
image patches are first enhanced since area-based matching
methods require good image quality. Next, zero-crossings
are determined in both images. For each zero-crossing point
in the left image the corresponding point on the right image
is found along the epipolar (scan) line by area correlation.
The right image zero-crossings only help to define the search
window for the correlation matching. The matching is per-
formed in two steps: initial point-to-point correlation, and
figural continuity checking as the acceptance criterion. During the
initial matching, points with maximum correlation values
larger than the preset threshold value are selected as matched
points. The key point here is to find a good approximation of
the search window in the right image. This is accomplished
by using the disparity constraint map at each level of the im-
age pyramid. Once matching is completed, an interpolated
disparity image is generated, providing the approximations
needed for the next level of matching. At the highest level of
the image pyramid, knowledge gained from surface analysis
is also fed back to the matching process through the use of
the disparity map. After the initial matching, all matched
points must satisfy the figural continuity constraint for final
acceptance as conjugate points.
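A minimal sketch of the initial point-to-point correlation step is given below, assuming epipolar (row-aligned) images so that the search is one-dimensional; the window size, search range, threshold, and function names are illustrative assumptions rather than values from the scheme.

```python
# Illustrative point-to-point correlation along an epipolar scan line.
import numpy as np

def match_point(left, right, row, col, approx_disp,
                half_win=5, search_range=10, threshold=0.7):
    """Return the conjugate column (or None) and its correlation value."""
    patch = left[row - half_win:row + half_win + 1,
                 col - half_win:col + half_win + 1].astype(float)
    patch = (patch - patch.mean()) / (patch.std() + 1e-9)

    best_col, best_rho = None, -1.0
    center = col + approx_disp                       # from the disparity map
    for c in range(center - search_range, center + search_range + 1):
        if c - half_win < 0 or c + half_win + 1 > right.shape[1]:
            continue                                  # window falls off-image
        cand = right[row - half_win:row + half_win + 1,
                     c - half_win:c + half_win + 1].astype(float)
        cand = (cand - cand.mean()) / (cand.std() + 1e-9)
        rho = float((patch * cand).mean())            # correlation coefficient
        if rho > best_rho:
            best_col, best_rho = c, rho

    # Accept the maximum only if it exceeds the preset threshold value.
    return (best_col, best_rho) if best_rho > threshold else (None, best_rho)
```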
[Fig. 1. Flowchart of the matching scheme: at each image pyramid level (starting at level 0), epipolar image preprocessing, feature extraction, initial correlation matching, and the figural continuity constraint are applied; the generated disparity map and image analysis feed the next level until matching is finished.]
3.1. Hierarchical Disparity Constraint
The search window in correlation matching is defined by two
parameters: location and size. Obviously, the window size
depends on the goodness of the approximations. We deter-
mine the window size (search range) dynamically based on
the disparity map. If the search window is close to a zero-
crossing contour detected in the right image, it is adjusted
accordingly, because this zero-crossing is likely to be the con-
jugate point.
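A hedged sketch of this dynamic search window is given below, assuming an interpolated disparity image from the previous pyramid level and the right-image zero-crossing mask; scaling the search range with the local disparity spread is an assumption made for illustration.

```python
# Illustrative derivation of the 1-D search window from the disparity map.
import numpy as np

def search_window(disparity, right_zc, row, col, min_range=3, max_range=15):
    """Return the center column and half-width of the search window."""
    # Window center from the disparity approximation of the previous level.
    center = col + int(round(disparity[row, col]))

    # Search range grows with the local disparity variation (assumed rule).
    local = disparity[max(row - 2, 0):row + 3, max(col - 2, 0):col + 3]
    rng = int(np.clip(np.ptp(local), min_range, max_range))

    # If a right-image zero-crossing lies inside the window, re-center on the
    # one nearest the prediction, since it is likely close to the conjugate
    # point.
    lo = max(center - rng, 0)
    zc_cols = np.flatnonzero(right_zc[row, lo:center + rng + 1]) + lo
    if zc_cols.size:
        center = int(zc_cols[np.argmin(np.abs(zc_cols - center))])
    return center, rng
```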
A crucial step in any matching system is the approximation
of the matching location (center of search window). At the
top of the image pyramid we have two options. One is to use
an average disparity value for all matching positions. This
average approximation is computed from the matched points
generated during automatic orientation [Schenk et al. 1992;
Zong et al. 1991]. The second option is to convert the