knowledge of the surface radiance, while for the correct assignment of gray values we must have some approximation of the image orientation.
Assuming a smooth surface and no extreme variations in exposure geometry, we can accept a one-to-one correspondence between the object space tessellation and the image windows. Therefore, the observation equations developed in the previous sections can now be formed using the object space as the reference template
G(X, Y) - g_i(x_i, y_i) = e(X, Y)    (42)
thus transferring the matching procedure to the object space.
The geometric relationship between the image coordinates (x_i, y_i) of a point in photo i and its object space coordinates (X, Y, Z) will in general be a seven-parameter transformation

(x_i, y_i) = T_i(X, Y, Z)    (43)
This transformation need not be the collinearity condition,
as long as the seven parameters which describe the three
translations, three rotations and one scale factor are lin-
early independent. Some of the transformation parameters
may also be kept constant during the adjustment, if a priori
information allows us to consider them known.
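As an illustration, one possible non-collinearity choice for the transformation (43) is a 3-D similarity (Helmert) transform followed by an orthographic drop of the third coordinate. The sketch below assumes this choice; the parameter ordering (three rotations, three translations, one scale) and the function names are ours, not prescribed by the text:

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotations about the X, Y, Z axes, composed as R = Rz @ Ry @ Rx."""
    cx, sx = np.cos(omega), np.sin(omega)
    cy, sy = np.cos(phi), np.sin(phi)
    cz, sz = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def object_to_image(P, params):
    """Seven-parameter transform (3 rotations, 3 translations, 1 scale)
    mapping object coordinates (X, Y, Z) to image coordinates (x_i, y_i).
    The image plane is taken as the transformed XY plane (orthographic
    drop of the third coordinate) -- one possible non-collinearity choice."""
    omega, phi, kappa, tx, ty, tz, s = params
    R = rotation_matrix(omega, phi, kappa)
    p = s * (R @ np.asarray(P, dtype=float)) + np.array([tx, ty, tz])
    return p[:2]  # (x_i, y_i)
```

With the identity parameters (no rotation, no translation, unit scale) the image coordinates are simply the object X and Y, which makes the role of each parameter group easy to check.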
In order for the elevation values to be computed through the adjustment, they have to be introduced as adjustable quantities. This requires a proper selection of the other six transformation parameters (p_i1, ..., p_i6) to avoid dependencies which would lead to ill-conditioned systems.
The linearized observation equations for this case are

G(X, Y) = g_i(x_i, y_i) + (g_x_i ∂x_i/∂p_i1 + g_y_i ∂y_i/∂p_i1) dp_i1 + ...
        + (g_x_i ∂x_i/∂p_i6 + g_y_i ∂y_i/∂p_i6) dp_i6
        + (g_x_i ∂x_i/∂Z + g_y_i ∂y_i/∂Z) dZ + e(X, Y)    (44)
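The structure of these linearized observations can be sketched numerically. The helper below (our own illustrative code) builds a single observation row from the gray-value gradients and the Jacobians of the transformation; in practice the Jacobians would come from differentiating the chosen seven-parameter transformation:

```python
import numpy as np

def observation_row(G_val, g_i, grad_g, jac_xy_p, jac_xy_Z):
    """One linearized observation for a groundel (X, Y).
    residual l = G(X, Y) - g_i(x_i, y_i); the row couples the six
    transformation corrections dp and the elevation correction dZ.
    grad_g   : (g_x, g_y), image gray-value gradients at (x_i, y_i)
    jac_xy_p : 2x6 Jacobian d(x_i, y_i)/dp
    jac_xy_Z : length-2 vector d(x_i, y_i)/dZ
    """
    gx, gy = grad_g
    a_p = gx * jac_xy_p[0] + gy * jac_xy_p[1]  # coefficients of dp (6,)
    a_Z = gx * jac_xy_Z[0] + gy * jac_xy_Z[1]  # coefficient of dZ
    l = G_val - g_i                            # observed minus computed
    return a_p, a_Z, l
```

Stacking one such row per pixel of every image window yields the design matrix discussed next.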
Taking into account that different pixels in the image window w_i correspond to different elements of the ground tessellation (groundels, [Helava, 1988]), we see that the dZ element of the above equation is actually a vector of n_1 · n_2 elements. Thus, the design matrix A of the adjustment solution will have the sparsity pattern shown in Fig. 3. In this figure, the large gray blocks have dimensions (n_1 · n_2) × 6, while the small black blocks indicate single entries. This pattern corresponds to the four overlapping images of Fig. 2, without using observations in between windows. Window w_1 has been projected to the object space during the radiometric adjustment, and the observations relate the surface patch S to the windows w_2, w_3 and w_4. A more detailed description and in-depth analysis of this technique can be found in [Schenk & Toth, 1992].
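As a rough sketch of the pattern in Fig. 3, the following code (our own illustration, assuming one observation per groundel per image and window dimensions n_1 × n_2) assembles the zero/nonzero structure of A:

```python
import numpy as np

def design_sparsity(n_images, n1, n2):
    """Sparsity pattern of the design matrix A for object space matching:
    one observation per groundel per image. Each observation hits the six
    transformation corrections dp_i of its image (a dense gray block) and
    the elevation correction dZ of its groundel (a single black entry)."""
    n_gr = n1 * n2                                   # groundels in the patch
    A = np.zeros((n_images * n_gr, 6 * n_images + n_gr), dtype=int)
    for i in range(n_images):
        r0 = i * n_gr
        A[r0:r0 + n_gr, 6 * i:6 * i + 6] = 1         # (n1*n2) x 6 gray block
        for k in range(n_gr):
            A[r0 + k, 6 * n_images + k] = 1          # single dZ entry
    return A
```

For four images and a 3 × 3 patch this gives a 36 × 33 matrix with four gray blocks along the dp columns and four repeated diagonals in the dZ columns, matching the block layout described above.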
Conceptually, object space matching resembles matching with geometric constraints. Taking into account the fact that all images are created from the same object space patch, least squares matching is constrained to produce a geometrically acceptable solution. Simultaneously, we are able to reconstruct the object space DEM. Considering that one photo is used to create the object space patch, it is clear that this technique is equivalent to dependent orientation.
Figure 3: Sparsity pattern of the design matrix for object
space matching
4. IMPLEMENTATION ISSUES
In the previous sections we presented and analyzed the for-
mulation of least squares matching using multiple images.
By introducing geometric constraints and performing match-
ing in the object space, consistent matching results can be
ensured and surface patches can be reconstructed geometri-
cally as well as radiometrically.
Approximations are obviously necessary and they can be in
the form of conjugate point image coordinates, orientation
parameters and/or the object space surface, as expressed by,
e.g., an initial DEM approximation. These approximations
can be easily obtained through an automatic stereopair ori-
entation module [Schenk et al., 1991]. Experiments have shown that accuracies of a fraction of a pixel (4-6 μm in photo coordinates) are to be expected when the technique is applied as a combination of feature-based hierarchical matching and correlation methods with continuous updating of the results through scale space [Stefanidis et al., 1991].
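A minimal coarse-to-fine sketch of such hierarchical matching through scale space (our own simplified version: a block-averaged pyramid, exhaustive normalized cross-correlation at the coarsest level, and ±1 pixel refinement per level) might look as follows:

```python
import numpy as np

def downsample(img):
    """One scale-space level: halve resolution by 2x2 block averaging."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def correlate_at(tmpl, search, dy, dx):
    """Normalized cross-correlation of tmpl against search at shift (dy, dx)."""
    th, tw = tmpl.shape
    if dy < 0 or dx < 0 or dy + th > search.shape[0] or dx + tw > search.shape[1]:
        return -np.inf
    win = search[dy:dy + th, dx:dx + tw]
    a, b = win - win.mean(), tmpl - tmpl.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / d if d > 0 else -np.inf

def hierarchical_match(tmpl, search, levels=2):
    """Coarse-to-fine matching: exhaustive search at the coarsest level,
    then the shift is doubled and refined within +/-1 pixel per level."""
    pt, ps = [tmpl], [search]
    for _ in range(levels - 1):
        pt.append(downsample(pt[-1]))
        ps.append(downsample(ps[-1]))
    t, s = pt[-1], ps[-1]                      # exhaustive search, coarsest level
    scores = {(dy, dx): correlate_at(t, s, dy, dx)
              for dy in range(s.shape[0] - t.shape[0] + 1)
              for dx in range(s.shape[1] - t.shape[1] + 1)}
    dy, dx = max(scores, key=scores.get)
    for lvl in range(levels - 2, -1, -1):      # refine at each finer level
        dy, dx = 2 * dy, 2 * dx
        t, s = pt[lvl], ps[lvl]
        cand = [(dy + ey, dx + ex) for ey in (-1, 0, 1) for ex in (-1, 0, 1)]
        dy, dx = max(cand, key=lambda c: correlate_at(t, s, c[0], c[1]))
    return dy, dx
```

The coarse result serves only as an initial approximation for the finer level, which is the sense in which results are "continuously updated through scale space"; a real implementation would of course refine with least squares matching rather than integer correlation.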
Automatic stereopair orientation and least squares multi-
photo matching can be ideally combined in automatic aero-
triangulation of large blocks of images, the former providing
valid initial approximations and the latter, being the core
module of the procedure, performing precise point deter-
mination. This fusion of more than one module should be
expected, since initial approximations are required in aero-
triangulation. Stereopair orientation essentially performs
automatically the task of selecting conjugate image win-
dows located in the areas where conjugate points are desired,
the equivalent of the preparation phase in the conventional
aerotriangulation procedure. Using these initial matching
approximations, the images are brought approximately into their correct relative positions in space. This can be visualized for operator inspection, if desired, through the
generation and continuous updating of a photomosaic. Fig. 4 depicts a photomosaic of three images, an early product of
the automatic aerotriangulation procedure. The simultane-
ous multiphoto matching technique can also be conceptually
viewed as the digital equivalent of an n-stage comparator,
allowing for the measurement of conjugate points in more
than two images at a time. Several gross errors, associ-
ated with erroneous conjugate point identification, which
limit the accuracy of conventional analytical aerotriangula-
tion can thus be avoided, optimizing the potential accuracies
of the technique.
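The continuously updated photomosaic mentioned above can be sketched as a simple canvas into which each newly oriented image is pasted at its approximate position (a deliberately naive scheme of our own; a real mosaic would resample and blend the overlaps):

```python
import numpy as np

def update_mosaic(mosaic, img, row, col):
    """Paste an image into the mosaic canvas at its approximate position
    (taken from the initial matching results); later images simply
    overwrite earlier ones in the overlap areas."""
    h, w = img.shape
    mosaic[row:row + h, col:col + w] = img
    return mosaic
```

As each stereopair is oriented, its image is pasted at the updated offset and the canvas is redisplayed, giving the operator a running visual check of the relative positions.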
By using a feature-driven stereomatching method to obtain