The image coordinate z is defined by the relative sensor orientation (see figure 4) and equation (2).
Figure 6: Distortion for y and ω motion: The effect of disturbance for these two parameters is also hard to separate. The observable effect for ω increases with the distance from the center of the image in y-direction (see equation (1)).
Figure 6 shows a comparison of the effects of motion disturbance for the parameters y and ω, respectively. The possible separation of these two parameters was explained in the discussion of equation (1).
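Equation (1) is not repeated here; as a first-order sketch only (with assumed symbols: camera constant c, object distance Z, image coordinate η along the sensor line), the displacements caused by a small rotation ω and by a small lateral translation Δy behave like
\[
\Delta\eta_{\omega} \approx -\,\omega\left(c + \frac{\eta^{2}}{c}\right),
\qquad
\Delta\eta_{y} \approx -\,\frac{c}{Z}\,\Delta y .
\]
The ω term grows with the distance from the image center in y-direction, whereas the translation term is constant along the sensor line for constant object distance; near the center the two effects are practically indistinguishable, which is why this pair is hard to separate.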
In figure 7 the effect of a variable distance between the camera and the object and of a rotation of the camera around the optical axis can be studied. Both parameters are easily separable, as they are not correlated with any other parameter. We also observe the opposite effect of the y-distortions in the left image for changes in z-direction, and the hardly noticeable coordinate change caused by κ for the perpendicular line in the center of the right triple.
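Up to sign conventions, the same first-order view (again with assumed symbols) makes this plausible: a small change Δz of the camera-object distance and a small rotation κ about the optical axis act on an image point (ξ, η) roughly as
\[
\Delta\eta_{z} \approx -\,\eta\,\frac{\Delta z}{Z},
\qquad
\Delta\xi_{\kappa} \approx -\,\kappa\,\eta,
\qquad
\Delta\eta_{\kappa} \approx \kappa\,\xi \approx 0 \quad (\xi \approx 0 \text{ on the sensor line}).
\]
The Δz effect is proportional to η and therefore changes sign on the two sides of the image center, while the κ effect vanishes towards the center of the line, matching the observations above.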
Due to the low-resolution reproduction in this article, we also present figures 5, 6 and 7 on the web [Mar95a] at a higher resolution for interactive review.
The consequence of these considerations is that the separation of the correlated parameter pairs just mentioned remains difficult. We expect to obtain exact information on the lateral car motion from odometric sensors (see [MS96]) to resolve the x/φ pair, and first experiments showed good results. The y/ω pair seems to be easier to neglect, as the y movement of the car is not so dramatic compared to the rolling around the axis of forward motion when the car drives through road holes.
4 DATA PROCESSING
4.1 General objectives
The major objective of the geometric design of the proposed sensor is to provide recorded data that enables subsequent algorithms to detect the motion of the sensor, to reconstruct the geometric model and to derive photo texture. The first step is therefore the creation of a normalized image.
4.2 Characteristic of motion disturbance
The amount of motion disturbance faced by a standard mid-class car passing the front of a building has to be considered.
Figure 7: Distortion for z and κ motion: Both parameters are not correlated with others and are therefore easily separable. The z motion results in opposite effects in y-direction; the effect of κ motion is hardly noticeable in the center line of the right triple.
First experiments showed low-frequency oscillations with maximal effects of ±10 cm in object space. A detailed description of the experimental recordings can be found in [MS96] or [MSh96].
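To give these numbers a concrete form, the following minimal sketch (not the simulation used for the figures; the oscillation frequency, line rate and facade pattern are assumptions, only the ±10 cm amplitude and the 1 cm object pixel size come from the text) applies a low-frequency oscillation to a synthetic facade image by shifting each recorded line according to its recording time:

import numpy as np

# Illustrative sketch only: amplitude and pixel size from the text,
# oscillation frequency, line rate and facade pattern assumed.
PIXEL_SIZE_M = 0.01     # (1 cm)^2 object pixels
AMPLITUDE_M = 0.10      # maximal effect of +/-10 cm in object space
FREQ_HZ = 1.0           # assumed low-frequency car oscillation
LINE_RATE_HZ = 200.0    # assumed line recording rate

def disturb(facade):
    """Shift every recorded line (image column) by the instantaneous
    displacement of a slow sinusoidal oscillation of the car."""
    rows, cols = facade.shape
    t = np.arange(cols) / LINE_RATE_HZ                    # recording time per line
    offset_px = AMPLITUDE_M * np.sin(2.0 * np.pi * FREQ_HZ * t) / PIXEL_SIZE_M
    out = np.empty_like(facade)
    for i in range(cols):
        out[:, i] = np.roll(facade[:, i], int(round(offset_px[i])))
    return out

# Toy facade: a regular bright grid on a dark background.
facade = np.zeros((400, 1200), dtype=np.uint8)
facade[50::100, :] = 255
facade[:, 50::100] = 255
distorted = disturb(facade)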
4.3 A two-step motion detection and elimination
We do not expect to solve the orientation process in one step. Figure 8 shows the effect of motion disturbance at (1 cm)² object pixel size, based on a simulation. Obviously this distortion cannot be expected to be eliminated using standard image processing techniques.
Figure 8: Simulated effect of motion disturbance of a
scanned building facade.
Figure 9 shows the effect of motion disturbance on a line
recording in a real environment. More details can be reviewed
in [MS96].
The basic idea for the primary motion reconstruction is to add a CCD area sensor of low resolution. A series of these images together with possible match partners is shown in figure 10.
In practice the area images have an average overlap of more than 96%. The length of the observed square in object space corresponds approximately to the height of two levels of the building, which in most cases guarantees the observation of at least four windows.
A detailed description of a robust and automatic motion
detection algorithm based on standard matching and affine
transform can be found in [Mar96a] or [Mar96b].
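The algorithm itself is documented in the cited papers; purely as an illustration of the principle, the sketch below estimates a 2D affine transform between two consecutive low-resolution area images from matched point pairs by linear least squares (the point matching step is omitted and all names are assumptions, not taken from [Mar96a]):

import numpy as np

def estimate_affine(src, dst):
    """Least-squares 2D affine transform mapping src points to dst points.

    src, dst: (N, 2) arrays of matched image points, N >= 3.
    Returns a 2x3 matrix A with dst ~ A @ [x, y, 1]^T.
    """
    ones = np.ones((src.shape[0], 1))
    X = np.hstack([src, ones])                    # (N, 3) design matrix
    A_t, *_ = np.linalg.lstsq(X, dst, rcond=None) # solve X @ A.T = dst
    return A_t.T

# Synthetic matches: a small rotation plus a shift, roughly what two
# consecutive frames with an overlap of more than 96 % would produce.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 256, size=(40, 2))           # matched points, frame k
angle = np.deg2rad(0.5)
R = np.array([[np.cos(angle), -np.sin(angle)],
              [np.sin(angle),  np.cos(angle)]])
moved = pts @ R.T + np.array([3.0, -1.5])         # the same points, frame k+1
A = estimate_affine(pts, moved)                   # recovered 2x3 transform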
The result is a precise method for automatic and robust dif-