transformation are depicted in Figure 1 (Schwarz et al., 1993).
The mathematical model corresponding to this figure is
Ar"() - TG) Fam (D pb G)
where
Ar" is the position vector of an image object in the
chosen mapping frame;
r™ is the coordinate vector from the origin of the
mapping frame to the centre of the position sensor
on the airplane, given in the m-frame;
m
Ry is the three-dimensional transformation matrix which
rotates the aircraft body frame into the mapping
frame (roll, pitch, and yaw are measured by the INS);
S is a scale factor derived from the height of the sensor
above ground;
p^ is the vector of image coordinates given in the b-
frame.
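To make the operations in Equation (3) concrete, the following is a minimal numerical sketch (the function name and the NumPy dependency are illustrative choices, not part of the paper): the mapping reduces to one rotation, one scaling, and one translation per image point.

```python
import numpy as np

def georeference_point(r_m, R_bm, p_b, s):
    """Evaluate Equation (3): position of an image object in the m-frame.

    r_m  : (3,) vector from the mapping-frame origin to the position
           sensor on the airplane, given in the m-frame
    R_bm : (3,3) rotation matrix from the body frame (b) to the
           mapping frame (m), built from INS roll, pitch, and yaw
    p_b  : (3,) vector of image coordinates given in the b-frame
    s    : scale factor derived from the sensor height above ground
    """
    return r_m + s * (R_bm @ p_b)
```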
Figure 1: Georeferencing of Airborne Sensing Data (flight path, uncorrected image vector $p^b$, and uncorrected position on the ellipsoid)
Equation (3) is, however, only a first approximation of the
actual situation. The three sensors for positioning, attitude
determination, and imaging are physically separated, and it
can therefore not be assumed that they function in the same
measurement frame. The actual situation is shown in Figure 2, which enlarges the area around point P(t) in Figure 1. It has been assumed that the remote sensor, for example a photogrammetric camera, is mounted in the stable area of the airplane; that the positioning sensor, a GPS antenna, is mounted on top of the airplane; and that the attitude sensor, an inertial measuring unit, is mounted in the interior of the aircraft, somewhere close to the remote sensor. In this case,
aircraft position is defined by the antenna centre of the GPS
receiver (m-frame) and aircraft attitude is given by the
internal axes of the inertial measuring unit (b-frame).
In general, they do not correspond to the position and attitude of the remote sensing device, which is given by the position and orientation of the camera frame (c-frame). This
frame has its origin in the perspective centre of the camera,
its z-axis is defined by the vector of length f between the
perspective centre and the principal point of the
photograph, and its (x,y)-axes are defined in the plane of the
photograph and are measured with respect to the principal
point. The corresponding image vector is therefore of the
form
$$p^c = \begin{pmatrix} x - x_p \\ y - y_p \\ -f \end{pmatrix} \qquad (4)$$
Figure 2: Coordinate Transformations Between Airborne Sensors
In the case of pushbroom scanners and CCD frame imagers, the second vector component is replaced by

$$y^c = (y - y_p)/k_y$$

where $k_y$ accounts for the non-squareness of the CCD pixels.
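A short sketch of how the image vector might be assembled in code, covering both the frame-camera form of Equation (4) and the CCD variant above (the function name is an illustrative assumption):

```python
import numpy as np

def image_vector(x, y, x_p, y_p, f, k_y=1.0):
    """Build the image vector p^c of Equation (4) in the camera frame.

    (x, y)     : measured image coordinates
    (x_p, y_p) : coordinates of the principal point
    f          : principal distance (focal length) of the camera
    k_y        : accounts for non-square CCD pixels; with the default
                 k_y = 1.0 the ordinary frame-camera form is obtained
    """
    return np.array([x - x_p, (y - y_p) / k_y, -f])
```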
The resulting modelling equation is

$$\Delta r^m(t) = r^m(t) + R_b^m(t)\left(s\,dR_c^b\,p^c - dr^b\right) \qquad (5)$$
where the subscripts and superscripts correspond to the frames defined above. The additional notations in Equation (5) are as follows:

$dR_c^b$ is the transformation matrix which rotates the camera frame into the body frame;
$p^c$ is the imaging vector in the c-frame as given by Equation (4);
$dr^b$ is the translation vector between the GPS antenna centre and the centre of the INS; and
$dr^c$ is the translation vector between the GPS antenna centre and the perspective centre of the camera.
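A sketch of Equation (5) in the same vein (the rotation sequence used to build $R_b^m$ from roll, pitch, and yaw is one common convention, assumed here because the paper does not spell one out):

```python
import numpy as np

def rotation_from_rpy(roll, pitch, yaw):
    """R_b^m from INS roll, pitch, and yaw (radians), z-y-x sequence."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def georeference(r_m, R_bm, dR_cb, p_c, s, dr_b):
    """Evaluate Equation (5): object position in the m-frame.

    r_m   : (3,) GPS antenna position in the m-frame (time dependent)
    R_bm  : (3,3) body-to-mapping rotation from the INS (time dependent)
    dR_cb : (3,3) boresight rotation, camera frame -> body frame (fixed)
    p_c   : (3,) imaging vector from Equation (4)
    s     : scale factor from the sensor height above ground
    dr_b  : (3,) offset between GPS antenna centre and INS centre (fixed)
    """
    return r_m + R_bm @ (s * (dR_cb @ p_c) - dr_b)
```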
This equation, in a somewhat simplified form, has been
discussed in detail in Schwarz et al. (1993). A few remarks
will therefore suffice here. It should be noted that the origins
of the position and attitude sensors are not identical.
Furthermore, the vectors $r^m$ and $\Delta r^m$, as well as the rotation matrix $R_b^m$, are time-dependent quantities, while the vectors $p^c$ and $dr^b$, as well as the matrix $dR_c^b$, are not. This implies that the aircraft is considered as a rigid body whose rotational and translational dynamics is adequately described by changes in $\Delta r^m$ and $R_b^m$. This means that the translational and rotational dynamics at the three sensor locations is uniform; in other words, differential rotations and translations between the three locations as functions of time have not been modelled. It also means that the origin and orientation of the three sensor systems can be considered fixed for the duration of the flight. These are valid assumptions in most cases but may not always be true.
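In code, this assumption means the calibration quantities are set once, while the navigation quantities are re-evaluated at each imaging epoch. A hypothetical usage of the sketches above, with all numbers invented for illustration:

```python
import numpy as np

# Fixed for the duration of the flight (rigid-body assumption):
dR_cb = np.eye(3)                   # boresight matrix from a lab calibration
dr_b  = np.array([0.8, 0.2, 1.5])   # antenna-to-INS offset [m]

# Re-measured at each imaging epoch t:
r_m  = np.array([500000.0, 4100000.0, 1200.0])       # GPS antenna position [m]
R_bm = rotation_from_rpy(0.01, -0.02, 1.57)          # from INS roll/pitch/yaw
p_c  = image_vector(0.042, -0.015, 0.0, 0.0, 0.153)  # 153 mm camera, units of m
s    = 6540.0                                        # flying height over f

print(georeference(r_m, R_bm, dR_cb, p_c, s, dr_b))
```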
The quantities $r^m$, $R_b^m$ and $p^c$ in Equation (5) are determined by measurement, the first two in real time, the
third in post-mission.
are determir
mission; for
a by calil
ground cont
changing w
ground. It
assuming
introducing
measurement
device such
the latter t
investigated
measuremen
avoid datum
The above
georeferenci
scanning Sys
systems. The
of the remote
parameters ir
parameters :
determinatior
configuratior
in the next
radargramme
which has to
for motion c
of the georef
process and
Since the pos
will provide
the followin;
5... PI
To achieve t
two major sy