2. LINEAR ARRAY SCANNERS

2.1 Scenes of Linear Array Scanners

Digital frame cameras capture the data using a two-dimensional CCD array. However, the limited number of pixels in current digital cameras hinders their application towards extended, large-scale mapping functions in comparison with scanned analogue photographs. Decreasing the principal distance of the imaging system will increase the ground coverage. On the other hand, increasing the principal distance will increase the ground resolution.

One-dimensional digital imaging systems (linear array scanners) can be used to increase the ground coverage and maintain a ground resolution comparable to scanned analogue photographs. However, each snapshot captures only a one-dimensional image (narrow strip) of the object space. Ground coverage is achieved by moving the scanner and capturing more 1-D
images. The scene of an area of interest is obtained by stitching
together the resulting 1-D images. It is important to note that
every 1-D image is associated with one exposure station. Therefore, each 1-D image has its own Exterior Orientation Parameters (EOP). A clear distinction is made between the two terms "scene" and "image" throughout the analysis of linear array scanners (Figure 1).
An image is defined as the recorded sensory data associated with one exposure station. A frame image involves a single exposure station, and consequently it is one complete image. In the case of a linear array scanner, there are many 1-D images, each associated with a different exposure station. The mathematical model relating a point in the object space to its corresponding point in the image space is the collinearity equations, which use the EOP of the appropriate image (the image in which the point appears).
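For reference, a standard form of the collinearity equations is given below; the notation is assumed here, since this section does not write them out: (x_p, y_p) is the principal point, c the principal distance, r_ij the elements of the rotation matrix built from the attitude angles, and (X_c, Y_c, Z_c) the perspective-centre coordinates of the image in which the point appears:

x = x_p - c \, \frac{r_{11}(X - X_c) + r_{12}(Y - Y_c) + r_{13}(Z - Z_c)}{r_{31}(X - X_c) + r_{32}(Y - Y_c) + r_{33}(Z - Z_c)}

y = y_p - c \, \frac{r_{21}(X - X_c) + r_{22}(Y - Y_c) + r_{23}(Z - Z_c)}{r_{31}(X - X_c) + r_{32}(Y - Y_c) + r_{33}(Z - Z_c)}

For a linear array scanner, the coordinate along the flight direction is constant within each 1-D image (often set to zero), and the EOP entering these equations change from one 1-D image to the next.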
In contrast, a scene is the recorded sensory data associated with one exposure station (as in frame images) or more (as in linear array scanners), covering a nearly continuous area of the object space in a single short trip of the sensor. Therefore, in frame images, the terms image and scene are identical, while in linear array scanners, the scene is an array of consecutive 1-D images.
Figure 1. A sequence of 1-D images (a) constituting a scene (b); the scene axes are the scan column and the row number (or time).
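To make the image/scene distinction concrete, a scene can be represented as an ordered collection of 1-D images, each carrying its own EOP. A minimal sketch follows (Python; the class and field names are illustrative, not from the paper):

from dataclasses import dataclass
from typing import List
import numpy as np

@dataclass
class Image1D:
    # One 1-D image: a single scan line with its own exposure station.
    pixels: np.ndarray   # recorded sensory data of this scan line
    eop: np.ndarray      # (Xc, Yc, Zc, omega, phi, kappa) at the exposure time

@dataclass
class Scene:
    # A scene: consecutive 1-D images acquired in one short trip of the sensor.
    lines: List[Image1D]

    def eop_of_row(self, row: int) -> np.ndarray:
        # Each scene row uses the EOP of its own 1-D image.
        return self.lines[row].eop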
2.2 Rigorous Modelling of Linear Array Scanners
Rigorous (exact) modelling of linear array scanners describes
the actual geometric formation of the scenes at the time of
photography. That involves the Interior Orientation Parameters
(IOP) of the scanner and the EOP of each image in the scene.
Representation of EOP, adopted by different researchers (Lee and Habib, 2002; Lee et al., 2000; Wang, 1999; McGlone and Mikhail, 1981; Ethridge, 1977), includes:
* Polynomial functions - This is motivated by the fact that EOP do not abruptly change their values between consecutive images in the scene, especially for space-based scenes (see the sketch after this list).
* Piecewise polynomial functions - This option is preferable if the scene capture time is long and the variations of the EOP cannot be represented by a single set of polynomial functions.
* Orientation images — In this case, EOP of selected
images (called orientation images) within the scene
are explicitly dealt with. EOP of other images are
functions of those of the closest orientation images.
This option also avoids using one set of polynomial
functions throughout the scene.
* Non-polynomial representation — This option
explicitly deals with all the EOP associated with the
involved scene. Linear feature constraints can be used
to aid independent recovery of the EOP of the images
as well as to increase the geometric strength of the
bundle adjustment.
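The first and third options can be sketched as follows (Python; the function names, coefficient layout, and the linear interpolation between orientation images are illustrative assumptions, since the paper does not prescribe them):

import numpy as np

def eop_polynomial(t, coeffs):
    # EOP at time t modelled as one low-order polynomial per parameter.
    # coeffs: shape (6, order + 1); rows = (Xc, Yc, Zc, omega, phi, kappa),
    # columns = coefficients from the constant term upward.
    powers = t ** np.arange(coeffs.shape[1])
    return coeffs @ powers

def eop_orientation_images(t, t_nodes, eop_nodes):
    # EOP at time t derived from the EOP of explicitly estimated
    # "orientation images" at times t_nodes (assumed increasing).
    # Linear interpolation between the closest orientation images is used here.
    eop_nodes = np.asarray(eop_nodes)   # shape (num_nodes, 6)
    return np.array([np.interp(t, t_nodes, eop_nodes[:, k])
                     for k in range(eop_nodes.shape[1])])

Piecewise polynomial functions correspond to applying eop_polynomial with a different coefficient set per time segment.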
EOP are either directly available from navigation units such as
GPS/INS mounted on the platform, or indirectly estimated using
ground control in bundle adjustment (Habib and Beshah, 1998;
Habib et al., 2001; Lee and Habib, 2002). For indirect
estimation of the polynomial coefficients using Ground Control
Points (GCP), instability of the bundle adjustment exists,
especially for space-based scenes (Wang, 1999; Fraser et al.,
2001). This is attributed to the narrow Angular Field of View
(AFOV) of space scenes, which results in very narrow bundles
in the adjustment procedures. For this reason, a different model,
parallel projection, will be sought for the analysis in Section 3.
3. PARALLEL PROJECTION
3.1 Motivations
The motivations for selecting the parallel projection model to
approximate the rigorous model are summarized as follows:
* Many space scanners have narrow AFOV; e.g., it is less than 1° for IKONOS scenes. For narrow AFOV, the perspective light rays become closer to being parallel.
* Space scenes are acquired within a short time; e.g., it is about one second for IKONOS scenes. Therefore, the scanner can be assumed to maintain the same attitude during scene capture. As a result, the planes containing the images and their perspective centres are parallel to each other.
* For scenes captured in a very short time, the scanner can be assumed to move with constant velocity. In this case, the scanner travels equal distances in equal time intervals. As a result, equal object distances are mapped into equal scene distances.
Therefore, many space scenes, such as IKONOS, can be
assumed to comply with parallel projection.
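As a rough order-of-magnitude check of the first motivation (the figures used here are nominal IKONOS values, roughly an 11 km swath imaged from about 680 km altitude, and are not quoted above):

\mathrm{AFOV} \approx 2\arctan\!\left(\frac{11/2}{680}\right) \approx 0.93^{\circ}

so even the two outermost rays of a scan line deviate from parallel by less than one degree.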
3.2 Forms of Parallel Projection Model
Figure 2 depicts a scene captured according to parallel
projection. Scene parallel projection parameters include: two
components of the unit projection vector (L, M); orientation angles of the scene (ω, φ, κ); two shift values (Δx, Δy); and a scale factor (s). Utilizing these parameters, the relationship between an object space point P(X, Y, Z) and the corresponding scene point p(x, y) can be expressed as:
\begin{bmatrix} x \\ y \\ 0 \end{bmatrix} =
\lambda \, s \, R^{T} \begin{bmatrix} L \\ M \\ N \end{bmatrix} +
s \, R^{T} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} +
\begin{bmatrix} \Delta x \\ \Delta y \\ 0 \end{bmatrix} \qquad (1)
where:
R is the rotation matrix between the object and scene
coordinate systems;
N is the Z-component of the unit projection vector, i.e., $N = \sqrt{1 - L^{2} - M^{2}}$; and
λ is the distance between the object and image points, which can be computed from the third equation in (1).
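A minimal numeric sketch of the forward form of Equation (1) follows (Python; the function names and the rotation-angle convention are assumptions made for illustration, not specified above):

import numpy as np

def rotation_matrix(omega, phi, kappa):
    # R built from the scene orientation angles; the sequence
    # R = R_x(omega) @ R_y(phi) @ R_z(kappa) is assumed here.
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def parallel_project(P, L, M, omega, phi, kappa, s, dx, dy):
    # Map object point P = (X, Y, Z) to scene coordinates (x, y) via Eq. (1).
    N = np.sqrt(1.0 - L**2 - M**2)      # Z-component of the unit projection vector
    R = rotation_matrix(omega, phi, kappa)
    a = s * R.T @ np.array([L, M, N])   # coefficient of lambda in Eq. (1)
    b = s * R.T @ np.asarray(P, dtype=float)
    lam = -b[2] / a[2]                  # third row of Eq. (1): 0 = lam*a[2] + b[2]
    p = lam * a + b + np.array([dx, dy, 0.0])
    return p[0], p[1]                   # p[2] is zero by construction

The sign and order conventions of R must match those used when the eight parameters (L, M, ω, φ, κ, Δx, Δy, s) are estimated; only the structure of Equation (1) is fixed by the model.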