functions) or by a free-form shape. Straight-line segments have been chosen as the registration primitives for the following reasons:
• Straight lines are easier to detect, and the correspondence problem between conjugate features in the input imagery becomes simpler.
• It is straightforward to develop mathematical constraints (similarity measures) describing the correspondence of conjugate straight-line segments.
• Free-form linear features can be represented with sufficient accuracy as a sequence of straight-line segments (polylines), as sketched below.
It should be mentioned that the proposed approach in this paper does not require the end points of conjugate line segments to correspond.
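How a free-form feature is reduced to such a polyline is not the focus of this paper; purely as an illustration, the following Python sketch uses a standard Douglas-Peucker-style simplification (not necessarily the procedure adopted by the authors) to keep only the vertices of a digitized curve that deviate from a straight-line fit by more than a chosen tolerance.

```python
import numpy as np

def simplify_to_polyline(points, tol):
    """Approximate a digitized free-form curve (an N x 2 array of
    points) by a polyline whose vertices deviate from the curve by at
    most `tol`; Douglas-Peucker-style recursion, assuming the first
    and last points are distinct."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    a, b = points[0], points[-1]
    chord = b - a
    normal = np.array([-chord[1], chord[0]]) / np.linalg.norm(chord)
    dists = np.abs((points - a) @ normal)   # deviation from the chord
    i = int(np.argmax(dists))
    if dists[i] <= tol:
        return np.vstack([a, b])            # one straight segment suffices
    left = simplify_to_polyline(points[: i + 1], tol)
    right = simplify_to_polyline(points[i:], tol)
    return np.vstack([left[:-1], right])    # drop the duplicated vertex
```

The vertices returned this way define the straight-line segments that enter the registration as primitives.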
Once straight lines are adopted as the most suitable primitive to
be used in the registration process, the next step is to select a
valid and proper transformation function that can faithfully
represent the transformation between the conjugate straight
lines identified in the input and reference images.
3. REGISTRATION TRANSFORMATION FUNCTIONS
At this stage, one should establish a registration transformation
function that mathematically relates geometric attributes of
corresponding primitives. Given a pair of images, reference and
input images, the registration process attempts to find the
relative transformation between these images. Selecting the type of spatial transformation needed to properly overlay the input and reference images is one of the most fundamental and difficult tasks in any image registration technique. Images involved in
the registration process might have been taken from different
viewpoints, under different conditions, using different imaging
technologies, or at different times. The registration
transformation function must suit multi-resolution and multi-
spectral images that might have been captured under different
circumstances.
There has been an increasing trend within the photogrammetric
community towards using approximate models to describe the
mathematical relationship between the image and object space
points for scenes captured by high altitude line cameras with
narrow angular field of view (e.g., IKONOS, SPOT, LANDSAT, EROS-A1, QUICKBIRD, and ORBVIEW).
Among these models, Rational Function Models (RFM) are
gaining popularity since they can handle any type of imagery
without the need for a comprehensive understanding of the
operational principles of the imaging system (Tao and Hu,
2001). RFM are ratios of polynomial functions that express the image coordinates as a function of the object space coordinates.
RFM have been extensively used in processing satellite scenes
in the absence of the rigorous sensor model (e.g., IKONOS
scenes). However, using RFM would not allow for the development of a closed-form transformation function between the coordinates of conjugate points in the reference and input images.
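In the usual formulation (e.g., the rational polynomial coefficient form distributed with IKONOS imagery), each image coordinate is written as a ratio of two polynomials, typically of third order, in the normalized object space coordinates:

\[
r = \frac{P_1(X, Y, Z)}{P_2(X, Y, Z)}, \qquad
c = \frac{P_3(X, Y, Z)}{P_4(X, Y, Z)}
\]

where r and c are the normalized row and column image coordinates, (X, Y, Z) are the normalized object space coordinates, and each P_i carries 20 coefficients in the third-order case. Because these ratios cannot be inverted into a simple expression, no closed-form point-to-point mapping between two such images follows directly from them.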
For scenes captured by high altitude line cameras with narrow
angular field of view, parallel projection approximates the
mathematical relationship between image and object space
coordinates (Habib and Morgan, 2002). Image to object space
coordinate transformation using parallel projection involves
eight parameters. For relatively planar object space (i.e., height
variation within the object space is very small compared to the
flying height), the parallel projection can be simplified to an
affine transformation involving six parameters. In other words,
corresponding image coordinates (in either the reference or the input image) and the planimetric object coordinates are related
through a six-parameter affine transformation. Due to the
transitive property of an affine transformation, the relationship
between corresponding coordinates in the input and reference
images can be represented by an affine transformation as well.
For situations where the image plane is almost parallel to the object
space, the affine transformation function can be approximated
by a 2-D similarity transformation. Once again, since similarity
transformation is transitive, coordinates of conjugate points in
the reference and input image can be related to each other
through a 2-D similarity transformation (Figure 2).
Figure 2. Approximate models: parallel projection (narrow AFOV, high altitude, planar surface) simplified to a standard affine and further to a 2-D similarity transformation
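As a rough illustration of the two approximate models, the following Python sketch (function names and parameter ordering are chosen here for illustration, not taken from the paper) applies a six-parameter affine and a four-parameter 2-D similarity transformation to planar coordinates.

```python
import numpy as np

def affine_2d(params, xy):
    """Six-parameter 2-D affine transformation:
    x' = a0 + a1*x + a2*y,  y' = b0 + b1*x + b2*y."""
    a0, a1, a2, b0, b1, b2 = params
    x, y = xy[..., 0], xy[..., 1]
    return np.stack([a0 + a1 * x + a2 * y,
                     b0 + b1 * x + b2 * y], axis=-1)

def similarity_2d(params, xy):
    """Four-parameter 2-D similarity transformation
    (two shifts, a uniform scale, and a rotation angle)."""
    tx, ty, s, theta = params
    x, y = xy[..., 0], xy[..., 1]
    return np.stack([tx + s * (np.cos(theta) * x - np.sin(theta) * y),
                     ty + s * (np.sin(theta) * x + np.cos(theta) * y)],
                    axis=-1)
```

The similarity transformation is the special case of the affine with a1 = b2 and a2 = -b1 (equal scales along both axes and no shear), and composing two affine (or two similarity) transformations yields another transformation of the same type, which is the transitivity argument used above.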
After discussing the choice of the most appropriate registration
primitives as well as the transformation function between the
reference and input images, one can proceed to the third issue
of the registration paradigm: the similarity measure.
4. SIMILARITY MEASURE
The similarity measure, which mathematically describes the
coincidence of conjugate line segments after applying the
registration transformation function, incorporates the attributes
of the registration primitives to derive the necessary
constraint(s) that can be used to estimate the parameters of the
transformation function relating the reference and input images.
In other words, having two datasets, which represent the
registration primitives (straight-line segments) that have been
manually or automatically extracted from the input and
reference images, one should derive the necessary constraints to
describe the coincidence of conjugate primitives after applying
the appropriate registration transformation function.
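As a minimal sketch of one way such a coincidence constraint can be written (assuming the hypothetical affine_2d helper from the previous sketch; the paper's own measure is developed below for segments (1-2) and (AB)), the transformed end points of a segment in the reference image can be required to have zero perpendicular distance to the infinite line through the corresponding segment in the input image, so that the end points themselves need not correspond.

```python
import numpy as np

def point_to_line_distance(p, a, b):
    """Signed perpendicular distance from point p to the infinite
    line through points a and b (all 2-D numpy arrays)."""
    ab = b - a
    normal = np.array([-ab[1], ab[0]]) / np.linalg.norm(ab)
    return float(np.dot(p - a, normal))

def coincidence_residuals(params, ref_segment, input_segment, transform):
    """Two residuals per line pair: distances of the transformed
    reference end points from the line defined by the input segment.
    Driving these residuals to zero, e.g. in a least-squares
    adjustment over many line pairs, estimates the parameters of the
    registration transformation function."""
    p1, p2 = ref_segment
    a, b = input_segment
    return [point_to_line_distance(transform(params, p1), a, b),
            point_to_line_distance(transform(params, p2), a, b)]
```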
Figure 3. Similarity measure using straight-line segments (conjugate segments related through the registration transformation function)
Let's assume that we have a line segment (1-2) in the reference
image, which corresponds to the line segment (AB) in the input