2. METHODS
Non-ideal panoramic camera rigs introduce an offset in the projection centres, which causes perspective errors. The amount of perspective error due to such an offset depends on both the distance between the camera system and the targets and the depth of the target.
In this work, we use a Matlab simulation program that we developed to calculate the perspective error introduced by a projection centre offset. We use one camera as a reference with an arbitrary projection centre location, which is given as an input to the simulation program. We can also specify two target planes: one closer to the camera and a second one further away. The closer plane is smaller so that it does not occlude the second plane. The amount of perspective error is calculated by placing another camera beside the reference camera with a slightly shifted projection centre location (Figure 1). Furthermore, we are able to specify any shooting distance, object size and projection centre offset in all three dimensions of the object coordinate system.
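As an illustration, the simulation inputs described above could be collected as follows. This is a minimal Matlab sketch; the variable names and the plane sizes are our illustrative assumptions, while the offset, distances and rotation step match the example values used later in Figures 1 and 2.

    % Minimal sketch of the simulation inputs. Variable names and plane
    % sizes are illustrative assumptions, not taken from the actual program.
    cfg.firstPlaneDistance  = 20;         % shooting distance to the closer target plane (units)
    cfg.secondPlaneDistance = 30;         % distance to the larger target plane behind it (units)
    cfg.firstPlaneHalfSize  = 2;          % the closer plane is kept small so it does not occlude the second one
    cfg.secondPlaneHalfSize = 10;
    cfg.centreOffset        = [5; 0; 0];  % projection centre offset in X, Y and Z of the object coordinate system
    cfg.rotationStep        = 20;         % panoramic rotation increment in degrees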
Once we have established the geometry of the system, we can
rotate the camera in order to illustrate the panoramic rotation.
The ideal panoramic image is achieved when there is no offset between the projection centre locations. In such a case, the perspective is constant because the coordinates of the projection centres are fixed during the rotation. If a projection centre offset does exist, the rotation in relation to the reference projection centre causes the projection centre locations of the sub-images to form a circular path around the reference projection centre. This creates a set of unique perspectives, which leads to varying perspective errors.
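Assuming a rotation about the vertical axis through the reference projection centre, this circular path can be sketched as follows; the rotation matrix and the list of angles are assumptions for illustration, with the 20° step matching Figure 2.

    % Sketch: eccentric projection centre locations during the panoramic rotation,
    % rotating about the Y (up) axis through the reference projection centre.
    offset  = [5; 0; 0];                  % projection centre offset (units)
    angles  = 0:20:340;                   % rotation positions in degrees
    centres = zeros(3, numel(angles));
    for k = 1:numel(angles)
        a  = angles(k);
        Ry = [cosd(a) 0 sind(a); 0 1 0; -sind(a) 0 cosd(a)];
        centres(:, k) = Ry * offset;      % traces a circle of radius norm(offset) around the reference centre
    end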
[Figure 1: Side, Top and Perspective views]
Figure 1. Matlab simulation program showing a geometry with a 5 unit offset in the X-direction, a 20 unit distance to the first target plane and a 30 unit distance to the second target plane. The first target plane is illustrated in orange and the second target plane as a white grid in the perspective view. Image planes are shown in blue.
The simulation program calculates rays starting from the projection centres of the camera locations, passing through the corners of the first target plane and continuing onto the second target plane. The corner points of the first target plane are named "Point 1" and "Point 2" throughout this article. The offset of the projection centre causes the rays not to reach the same coordinates on the second plane. This gives us an estimate of the perspective error in the object space, simulating a real-world situation where we know the offset of the projection centre, the shooting distance and the target depth. By target depth we mean the distance between the two target planes. In Figure 1, we distinguish the rays that pass through the same point on the first target plane with different colours (Point 1, red lines; Point 2, green lines). The distance between two corresponding rays on the second target plane indicates the amount of shadowing due to the perspective error. This error also causes physical shifts of image features on the image plane, which are detected as misalignment when sub-images are stitched into a panoramic image.
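A minimal sketch of this ray calculation, using the Figure 1 values (5 unit offset, 20 and 30 unit plane distances) and an assumed corner coordinate for Point 1, would be:

    % Sketch: perspective error on the second target plane for one corner point.
    D1 = 20;  D2 = 30;                    % distances to the first and second target planes (units)
    offset = [5; 0; 0];                   % projection centre offset in X
    P1  = [2; 2; D1];                     % corner of the first target plane ("Point 1", assumed coordinates)
    ref = [0; 0; 0];                      % reference projection centre
    ecc = ref + offset;                   % shifted projection centre

    % Intersect the two rays through P1 with the plane Z = D2.
    hitRef = ref + (D2 - ref(3)) / (P1(3) - ref(3)) * (P1 - ref);
    hitEcc = ecc + (D2 - ecc(3)) / (P1(3) - ecc(3)) * (P1 - ecc);

    err = norm(hitRef - hitEcc);          % = offset_x * (D2 - D1) / D1 = 5 * 10 / 20 = 2.5 units

In this geometry the object-space error scales with the offset and with the target depth (D2 - D1) relative to the distance D1 to the first target plane.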
3. RESULTS
In our simulation, we specify the coordinates in project units without any prefix, but they can be considered metric units. In this particular simulation example, the first target plane is 20 units away from the panoramic rotation axis and the distance to the second target plane is 30 units. The camera constant is four units and the image plane size is 4 x 4 units. This translates to a 53.13° horizontal and vertical FoV for each sub-image. The Z-axis is set parallel to the initial attitude of the imaging axis of the reference camera, and the X-axis is perpendicular to the Z-axis and parallel to the width of the image plane. In addition, the Y-axis is perpendicular to the X-axis and points up.
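As a quick check, the stated FoV follows directly from the camera constant and the image plane size:

    % Quick check of the stated field of view.
    c   = 4;                              % camera constant (units)
    s   = 4;                              % image plane width and height (units)
    fov = 2 * atand((s / 2) / c);         % = 53.13 degrees, horizontally and vertically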
A concentric camera rotation forms a geometry in which there is only a single projection centre, offering a common perspective to all sub-images. Figure 2 illustrates an ideal concentric panoramic rotation with sub-images taken at 20° increments. In such a case, there is no perspective error in the object space. The program plots red and green observation lines for every image rotation. Concentric image acquisition leads to single lines (Figure 2) and eccentric image acquisition to a bundle of lines (Figures 3-6). The orange plane in Figures 1-5 is the first target plane and the white grid in the background is the second target plane, on which we calculate the perspective error. In addition, the simulation always shows the reference camera image plane. This can best be seen in Figures 5 and 6 and should not be confused with the image planes belonging to a group formed by an eccentric rotation.
Figure 2. Concentric panoramic sub-image acquisition with the frame camera geometry using 20° rotation increments. Because this camera setup does not cause perspective errors, all rotation positions lead to the same simulation rays (red and green lines).
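The single-line versus bundle-of-lines behaviour can be reproduced with the sketches above: with zero offset every rotation position yields the same ray through a corner point, whereas a non-zero offset spreads the intersection points on the second target plane. The set-up below reuses the assumed values from the earlier sketches.

    % Sketch: concentric (zero offset) versus eccentric (5 unit offset) acquisition.
    P1 = [2; 2; 20];  D2 = 30;  angles = 0:20:340;
    for offsetX = [0 5]
        hits = zeros(3, numel(angles));
        for k = 1:numel(angles)
            a  = angles(k);
            Ry = [cosd(a) 0 sind(a); 0 1 0; -sind(a) 0 cosd(a)];
            C  = Ry * [offsetX; 0; 0];    % projection centre at this rotation position
            hits(:, k) = C + (D2 - C(3)) / (P1(3) - C(3)) * (P1 - C);  % intersection with the second plane
        end
        spread = max(hits, [], 2) - min(hits, [], 2);
        fprintf('offset %d: spread on the second plane = %.2f units\n', offsetX, max(spread));
    end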