• Detection of linear features in images.
• Correction of systematic errors.
• Optimization of the linear features.
3.1. Image Smoothing Filter
A smoothing filter is a general notion of transforming a
digitized image in some way in order to improve picture
quality. It mainly consists of removing noise, deblurring
object edges, and highlighting specified features.
This paper uses edge-preserving smoothing, which
searches for the most homogeneous neighborhood of each
pixel and assigns to it the average gray value of that
neighborhood. Homogeneity is expressed in terms
of variance. When the pixel under consideration lies on
an edge there will be, when moving away from it, directions
where the variance is low, i.e., the pixel belongs to that
region, and directions with high variance. The principal
notion is to rotate an elongated mask around the pixel
at some interval (e.g., 45°) and to compute the
variance of the gray values in the bar. The average of
the gray values of the bar with the smallest variance is
assigned to the pixel.
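As a rough sketch, the rotating-bar scheme described above might be implemented as follows. The function name, the bar length of 5 pixels, and the 45° step are illustrative assumptions, not parameters taken from the paper:

```python
import numpy as np

def edge_preserving_smooth(img, bar_len=5, step=45):
    """Assign to each pixel the mean gray value of the oriented bar
    (elongated mask) with minimal variance. bar_len and step (degrees)
    are assumed values for illustration."""
    h, w = img.shape
    half = bar_len // 2
    # Pixel offsets for bars rotated at 0, 45, 90, 135 degrees
    offsets = []
    for a in range(0, 180, step):
        rad = np.deg2rad(a)
        dy, dx = np.sin(rad), np.cos(rad)
        offsets.append([(round(k * dy), round(k * dx))
                        for k in range(-half, half + 1)])
    out = img.astype(float).copy()
    for y in range(half, h - half):
        for x in range(half, w - half):
            best_var, best_mean = None, None
            for offs in offsets:
                vals = [img[y + oy, x + ox] for oy, ox in offs]
                v = np.var(vals)
                if best_var is None or v < best_var:
                    best_var, best_mean = v, np.mean(vals)
            out[y, x] = best_mean  # mean of the most homogeneous bar
    return out
```

For a pixel on a vertical step edge, the bar aligned with the edge lies entirely inside one region, so its variance is zero and the edge is not blurred.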
3.2. Extracting Edges
Edges of objects (e.g., buildings) in an image are
defined as local discontinuities in gray value.
These may result from a depth discontinuity,
a surface normal discontinuity, a reflectance
discontinuity, or an illumination discontinuity in the
scene.
Edge detection has been an important part of many
computer vision systems and is widely described in
textbooks and scientific works. There are two main
types of edge detection techniques: differential and
template matching techniques. The former performs
discrete differentiation of the digital image array
to produce a gradient field in which, at each pixel,
gradients are combined by a non-linear point operation
to create an edge enhancement array prior to a
threshold operation. The template matching technique
is based on a set of masks representing discrete
approximations to ideal edges of various orientations,
which are applied simultaneously to produce the
gradient field. In that case, the enhancement is formed
by choosing the maximum of the gradient arrays
corresponding to each mask. For each type of edge
detection technique, a large number of operators have
been proposed by different authors.
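To make the template matching technique concrete, here is a minimal sketch using four Prewitt-style compass masks (the remaining four orientations are their negations, which the absolute value covers). The mask set and the function name are assumptions for illustration; the paper does not single out a particular template operator:

```python
import numpy as np

def compass_edge(img):
    """Template matching edge enhancement: apply oriented edge masks
    and keep the maximum absolute response at each pixel."""
    masks = [
        np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]]),    # horizontal edge
        np.array([[-1, -1, 0], [-1, 0, 1], [0, 1, 1]]),    # 45-degree edge
        np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]),    # vertical edge
        np.array([[0, 1, 1], [-1, 0, 1], [-1, -1, 0]]),    # 135-degree edge
    ]
    h, w = img.shape
    out = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            # enhancement = maximum over all mask responses
            out[y, x] = max(abs(np.sum(patch * m)) for m in masks)
    return out
```

The maximum over the mask responses plays the role of the non-linear point operation described above; a threshold would then be applied to `out`.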
In our method, we employed the Sobel operator to
strengthen the edges and then used a dynamic
programming line-following method to extract these
lines.
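The Sobel enhancement step can be sketched as a plain gradient-magnitude computation (the dynamic programming line follower is omitted, as its details are not given in this section; the function name is illustrative):

```python
import numpy as np

def sobel_magnitude(img):
    """Edge strengthening with the Sobel operator: convolve with the
    horizontal and vertical kernels and combine into a gradient magnitude."""
    gx_k = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])  # x-derivative kernel
    gy_k = gx_k.T                                          # y-derivative kernel
    h, w = img.shape
    mag = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = np.sum(patch * gx_k)
            gy = np.sum(patch * gy_k)
            mag[y, x] = np.hypot(gx, gy)  # gradient magnitude
    return mag
```

Thresholding `mag` yields the strengthened edge pixels that the line-following stage would then trace.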
3.3. Systematic Errors Correction
The edge pixel coordinates are defined in the frame
reference system. These coordinates must be
transformed to the best positions in the image by
correcting systematic errors such as radial distortion,
decentring distortion, the scale difference between the
horizontal and vertical directions, and translations of
the principal point. The systematic errors are corrected
using the following equations:
x_i = x_f − x_0 + (x_f − x_0) · k_1 · r² + (x_f − x_0) · d_x
y_i = y_f − y_0 + (y_f − y_0) · k_1 · r²                        (18)
where:
x_i and y_i are the image coordinates of a pixel
related to the principal point;
x_f and y_f are the coordinates of the same pixel in
the frame;
x_0 and y_0 are the image coordinates of the principal
point;
r is the distance of the pixel to the principal point;
k_1 is the coefficient of radial distortion (higher
order coefficients and decentring distortion are
neglected);
d_x is the scale factor in x.
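A direct transcription of equation (18) might look as follows. Note that the equation is heavily garbled in the source, so the signs of the correction terms are an assumption, and the function name is hypothetical:

```python
def correct_systematic_errors(xf, yf, x0, y0, k1, dx):
    """Transform frame coordinates (xf, yf) to image coordinates relative
    to the principal point (x0, y0), applying the radial distortion
    coefficient k1 and the x scale factor dx, per equation (18)."""
    # squared distance of the pixel to the principal point
    r2 = (xf - x0) ** 2 + (yf - y0) ** 2
    xi = (xf - x0) + (xf - x0) * k1 * r2 + (xf - x0) * dx
    yi = (yf - y0) + (yf - y0) * k1 * r2
    return xi, yi
```

With k1 = 0 and dx = 0 the transformation reduces to a pure shift of the origin to the principal point, which is a useful sanity check.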
3.4. Optimization of the Linear Features
Once the systematic errors of the image have been
corrected, the straight lines in the image plane can be
expressed in the form of equation (1) by least squares
adjustment at sub-pixel precision.
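Since equation (1) is not reproduced in this excerpt, the sketch below assumes the line is expressed in the normal form a·x + b·y + c = 0 and fits it to a set of edge pixels by orthogonal (total) least squares; the function name and the choice of line parameterization are assumptions:

```python
import numpy as np

def fit_line_tls(xs, ys):
    """Orthogonal least-squares fit of a straight line to edge pixels.
    Returns (a, b, c) with a*x + b*y + c = 0 and a^2 + b^2 = 1."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    mx, my = xs.mean(), ys.mean()
    pts = np.column_stack([xs - mx, ys - my])
    # The line direction is the principal axis of the centered point cloud,
    # i.e. the right singular vector with the largest singular value.
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    d0, d1 = vt[0]             # direction of the line
    a, b = -d1, d0             # unit normal to the line
    c = -(a * mx + b * my)     # line passes through the centroid
    return a, b, c
```

Because the fit minimizes perpendicular distances, the recovered line position is not limited to integer pixel coordinates, which is what gives the sub-pixel precision mentioned above.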
4. EXPERIMENTAL RESULTS
One simulated test and one real test were conducted
in order to check the potential and effectiveness of
the developed camera calibration scheme. The
simulated control lines, which were extracted from
a cube rendered with 3D Studio, were used to demonstrate
the whole procedure of the method. The real data were
used to compare the point-based method with the new
one presented here. For the former, the control target
was a cubic box. Figure 3 illustrates the calibration
target. Table 1 lists the orientation parameters of the
simulated camera.
Figure 3. Simulated Control Target.
(a) Original Image of Control Target.
(b) Edge …
(c) Edge …
(d) Stra… Leas…
Table 1. Interior Orientation Parameters and Exterior
Orientation Parameters of the simulated camera. …
Table 2 lists the simulated control lines and their o…
Table 2. Simulated Control Lines.
L0: 188.19 | L1: 154.08 | L2: 39.65 | L3: 241.4 |
L4: 172.91 | L5: 42.58 | L6: 238.44 | L7: 235.29
The software … was developed … and executed …
under WINDOWS …. … optical distortion …
coordinates; … systematic errors … principal
point … exterior orientation … lines can be …
at sub-pixel precision.
Table 3 lists the … result.
Table 3. …