formulation for a plane

$$\begin{pmatrix} x \\ y \\ z \end{pmatrix}_w = \begin{pmatrix} x_1 \\ y_1 \\ z_1 \end{pmatrix}_w + \mu \begin{pmatrix} x_2 - x_1 \\ y_2 - y_1 \\ z_2 - z_1 \end{pmatrix}_w + \nu \begin{pmatrix} x_3 - x_1 \\ y_3 - y_1 \\ z_3 - z_1 \end{pmatrix}_w \qquad (\mu, \nu \in \mathbb{R}) \tag{5}$$
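As a numerical sketch of this parametric representation, a point on the light plane can be evaluated from three points spanning the plane; the three points and parameter values below are hypothetical examples, not measured data:

```python
import numpy as np

def plane_point(p1, p2, p3, mu, nu):
    """Point on the plane through p1, p2, p3 for parameters mu, nu, as in Eq. (5)."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    return p1 + mu * (p2 - p1) + nu * (p3 - p1)

# Three arbitrary (hypothetical) points on the plane, in world coordinates:
p = plane_point((0, 0, 0), (1, 0, 0), (0, 1, 1), mu=0.5, nu=2.0)
```

Varying mu and nu over all real values sweeps out the entire light plane.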
DETECTION OF THE PROFILE
The second geometrical locus is calculable from the profile position in the computer image by performing the coordinate transformations of all steps through the detecting system. Normally, the forward transformations (following the light propagation) from the workpiece surface in world coordinates (x, y, z)_w to the computer image in frame grabber coordinates (f_x, f_y)_f are discussed. This chapter shows the inverse transformations from a given position in the computer image to the corresponding ray in world coordinates.
The detection of the profile comprises two imaging steps: first, the optical imaging of the profile onto the CCD sensor; second, the electro-optical imaging from the sensor into the frame storage of the host computer.
Transforming computer image to sensor coordinates
Starting from a given position (f_x, f_y)_f in the computer image, the first step is the calculation of the corresponding position (p_x, p_y)_c on the CCD sensor of the camera.
Camera and frame grabber are connected according to the CCIR video standard. Therefore frame grabber pixels and CCD sensor elements have no definite correlation, and different scale factors for the x- and y-coordinates must be applied. The image is transmitted line by line, and the vertical scale factor is given by the vertical distance d_y of the sensor elements. The different numbers of sensor elements N_s and frame grabber pixels N_f in each line cause a horizontal scale factor which differs from the horizontal distance d_x of adjacent sensor elements. The transformation formulae from a point (f_x, f_y)_f in the computer image to the corresponding point (p_x, p_y)_c on the CCD sensor are given by Eq. (6)
$$\begin{pmatrix} p_x \\ p_y \\ p_z \end{pmatrix}_c = \begin{pmatrix} (f_x - C_x)\, d_x\, N_s / N_f \\ (f_y - C_y)\, d_y \\ z_0 \end{pmatrix}_c \tag{6}$$
where:
d_x, d_y   distances of adjacent sensor elements
C_x, C_y   position of the optical axis of the sensor (in computer image coordinates)
N_s, N_f   number of pixels in each horizontal line of the sensor and frame grabber, respectively
z_0        focal distance between CCD sensor and objective (z_0 < 0)
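A minimal sketch of Eq. (6) follows; all parameter values (sensor pitch, center coordinates, pixel counts, focal distance) are hypothetical examples and not the calibration data of the presented system:

```python
import numpy as np

def frame_to_sensor(fx, fy, dx, dy, cx, cy, ns, nf, z0):
    """Eq. (6): frame grabber position (fx, fy) -> sensor point (px, py, pz)_c.
    dx, dy: element spacing; cx, cy: optical axis in image coordinates;
    ns, nf: elements per line on sensor and in frame grabber; z0 < 0."""
    px = (fx - cx) * dx * ns / nf   # horizontal scale corrected by ns/nf
    py = (fy - cy) * dy             # vertical scale given directly by dy
    return np.array([px, py, z0])

# Hypothetical camera parameters (spacings in mm, z0 in mm):
p = frame_to_sensor(fx=320, fy=240, dx=0.011, dy=0.011,
                    cx=256, cy=256, ns=756, nf=512, z0=-25.0)
```

The ratio N_s/N_f corrects for the resampling that the CCIR video link introduces along each line.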
The center coordinates C_x, C_y denote the position of the optical axis of the sensor in frame grabber coordinates. They are evaluated by applying the direct optical method described in [10], which is similar to autocollimation. A He-Ne laser beam is pointed at the front lens of the camera's objective. Then the camera is adjusted such that all reflected beams (caused by multiple reflections on all optical surfaces of the lens assembly) coincide with the primary laser beam. The laser beam is now aligned with the optical axis of the camera (the z_c-axis), and the computer image coordinates of the laser beam are the center coordinates C_x, C_y.
In the presented system, the use of a He-Ne laser for the autocollimation procedure has a great advantage. A bandpass interference filter is built into the CCD camera (between objective and CCD sensor) to suppress ambient light. This filter passes only the laser diode wavelength and attenuates the He-Ne laser wavelength by several orders of magnitude. Hence the He-Ne laser beam can be imaged on the CCD sensor without any additional attenuation filter, which would probably cause distortions or lateral shifts of the He-Ne laser's image and, as a consequence, errors in the evaluation of the coordinates C_x, C_y.
Transforming a sensor position to a ray
The next step is the transformation of a point on the CCD sensor (given by Eq. (6)) to the corresponding ray in world coordinates. The transformation is based on the simple principle of perspective projection with pinhole camera geometry. Hence the corresponding ray for a position P(p_x, p_y, p_z)_c on the sensor is the straight line from this sensor position through the origin O(0, 0, 0)_c of the camera coordinate system (which is the principal point of the objective). The two-point formulation of the ray in camera coordinates is given by Eq. (7)
$$\begin{pmatrix} x \\ y \\ z \end{pmatrix}_c = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}_c + \lambda \begin{pmatrix} p_x \\ p_y \\ p_z \end{pmatrix}_c \qquad (\lambda \in \mathbb{R}) \tag{7}$$
Transforming camera to world coordinates
Perspective projection transforms straight lines to straight lines. Hence the transformation of a ray can be done by individually transforming the two determining points O(0, 0, 0)_c and P(p_x, p_y, p_z)_c into the world coordinate system. The transformation of any point (x, y, z)_c from the camera coordinate system {x_c, y_c, z_c} to the world coordinate system {x_w, y_w, z_w} is given by the well-known Eq. (8)
$$\begin{pmatrix} x \\ y \\ z \end{pmatrix}_w = \begin{pmatrix} \cos(x, x_c) & \cos(y, x_c) & \cos(z, x_c) \\ \cos(x, y_c) & \cos(y, y_c) & \cos(z, y_c) \\ \cos(x, z_c) & \cos(y, z_c) & \cos(z, z_c) \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix}_c + \begin{pmatrix} t_x \\ t_y \\ t_z \end{pmatrix} \tag{8}$$
where the components of the Eulerian rotation matrix are the direction cosines of the base vectors of the two coordinate systems. The translation vector points from the new origin O_w to the old origin O_c and is identical with the translation vector of the camera origin in world coordinates.
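Eq. (8) can be sketched as follows; the pose used in the example (a rotation about the z-axis and an arbitrary translation) is hypothetical, chosen only to illustrate the transformation:

```python
import numpy as np

def camera_to_world(p_c, R, t):
    """Eq. (8): transform a point from camera to world coordinates.
    R is the Eulerian rotation matrix of direction cosines,
    t the translation of the camera origin in world coordinates."""
    return R @ np.asarray(p_c) + np.asarray(t)

# Hypothetical pose: camera rotated 90 degrees about the z-axis.
a = np.radians(90.0)
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0,        0.0,       1.0]])
t = np.array([10.0, 0.0, 5.0])
p_w = camera_to_world([1.0, 0.0, 0.0], R, t)   # approximately [10, 1, 5]
```

In practice R and t come from the extrinsic calibration of the camera; here they are placeholders.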
Finally, the ray in world coordinates, which corresponds to the point P(p_x, p_y, p_z)_c on the CCD sensor, or equivalently to the point (f_x, f_y)_f in the computer image, is determined by the two transformed points P_w and O_w in Eq. (9)
$$\begin{pmatrix} x \\ y \\ z \end{pmatrix}_w = \begin{pmatrix} t_x \\ t_y \\ t_z \end{pmatrix} + \lambda \begin{pmatrix} \cos(x, x_c) & \cos(y, x_c) & \cos(z, x_c) \\ \cos(x, y_c) & \cos(y, y_c) & \cos(z, y_c) \\ \cos(x, z_c) & \cos(y, z_c) & \cos(z, z_c) \end{pmatrix} \begin{pmatrix} p_x \\ p_y \\ p_z \end{pmatrix}_c \tag{9}$$
Eq. (9) gives the world coordinates for one geometrical locus of the profile. The intersection of this ray with the light plane generated by the scanner gives the 3-D world coordinates of one point of the workpiece surface.
Intersection of imaging ray and light plane
The calculation of the intersection point between the illuminating light plane according to Eq. (5) and the imaging ray according to Eq. (9) is done by equating both formulae.
$$\begin{pmatrix} x_1 \\ y_1 \\ z_1 \end{pmatrix}_w + \mu \begin{pmatrix} x_2 - x_1 \\ y_2 - y_1 \\ z_2 - z_1 \end{pmatrix}_w + \nu \begin{pmatrix} x_3 - x_1 \\ y_3 - y_1 \\ z_3 - z_1 \end{pmatrix}_w = \begin{pmatrix} t_x \\ t_y \\ t_z \end{pmatrix} + \lambda \begin{pmatrix} \cos(x, x_c) & \cos(y, x_c) & \cos(z, x_c) \\ \cos(x, y_c) & \cos(y, y_c) & \cos(z, y_c) \\ \cos(x, z_c) & \cos(y, z_c) & \cos(z, z_c) \end{pmatrix} \begin{pmatrix} p_x \\ p_y \\ p_z \end{pmatrix}_c \tag{10}$$
This system of linear equations yields fixed values for λ, μ, and ν, which give the 3-D coordinates of the intersection point after insertion into Eq. (9) (or, alternatively, into Eq. (5)).
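The linear system can be rearranged with the unknowns μ, ν, λ collected into one vector and solved directly. The sketch below assumes the light plane is spanned by three points p1, p2, p3 as in Eq. (5); the plane, pose, and ray in the example are hypothetical values chosen so the result is easy to check:

```python
import numpy as np

def intersect_ray_plane(p1, p2, p3, t, R, p_c):
    """Solve Eq. (10) for mu, nu, lambda and return the intersection point.
    p1, p2, p3 span the light plane (Eq. (5)); t, R and the sensor
    point p_c define the imaging ray (Eq. (9))."""
    p1, p2, p3, t, p_c = map(np.asarray, (p1, p2, p3, t, p_c))
    d = R @ p_c                                  # direction of the imaging ray
    # Columns: plane direction vectors and negated ray direction;
    # right-hand side: t - p1 (all constants moved to one side).
    A = np.column_stack((p2 - p1, p3 - p1, -d))
    mu, nu, lam = np.linalg.solve(A, t - p1)
    return t + lam * d                           # insertion into Eq. (9)

# Hypothetical case: plane z = 0, ray from (0, 0, 5) straight down.
pt = intersect_ray_plane((0, 0, 0), (1, 0, 0), (0, 1, 0),
                         t=(0.0, 0.0, 5.0), R=np.eye(3),
                         p_c=(0.0, 0.0, -1.0))   # -> [0., 0., 0.]
```

The system is singular only when the imaging ray is parallel to the light plane, in which case no intersection point exists.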