From the point of view of numerical
graphics, the elements were divided into
three categories:
3DPOINT Points defined with three coordinates;
3DPOLY  Three-dimensional polylines, generally closed;
3DFACE  Three-dimensional surfaces whose edges and characteristic points are defined.
In this case, successive automatic
processing steps make it possible either
to perform a polynomial approximation
defining the trend of the surface
equation, or to identify plane triangles
and build triangulated structures by
successive approximation for the final
computerized display.
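As a rough illustration of those two processing routes, here is a minimal sketch in Python, assuming the 3DFACE points are available as an N x 3 array; the file name and the quadric model are hypothetical, not taken from the original work:

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical ASCII file with one "x y z" triple per line.
pts = np.loadtxt("face_points.xyz")
x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]

# Route 1: polynomial approximation of the surface trend, here a
# least-squares quadric z = a + bx + cy + dx^2 + exy + fy^2.
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)

# Route 2: plane triangles over the (x, y) plan, giving a
# triangulated structure for the final computerized display.
tri = Delaunay(pts[:, :2])
print(coeffs, len(tri.simplices))
```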
The first point code (PERCONCI) is
the basic one that identifies the points
on the border of the basic unit, while
the second (INTCONCI) codes the points
with heights within the quoin.
The other surfaces identified within the
quoin were then, as already stated,
plotted as closed and coded polylines,
taking account of the different stylistic
or constructional characteristics of the
portion of monument examined.
To complete the restitution phase, the
graphics data editing operation, handled
by the same software that manages the
plotter (MACROS), becomes particularly
useful when working with the restitution
system described above.
It is possible to perform a whole series
of operations on the unit (partial or
total cancellation, displacement of
vertices, construction of a unit
connecting isolated points and so on)
which are often necessary for correcting
errors or making up for the inevitable
omissions of the restitution phase. A
typical example is the construction
(necessarily performed in editing) of the
unit referring to the lateral surfaces of
the architectural decorations, obtained
by joining single points belonging to
different planes.
At the end of these operations one
obtains an ASCII file of coordinates
which forms the starting point for
subsequent processing.
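Purely as an illustration of such editing operations, the sketch below assumes the ASCII file stores one unit per blank-line-separated block of "x y z" rows; the file name and layout are hypothetical, and the actual MACROS editor works on its own internal structures:

```python
import numpy as np

def load_units(path):
    # Split the ASCII coordinate file into one array per unit
    # (hypothetical layout: "x y z" blocks separated by blank lines).
    blocks = open(path).read().strip().split("\n\n")
    return [np.loadtxt(block.splitlines(), ndmin=2) for block in blocks]

units = load_units("restitution.asc")        # hypothetical file name
units[0][2] += np.array([0.01, 0.0, -0.02])  # displace a single vertex
# Join single points belonging to two different planes into a new
# closed unit, as for the lateral surfaces of a decoration.
bridge = np.vstack([units[0][0], units[1][1], units[1][2], units[0][0]])
units.append(bridge)
```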
4. DATA THINNING
The research also revealed the importance
of defining modalities of distribution
and appropriate density for points
acquired, in order to achieve the best
ratio between amount of data and
closeness to reality, according to the
scale of restitution desired. The
question becomes particularly important
in architectural work, in which a wrong
density could lead to definition of lines
differing from the real ones, with all
the consequences that would be entailed
in both historical interpretation and
analysis of statics. Too low a density of
points, for example, might be compensated
for by fitting a spline afterwards, with
a definition of curvatures that could be
completely different from the original.
Attention must therefore be
given to the acquisition phase, ensuring
that it is compatible with a possible
subsequent thinning operation. In our
research various techniques have been
used, which are described briefly below,
and methods that can be extended to most
cases have been tested.
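By way of illustration only, here is a minimal sketch of one widely used thinning scheme, the Ramer-Douglas-Peucker simplification; this is a generic baseline, not one of the techniques tested in the research or discussed in 4.1:

```python
import numpy as np

def rdp(points, tol):
    # Ramer-Douglas-Peucker: recursively keep only the points that
    # deviate more than tol from the chord joining the two endpoints.
    # Assumes an open polyline (distinct endpoints).
    if len(points) < 3:
        return points
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = np.hypot(dx, dy)
    d = np.abs(dx * (points[:, 1] - y0) - dy * (points[:, 0] - x0)) / norm
    i = int(np.argmax(d))
    if d[i] > tol:
        return np.vstack([rdp(points[: i + 1], tol)[:-1], rdp(points[i:], tol)])
    return points[[0, -1]]

line = np.array([[0, 0], [1, 0.05], [2, -0.02], [3, 1.0], [4, 1.02], [5, 1.0]])
print(rdp(line, tol=0.1))  # the sharp corner at x = 3 survives the thinning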
4.1 Recording systems
The ordinary acquisition systems used in
numerical plotters, based on various
types of real-time recording (single
point, space increment, time increment,
mixed space and time, vector) involve
different modalities to which there
correspond differing amounts of data
recorded for the same object. Problems of
memory occupation apart, these may also
generate differing approximations to
reality, depending on both the algorithm
and the increments imposed by the
operator. Even when working on the same
object, it is therefore easy to produce
totally non-homogeneous configurations of
data amounts and approximations.
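A minimal sketch of the vector modality just mentioned, under the assumption of a stream of measured points from which a new vertex is stored whenever the direction of travel changes by more than a set angular increment (expressed in gon, as in the restitution described below; the recording logic here is schematic, not the plotter's actual firmware):

```python
import numpy as np

GON_TO_RAD = np.pi / 200.0  # 400 gon = 2*pi rad

def vector_record(stream, increment_gon=5.0):
    # Store a vertex when the heading of the current step differs from
    # the heading at the last stored vertex by more than the increment.
    tol = increment_gon * GON_TO_RAD
    kept, heading = [stream[0]], None
    for p in stream[1:]:
        step = p - kept[-1]
        ang = np.arctan2(step[1], step[0])
        turn = 0.0 if heading is None else np.abs(np.angle(np.exp(1j * (ang - heading))))
        if heading is None or turn > tol:
            kept.append(p)
            heading = ang
    if not np.array_equal(kept[-1], stream[-1]):
        kept.append(stream[-1])  # the end point is always stored
    return np.array(kept)
```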
Hence the need for a procedure that
checks and, where necessary, thins the
data: one that makes it possible to store
data matching the characteristics of the
object and the specific prescriptions, in
a way fully analogous to the acquisition
of non-photogrammetric 2D data, such as
the digitization of existing graphics. In
the restitution performed, a vector
recording system with 5 gon increments
was used. It is interesting to analyze
the methods developed for automatic line
generalization when the scale of maps is
changed. Research in the fields of
pattern recognition, image processing and
computer vision is particularly concerned
with the problem of detecting corners,
and its findings are being studied for
application to the compression of
digitized data.
In our case such research can be adapted
to the problem of finding the principal
angles to insert in the approximate
geometry of finite element calculation.
A method proposed by Thapa in 1988
(Thapa, 1988), the "zero-crossing
algorithm", identifies the critical
points to record for different degrees of
generalization due to the change of
scale. Application of this algorithm,
however, is laborious, as it requires an
initial transformation from vector to
raster form (for example by the Bresenham
algorithm (Bresenham, 1965)) and a
subsequent filter-type analysis to reduce
noise.
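For reference, a minimal sketch of that vector-to-raster step using the standard Bresenham algorithm (the generic textbook form, not code from the works cited):

```python
def bresenham(x0, y0, x1, y1):
    # Standard Bresenham line rasterization: yields the grid cells
    # crossed by the segment (x0, y0)-(x1, y1), integer coordinates.
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        yield x0, y0
        if x0 == x1 and y0 == y1:
            return
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy

print(list(bresenham(0, 0, 5, 2)))  # raster cells of a vector segment
```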
Perhaps more interesting for our purpose
is the CVD algorithm (Commutator Vertex
Detection, i.e. detection of vertices by
commutating operators) proposed by
Anderson and Bezdek in 1984 (Anderson et
al., 1984).
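The commutating operators themselves are not reproduced here; as a simple stand-in conveying the same idea, the sketch below flags critical vertices by the turning angle of the digitized line (a generic detector, not the CVD method):

```python
import numpy as np

def critical_points(points, min_turn_deg=30.0):
    # Generic angle-based corner detector: flag interior vertices where
    # the polyline turns more sharply than min_turn_deg.
    v1 = points[1:-1] - points[:-2]   # incoming segment at each vertex
    v2 = points[2:] - points[1:-1]    # outgoing segment at each vertex
    turn = np.arctan2(v2[:, 1], v2[:, 0]) - np.arctan2(v1[:, 1], v1[:, 0])
    turn = np.degrees(np.abs((turn + np.pi) % (2 * np.pi) - np.pi))
    keep = np.concatenate([[True], turn > min_turn_deg, [True]])
    return points[keep]

square = np.array([[0, 0], [1, 0], [2, 0], [2, 1], [2, 2], [1, 2]])
print(critical_points(square))  # keeps the two 90-degree corners plus endpoints
```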
The CVD algorithm was tried by Thapa
(Thapa, 1990) and adapted to problems of
compressing data