[Most of this passage was lost in digitization. The surviving fragments describe the CARIS system of Universal Systems Ltd., the CARIS software's capabilities for digital data capture, creation, analysis and automated map production, and the image analysis commands available for displaying, enhancing (e.g., linear stretch and directional filters), registering, ortho-rectifying and classifying imagery.]
3. DERIVATION OF THE 1:250 000 NTS MAPS
[The opening of this section was also lost in digitization. The surviving fragments indicate that the objectives of the pilot project were to derive a 1:250 000 data set from the existing 1:50 000 digital data set and to revise it using current imagery, and they describe how the pilot map sheet was chosen.] The source materials available for the project were the 1:50 000 digital data, reference points, digital Landsat TM imagery, aerial photography and the NTS maps at both 1:50 000 and 1:250 000 scales.
3.1 Derivation through the Generalization of
the 1:50 000 Digital Data
For this project the vector data files were loaded in SIF format and converted into the CARIS data files required for processing and display. Based on a review of the procedures currently used for the manual derivation of a 1:250 000 NTS map from 1:50 000 maps, a layer-at-a-time methodology was adopted to derive the pilot area digitally. The
following sections describe the steps of the
derivation of each layer.
3.1.1 Review of the 1:250 000 Specifications
To determine the required content and minimum
size for the derived data set, the 1:250 000
Polychrome Mapping Specifications were
reviewed. Content refers to the features that are
shown on the 1:250 000 map, and minimum size
refers to the smallest area or line to be shown.
To complement the specifications, which are currently being revised, the project team relied heavily on the directives and the minimum size guide used by the Map Revision Section.
3.1.2 Data Generalization
The generalization process used for the
derivation of the 1:250 000 data set from
1:50 000 data consisted of the following eight
techniques (Mackaness, 1991):
a) Feature correlation/selection:
Having identified the required content, the two
data sets were first correlated in order to match
corresponding features. A direct correlation of
feature codes was not possible because the
feature codes for the 1:250 000 data are not all
the same as those used for the 1:50 000 data.
Next, the content required for the 1:250 000
data set was extracted from the 1:50 000 data set. Finally, the feature codes of the selected data were changed where necessary. Care must be taken during this stage not to delete any features that are required for data continuity, such as dugouts located on a river. The correlated data was then separated into the following layers: hydrography, transportation, vegetation and wetland, culture, built-up area, contours, and gravel pits, and layer numbers were assigned.
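As a minimal illustration of this correlation and recoding step, the following Python sketch selects the required content and assigns layers. The feature codes, the code-mapping table and the layer assignments are hypothetical stand-ins, not the actual NTS feature catalogue or any CARIS command.

    # Hypothetical 1:50 000 -> 1:250 000 feature code mapping (illustrative).
    CODE_MAP_50K_TO_250K = {
        "RIV1": "RIV",               # double-line river
        "RD_PAVED_2LN": "RD_PAVED",  # paved road
        "DUGOUT": "DUGOUT",          # kept for continuity on rivers
    }
    # Hypothetical layer assignment for the recoded features.
    LAYER_OF = {"RIV": "hydrography", "RD_PAVED": "transportation",
                "DUGOUT": "hydrography"}

    def select_and_recode(features_50k):
        """Keep only the features required at 1:250 000, change their
        feature codes where necessary, and separate them into layers."""
        layers = {}
        for feat in features_50k:  # each feature is a dict with a "code" key
            new_code = CODE_MAP_50K_TO_250K.get(feat["code"])
            if new_code is None:   # feature is not 1:250 000 content
                continue
            recoded = dict(feat, code=new_code)
            layers.setdefault(LAYER_OF[new_code], []).append(recoded)
        return layers

    # e.g. select_and_recode([{"code": "RIV1", "points": [(0, 0), (5, 3)]}])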
b) Data filtering:
The Douglas-Peucker filtering algorithm was applied to the transportation layer with a 50 m tolerance and to the hydrography layer with a 25 m tolerance. These tolerances were tested and chosen to ensure that the data retained its shape while the number of data points was reduced, which in turn accelerated data handling and processing.
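The project used CARIS's implementation of this filter; the self-contained Python sketch below shows the Douglas-Peucker algorithm in its common recursive form, with the 50 m and 25 m values above supplied as the tolerance argument (coordinates assumed to be in metres).

    import math

    def _point_line_dist(p, a, b):
        """Perpendicular distance from point p to the line through a and b."""
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        length = math.hypot(dx, dy)
        if length == 0.0:
            return math.hypot(px - ax, py - ay)
        return abs(dx * (ay - py) - dy * (ax - px)) / length

    def douglas_peucker(points, tolerance):
        """Remove points that deviate less than tolerance from the polyline."""
        if len(points) < 3:
            return list(points)
        # Find the interior point farthest from the chord joining the endpoints.
        dists = [_point_line_dist(p, points[0], points[-1]) for p in points[1:-1]]
        split = max(range(len(dists)), key=dists.__getitem__)
        if dists[split] <= tolerance:
            return [points[0], points[-1]]  # whole span is within tolerance
        split += 1                          # convert to an index into points
        left = douglas_peucker(points[:split + 1], tolerance)
        right = douglas_peucker(points[split:], tolerance)
        return left[:-1] + right            # drop the duplicated split point

    # e.g. douglas_peucker(road_points, 50.0) or douglas_peucker(river_points, 25.0)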
c) Omission of features not meeting minimum
size requirement:
Using the minimum size guide, features that
were too small, too narrow, or too close
together were identified and interactively
removed from the data set. Exceptions were features that could be grouped together or that were required for data continuity, such as transformer stations on power lines.
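The identification step can be sketched as below, assuming simple dictionary-based features; the threshold value and feature codes are invented placeholders for the Map Revision Section's actual minimum size guide, and the flagged features would still be reviewed and removed interactively as described above.

    # Hypothetical minimum-area thresholds in square metres (illustrative).
    MIN_AREA = {"POND": 250_000.0}
    # Features always kept because they are needed for data continuity.
    KEEP_FOR_CONTINUITY = {"TRANSFORMER_STATION"}

    def shoelace_area(ring):
        """Absolute area of a closed polygon given as (x, y) vertex tuples."""
        total = 0.0
        n = len(ring)
        for i in range(n):
            x1, y1 = ring[i]
            x2, y2 = ring[(i + 1) % n]
            total += x1 * y2 - x2 * y1
        return abs(total) / 2.0

    def flag_for_omission(features):
        """Return the area features that fall below the minimum size,
        excluding those required for continuity."""
        too_small = []
        for feat in features:
            if feat["code"] in KEEP_FOR_CONTINUITY:
                continue
            threshold = MIN_AREA.get(feat["code"])
            if threshold is not None and shoelace_area(feat["ring"]) < threshold:
                too_small.append(feat)
        return too_small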
d) Grouping of features:
Polygons not meeting the minimum area requirement were interactively grouped together, or buffers were generated around them.
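One way to realize the buffer-based variant of this step outside CARIS is sketched below using the shapely library, which buffers each small polygon outward and dissolves the overlaps into combined outlines; the 100 m buffer distance is an arbitrary illustration, not a value from the specifications.

    from shapely.geometry import Polygon
    from shapely.ops import unary_union

    def group_small_polygons(rings, buffer_dist=100.0):
        """Merge nearby small polygons by buffering each ring outward and
        dissolving the overlapping buffers into group outlines."""
        buffered = [Polygon(ring).buffer(buffer_dist) for ring in rings]
        merged = unary_union(buffered)
        # unary_union yields a Polygon or a MultiPolygon; normalize to a list.
        return list(merged.geoms) if hasattr(merged, "geoms") else [merged]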
e) Collapse of features:
The collapsing technique was first used when a feature did not meet the minimum size requirement to be shown as one type (e.g., an area) but could be represented as another type (e.g., a point). For example, for a double-line river that did not meet the minimum size, a new line was digitized to represent the river centre line, and a campground area that was not large enough to satisfy the minimum requirement was deleted and replaced with a point feature. A second situation where collapsing was required was where a feature, such as an interchange, was represented as one type at the 1:50 000 scale but as another type at the 1:250 000 scale.
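The river centre line in the project was digitized manually; the area-to-point case, however, can be sketched as below, replacing an undersized polygon such as the campground with a point at its centroid. The dictionary-based feature representation is a hypothetical stand-in, not a CARIS structure.

    def collapse_area_to_point(feat):
        """Replace an undersized area feature with a point feature located
        at the polygon centroid."""
        ring = feat["ring"]
        n = len(ring)
        twice_area = cx = cy = 0.0
        for i in range(n):
            x1, y1 = ring[i]
            x2, y2 = ring[(i + 1) % n]
            cross = x1 * y2 - x2 * y1
            twice_area += cross
            cx += (x1 + x2) * cross
            cy += (y1 + y2) * cross
        if twice_area == 0.0:   # degenerate ring: fall back to the vertex mean
            centre = (sum(p[0] for p in ring) / n, sum(p[1] for p in ring) / n)
        else:                   # standard polygon centroid formula
            centre = (cx / (3.0 * twice_area), cy / (3.0 * twice_area))
        return {"code": feat["code"], "type": "point", "point": centre}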
f) Combination (reclassification) of features:
Reclassifying features for generalization means
changing their feature code in order to connect
the feature with an adjacent feature. Segments of lines not meeting minimum length requirements, such as ditches at the end of a stream, were reclassified as streams.
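This reclassification can be sketched under simple assumptions: features are dictionaries, the codes "DITCH" and "STREAM" are illustrative, and connectivity is tested by exact endpoint equality, where a production system would snap endpoints within a tolerance.

    import math

    def line_length(points):
        """Total length of a polyline given as (x, y) tuples."""
        return sum(math.hypot(x2 - x1, y2 - y1)
                   for (x1, y1), (x2, y2) in zip(points, points[1:]))

    def reclassify_short_ditches(segments, min_length, stream_endpoints):
        """Recode ditch segments shorter than min_length as streams when
        they touch a stream endpoint, keeping the network connected."""
        for seg in segments:
            if (seg["code"] == "DITCH"
                    and line_length(seg["points"]) < min_length
                    and (seg["points"][0] in stream_endpoints
                         or seg["points"][-1] in stream_endpoints)):
                seg["code"] = "STREAM"
        return segments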
g) Simplification (line smoothing filter) of data:
Line smoothing was applied to jagged features to make them cartographically presentable. For example, the streams from the 1:50 000 data had to be smoothed for presentation at the 1:250 000 scale.
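The paper does not name the smoothing filter that was applied; as one common cartographic choice, the sketch below uses Chaikin's corner-cutting algorithm, which rounds jagged vertices while keeping the endpoints fixed. The iteration count is an illustrative parameter.

    def chaikin_smooth(points, iterations=2):
        """Smooth a polyline by repeatedly cutting its corners
        (Chaikin's algorithm); the endpoints are preserved."""
        pts = list(points)
        for _ in range(iterations):
            smoothed = [pts[0]]
            for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
                # Replace each segment with points 1/4 and 3/4 along it.
                smoothed.append((0.75 * x1 + 0.25 * x2, 0.75 * y1 + 0.25 * y2))
                smoothed.append((0.25 * x1 + 0.75 * x2, 0.25 * y1 + 0.75 * y2))
            smoothed.append(pts[-1])
            pts = smoothed
        return pts

    # e.g. chaikin_smooth(stream_points, iterations=2)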