Instead, when a service provider fails to keep its database
updated with the latest data, it is considered to have incomplete
data. Data type refers to the format of the desired data. Even
though the field of geospatial data interoperability has made
considerable progress, various reasons still lead clients to
request a specific data format.
Nowadays the term “level” is used to define the resolution of
vector data. VMap (Vector Map) is a good example of this.
VMap Level 0 corresponds approximately to 1:1,000,000 scale
and Level 1 to 1:250,000. Globally produced vector data in
particular is labelled with levels. A good example of such
global production is MGCP. In that program, Level 2 data
corresponding to 1:100,000 or 1:50,000 scale is produced by
different nations. The project represents the most recent
evolution of a 10-year, global VMap Level 1 effort that began
in 1993 and was revamped in 2003. In such projects quality
becomes even more important, mainly to prevent the
production of non-harmonized data among the participating
countries.
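The correspondence between product levels and nominal scales can be expressed as a simple lookup. The following minimal Python sketch is illustrative only; the names and the helper function are not part of any VMap or MGCP specification.

```python
# Nominal scale denominators for the vector product levels discussed above.
# Mapping and helper are for illustration; not from any VMap/MGCP document.
LEVEL_SCALES = {
    "VMap Level 0": [1_000_000],
    "VMap Level 1": [250_000],
    "MGCP Level 2": [100_000, 50_000],
}

def nominal_scales(product: str) -> list[str]:
    """Return the nominal scales (e.g. '1:250,000') for a product level."""
    return [f"1:{d:,}" for d in LEVEL_SCALES[product]]

if __name__ == "__main__":
    for product in LEVEL_SCALES:
        print(product, "->", ", ".join(nominal_scales(product)))
```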
Vector data needs considerable work and maintenance to ensure that
it is accurate and reliable. Inaccurate vector data can occur
when the instruments used to capture the data are not set up
properly, when the people capturing the data are not careful,
when time or money do not allow for enough detail in the
collection process, and so on. If you have poor-quality vector
data, you can often detect some of these quality defects when
viewing the data in a GIS (www1, 2011).
The usefulness of the quality measures depends on the
application. It is not always clear how many quality
parameters should be introduced to describe the quality of
data. The number of quality parameters can be very large
because quality varies spatially and temporally. Defining the
quality measures is already a very current topic in the
standardization process (Ragia, 2000).
The level of quality should be chosen carefully. Quality and
efficiency or productivity are conflicting aspects. If the
required quality of the product is set very high, the quality
control period increases and productivity decreases. The
relation between these quantities is shown in Figure 1. Quality,
the time spent on quality control, and productivity should
therefore be balanced according to the needs.
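Purely as an illustration of this trade-off, a toy calculation can show how a higher quality target lengthens the QC period and lowers throughput. The linear relationship between quality target and QC time assumed below is invented for the example and is not a model used in the project.

```python
# Toy model of the Figure 1 trade-off. The linear link between the quality
# target and QC time is an assumption made only for illustration.
def productivity(features_per_day: float, quality_target: float,
                 qc_hours_per_quality: float = 2.0) -> float:
    """Features delivered per working hour when QC time grows with quality.

    quality_target is a unitless level in [0, 1]; qc_hours_per_quality is an
    assumed coefficient converting the target into daily QC effort.
    """
    capture_hours = 8.0                                   # fixed capture shift
    qc_hours = qc_hours_per_quality * quality_target * 8  # QC grows with target
    return features_per_day / (capture_hours + qc_hours)

for q in (0.2, 0.5, 0.9):
    print(f"quality target {q:.1f}: {productivity(400, q):.1f} features/hour")
```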
Figure 1. Quality vs. QC process and productivity relation
2.3 Quality Assurance of Vector Data
For rating the quality of geodata, a certain set of measures is
needed which gives us expressive, comprehensive and useful
criteria. A coarse subdivision of quality measures into two
categories can be made, which is important for practical
applications:
1. Quality measures that concern consistency with the data
model,
2. Quality measures that concern consistency of data and
reality within the scope of the model.
A complete check of the first category can be performed
automatically within a database or GIS without any additional
data. This inspection can be done exhaustively, i.e. the whole
area covered by the data can be checked. The comparison of
data and reality, on the other hand, is much more expensive;
performing it for the whole area requires considerably more
effort (Busch, 2002).
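A check of the first category can be run entirely inside the database or GIS. The sketch below is written against a hypothetical feature table and attribute dictionary; the field names and allowed values are assumptions for illustration and are not taken from the MGCP schema. It flags features whose coded attributes fall outside the domains defined by the data model.

```python
# Minimal data-model consistency check: every coded attribute value must lie
# in the domain defined by a (hypothetical) attribute dictionary.
# The feature records and domains below are invented for illustration.
ATTRIBUTE_DOMAINS = {
    "road": {"surface": {"paved", "unpaved", "unknown"},
             "lanes": set(range(1, 9))},
    "river": {"perennial": {"yes", "no", "dry"}},
}

def model_consistency_errors(features):
    """Yield (feature_id, message) for attributes that violate the model."""
    for feat in features:
        domains = ATTRIBUTE_DOMAINS.get(feat["class"])
        if domains is None:
            yield feat["id"], f"unknown feature class {feat['class']!r}"
            continue
        for attr, allowed in domains.items():
            value = feat.get(attr)
            if value not in allowed:
                yield feat["id"], f"{attr}={value!r} not in allowed domain"

features = [
    {"id": 1, "class": "road", "surface": "paved", "lanes": 2},
    {"id": 2, "class": "road", "surface": "gravel", "lanes": 2},   # domain error
    {"id": 3, "class": "river", "perennial": "dry"},
]
for fid, msg in model_consistency_errors(features):
    print(f"feature {fid}: {msg}")
```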
In this paper, the quality control procedures used for MGCP
production at the General Command of Mapping to achieve the
required quality measures, and the experience gained from these
procedures, are discussed. Our quality control procedures
consist of parameters specifying the following quality aspects:
topology, geometry, completeness of features and attributes,
and logical consistency. To satisfy these aspects, automatic,
semi-automatic and manual quality checks are used.
Data quality is assured at the feature and feature-class levels.
The error inspection procedure comprises consistency,
completeness and correctness control categories. QA software
is the main QC tool for consistency and topology checking.
This software checks topology errors, some geometry errors
such as connections and overlaps, and attribution conformity
and compatibility errors. Some of the detected errors are later
corrected automatically or manually without further inspection.
Others have to be verified against the data and other sources,
since they can be false positives, i.e. they appear to be errors
but in fact are not. The completeness of geometry and attributes
is ensured by a four-level control approach.
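A simplified version of such a geometry check can be written with Shapely. The sketch below is only in the spirit of these tests: the overlap threshold and the confirmed-false-positive set are assumptions for illustration, not the behaviour of the actual QA software. It reports overlapping area features and keeps operator-confirmed false positives out of the correction queue.

```python
# Simplified overlap check in the spirit of the topology/geometry tests
# described above; thresholds and the confirmed-false-positive set are
# illustrative assumptions, not the rules of the production QA software.
from itertools import combinations
from shapely.geometry import Polygon

def overlap_conditions(areas, min_overlap=1e-9, confirmed_false_positives=frozenset()):
    """Return (id_a, id_b, overlap_area) for area features that overlap.

    Pairs listed in confirmed_false_positives were reviewed by an operator
    and are skipped, mirroring the manual false-positive screening step.
    """
    conditions = []
    for (id_a, geom_a), (id_b, geom_b) in combinations(areas.items(), 2):
        if (id_a, id_b) in confirmed_false_positives:
            continue
        if geom_a.intersects(geom_b):
            overlap = geom_a.intersection(geom_b).area
            if overlap > min_overlap:
                conditions.append((id_a, id_b, overlap))
    return conditions

areas = {
    "lake_1": Polygon([(0, 0), (4, 0), (4, 4), (0, 4)]),
    "marsh_7": Polygon([(3, 3), (6, 3), (6, 6), (3, 6)]),   # overlaps lake_1
    "field_2": Polygon([(10, 10), (12, 10), (12, 12), (10, 12)]),
}
for id_a, id_b, area in overlap_conditions(areas):
    print(f"overlap between {id_a} and {id_b}: {area:.2f} map units")
```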
At the first level, the vector data is checked against the source
imagery and other ancillary sources. At this stage the captured
data is controlled according to the technical prerequisites.
Control staff check all captured data and verify: 1) whether the
features on the topography are captured as the correct features
of the feature and attribute dictionary, 2) whether the correct
attributes are assigned to the captured features, 3) whether each
feature is captured with the correct geometry for its size (point,
line or area), 4) whether the features are captured with the
required density, and 5) whether the features are captured with
the required geometric location accuracy. Figure 2 shows a dry
river found in this control that was not captured but should have
been according to the defined standards. This control takes
approximately 5 to 10% of the production time of data capture,
and approximately 965 additional conditions in the captured data
are detected.
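Parts of this first-level checklist can be supported by a simple review record per feature. The sketch below is a minimal illustration: the check names and the data structure are assumptions, since the actual control is a visual inspection of the captured data against imagery and ancillary sources.

```python
# Illustrative record of the five first-level checks described above.
# Field names are assumptions; the real control is a visual inspection.
from dataclasses import dataclass, field

FIRST_LEVEL_CHECKS = (
    "correct_feature_code",      # 1) matches the feature/attribute dictionary
    "correct_attributes",        # 2) attributes assigned correctly
    "correct_geometry_type",     # 3) point/line/area chosen by feature size
    "sufficient_density",        # 4) captured with the required density
    "location_accuracy",         # 5) meets the geometric accuracy requirement
)

@dataclass
class FeatureReview:
    feature_id: str
    results: dict = field(default_factory=dict)   # check name -> bool

    def failed_checks(self):
        return [name for name in FIRST_LEVEL_CHECKS
                if not self.results.get(name, False)]

review = FeatureReview("river_042", {
    "correct_feature_code": True,
    "correct_attributes": True,
    "correct_geometry_type": True,
    "sufficient_density": False,     # e.g. a dry river section not captured
    "location_accuracy": True,
})
print("failed first-level checks:", review.failed_checks())
```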