Validation methods can be used to confirm local data
integrity (i.e. within the current segment). Merge methods
are particularly important as they confirm global integrity
before the final commit at the end of a long update
transaction. Note that caching is used to roll back any
transaction that fails validation checks prior to the final
commit.
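By way of illustration only, a minimal sketch of this validate/merge/rollback cycle is given below; the class, method and argument names are hypothetical and do not correspond to any particular system described in the paper.

    import copy

    class UpdateTransaction:
        """Hypothetical long update transaction over one extracted segment,
        held here as a dict of object_id -> object."""

        def __init__(self, segment, check_local, check_global):
            self.segment = segment
            self.cache = copy.deepcopy(segment)   # cached pre-update state for rollback
            self.changes = []                     # explicit record of every change made
            self.check_local = check_local        # local (within-segment) integrity test
            self.check_global = check_global      # global integrity test applied at merge

        def apply(self, op, object_id, new_value=None):
            self.changes.append((op, object_id, new_value))
            if op == "delete":
                self.segment.pop(object_id, None)
            else:                                 # "insert" or "modify"
                self.segment[object_id] = new_value

        def commit(self, database):
            # Validation confirms local integrity; the merge step confirms global
            # integrity against the rest of the database before the final commit.
            if self.check_local(self.segment) and self.check_global(database, self.changes):
                database.update(self.segment)
                return True
            self.segment = copy.deepcopy(self.cache)   # roll back to the cached state
            return False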
Since the version holds an explicit record of all changes
made in the transaction, the merge method can readily
summarise the changes for the purposes of management
information and for transmission (in digital form) to other
update processes. The merge methods can also update
any relevant metadata records within the database, to
record the status of any changes made, and create any
history objects that may be required to enable recovery of
previous states of the data. In short, merge methods, as
well as ensuring overall data integrity, play a crucial role
in the overall information flows shown in Figure 1.
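Again purely as a sketch, the change log kept by such a transaction can be condensed into management information and a digital increment for transmission to other update processes; the function below assumes the (op, object_id, value) change records used in the previous example.

    from collections import Counter

    def summarise_changes(changes):
        """Condense a transaction's change log into a management summary and a
        digital increment suitable for transmission to other update processes."""
        counts = Counter(op for op, _, _ in changes)
        increment = [{"op": op, "id": object_id, "value": value}
                     for op, object_id, value in changes]
        return {"summary": dict(counts), "increment": increment}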
3. TRENDS IN UPDATE TECHNIQUES
3.1 Systems
With the advance of digital techniques, a general
paradigm for the update process is becoming
established, based on the display of existing information
against a backdrop of new source information, or
selections of new source information. This is achievable
using standard workstations or personal computers (soon
even the most demanding image interpretation and
stereo information extraction tasks will be accomplished
without recourse to special hardware). These hardware
advances are paralleled by advances in software
engineering such as those supporting co-operating
processes, and the strong move to Open IT
environments. Of recent note is the strong take-up of the
GEOTIFF standard, to provide workable registration of
imagery data.
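As a concrete (and necessarily modern) illustration of why GeoTIFF registration matters, the fragment below reads the georeferencing of an image so that existing vector data can be displayed against it as a backdrop; it assumes the rasterio library and a hypothetical file name, neither of which is part of the original discussion.

    import rasterio                                       # assumed library, not named in the paper

    with rasterio.open("new_source_ortho.tif") as src:    # hypothetical file
        transform = src.transform                         # affine map: (col, row) -> map coordinates
        crs = src.crs                                     # coordinate reference system of the image
        x, y = transform * (200, 100)                     # map position of pixel column 200, row 100
        print(crs, src.bounds, (x, y))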
3.2 Data Models
The update process in this paradigm involves display of
and interaction with all the relevant current data, i.e. not
just geometry and attributes, but topology, metadata and
higher level relationships. The more success there is in
increasing the information content (value) of the data
model, the more the update process has to interact with
that data model to maintain its integrity. In practice, the
evolution of the data will involve not just update
(reflecting real world change), but also accuracy
refinement (and hence metadata update). In many cases
it will also take place against a background of "data re-
engineering", as progressively more complexity is added
to the data model to reflect the demands to add value.
The life-span of the data already exceeds that of the
hardware and software systems on which attention
usually focuses, and this trend will accelerate.
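The point that an update touches more than geometry can be made concrete with a small, purely illustrative feature record in which geometry, attributes, topology references and per-feature metadata are held, and revised, together; the field names are assumptions, not part of any cited data model.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Feature:
        """Illustrative feature record: an update touches more than the geometry."""
        object_id: str
        geometry: list                    # e.g. a coordinate list
        attributes: dict
        topology: dict                    # references to adjacent/connected objects
        accuracy_m: float                 # positional accuracy estimate (metadata)
        source: str                       # lineage / source document reference
        revised: date = field(default_factory=date.today)

    def refine_position(feature, new_geometry, new_accuracy_m, source):
        # Accuracy refinement: the geometry and its metadata must change together.
        feature.geometry = new_geometry
        feature.accuracy_m = new_accuracy_m
        feature.source = source
        feature.revised = date.today()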
The evolution of the data model must be managed in
carefully controlled stages, hopefully few in number ("one
giant step?") A key element within this is the
management of the evolution of the update process, and
the provision of sufficient metadata records for tracking
purposes. The appropriate amount of such metadata is
still a subject of concern and debate. The balance has to
be struck between fitness for purpose, and the possibility
of overwhelming the data with metadata. The same
issues arise in considering how much history data to
retain. As pointed out at the end of section 2.3 above,
the technology now exists to retain records of all changes
in such a manner that previous states of objects can be
restored. It may of course be uneconomic or unnecessary
to do so!
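A sketch of such recovery, assuming each object carries a time-ordered list of (timestamp, state) records, is given below; the representation is an assumption for illustration only.

    from bisect import bisect_right

    def state_as_of(history, when):
        """Return the most recent recorded state of an object at or before `when`.
        `history` is a list of (timestamp, state) pairs in ascending time order."""
        times = [t for t, _ in history]
        i = bisect_right(times, when)
        return history[i - 1][1] if i else None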
3.3 Orthoimages
Against this background, the full significance of digital
ortho-images can be seen - as digital source documents
to be assessed and referenced along with other digital
source documents by processes that interact with and
update the database. Their value lies in the ease with
which they can be handled by such processes - they
demand no special status and fall in naturally with the
evolution of the data model and the database integrity
maintenance mechanisms. To similarly exploit the
benefits of richer sources of imagery (e.g. stereo), the
interpretation process has to be closely coupled with the
database. This is in principle possible and is sometimes
achieved, but requires an open architecture on both sides
(or a closed proprietary system covering both aspects!).
There may of course be perfectly valid operational
reasons for retaining the update process detached from
the database (e.g. it may be government policy to
contract out some of the activity), in which case the
issues arising have to be addressed. These in fact
broadly mirror the issues that have to be solved in
managing the release of updates to users.
4. TRANSFER STANDARDS AND OPEN
ARCHITECTURES
4.1 Data Exchange and Incremental Update
A number of developments are occurring to respond to
the issues raised above. In some user communities, it is
possible to propound a single data model of sufficiently
wide utility. Such data models are usually object-based.
Transfer standards can be defined, both for data
exchange and for incremental update. Incremental
update hinges on a working scheme for unique object
identification ("object-ids"). In such a situation, it is quite
feasible to extract a portion of the database for detached
update, although integrated update is likely to be more
efficient. The key technical challenge is to devise
workable schemes for unique object-ids, particularly
where there are multiple issuers/owners of data.
Solutions have been proposed, but are not yet visible at
the working level.
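One possible, and purely illustrative, scheme is to qualify each identifier with its issuing agency, so that identifiers from different issuers/owners cannot collide, and to key each incremental update record by that identifier; the issuer code and record layout below are assumptions, not a proposed standard.

    import uuid

    def new_object_id(issuer):
        # Qualify the identifier with the issuing agency; illustrative only.
        return f"{issuer}:{uuid.uuid4()}"

    oid = new_object_id("NMA-UK")          # hypothetical issuer code
    # An incremental update is then a list of operations keyed by object-id,
    # which a receiving system can apply without re-loading the whole dataset.
    increment = [
        {"op": "insert", "id": oid, "attributes": {"class": "road"},
         "geometry": [(0.0, 0.0), (1.0, 2.0)]},
        {"op": "modify", "id": oid, "attributes": {"class": "track"}},
    ]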
However, the restrictions arising from such a centralised
approach to the data model are onerous for a wider user
community. Recent work on data transfer standards (as
exemplified by the draft European (CEN TC287) transfer