This paper discusses some trends in spatial database 
technology and implementation and the way in which 
these interact with the trends in update technology. To a 
large extent this interaction promises to be a beneficial 
symbiosis, but some problems still to be solved are 
identified and probed. 
Some key developments in spatial database technology 
are described, including object-orientation and 
versioning. The impact of these on the utility of large 
spatial databases is considered along with further 
developments in client/server architecture, distributed 
processing and Internet access. The critical role of 
database integrity emerges, and this is examined 
particularly in the context of revision and update. The 
parallel tasks of information refinement and enhancement 
are also examined. 
1.2 Data Transfer Standards 
The relevance of emerging trends in data transfer 
standards is briefly addressed, including the transfer of 
update information. Short and longer term approaches to 
multiproduct databases, supporting both a range of 
cartographic products and a range of digital data 
products, are discussed with particular reference to 
update processes. Finally the view is advanced that by 
the Millennium the convergence effect that is already 
apparent in update technology, both from field sources 
and from imagery, will have resulted in such a degree of 
standardisation that there will be very little in the way of 
discriminating factors between rival offerings. The 
discriminating factor will be the database, and the ease 
and efficiency with which any particular update system 
can integrate with it. 
2. TRENDS IN SPATIAL DATABASE TECHNOLOGY 
2.1 Value in Information 
Investors in spatial databases, whether they be 
governmental agencies or commercial companies, are 
being driven to increase the value of their information. In 
addition to refining its accuracy and currency they seek 
to add to its utility and range of application. Ideally they 
seek to do this without an ever-increasing proliferation 
of independent databases, all demanding 
costly maintenance. In practice, this aim is not 
immediately achievable. A viable approach is to reduce 
the number of independent databases, and to automate, 
or at least orchestrate, the posting of updates across 
them. Software tools which identify, validate and codify 
changes, together with flowline management tools and 
data models which provide a means of administering 
changes, supply the essential framework. A conceptual 
model is shown in Figure 1. 
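As a concrete illustration of this framework, the following minimal sketch in Python shows how changes might be identified, codified and posted across dependent databases. The names (ChangeRecord, identify_changes, post_updates) are hypothetical and do not refer to any particular product.

from dataclasses import dataclass, field

@dataclass
class ChangeRecord:
    object_id: str
    operation: str                      # "insert", "modify" or "delete"
    attributes: dict = field(default_factory=dict)

def identify_changes(old, new):
    """Codify the differences between two snapshots of a dataset."""
    changes = []
    for oid, attrs in new.items():
        if oid not in old:
            changes.append(ChangeRecord(oid, "insert", attrs))
        elif attrs != old[oid]:
            changes.append(ChangeRecord(oid, "modify", attrs))
    for oid in old:
        if oid not in new:
            changes.append(ChangeRecord(oid, "delete"))
    return changes

def post_updates(changes, databases):
    """Orchestrate the posting of codified changes across dependent databases."""
    for db in databases:
        for change in changes:
            if change.operation == "delete":
                db.pop(change.object_id, None)
            else:
                db[change.object_id] = change.attributes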
2.2 Object-Orientation 
Object technology provides a very productive approach 
to increasing the value of spatial information. Modelling 
of the subject is in terms of objects that closely fit the 
real world, rather than just in terms of geometries and 
relational tables. Object-orientation (O-O) goes a step 
further, and allows the behaviours of objects to be 
modelled. O-O databases and O-O programming 
languages are well-established in mainstream Information 
Technology (Informatics) and are being effectively 
deployed to handle large volumes of spatial information 
in a very flexible manner. In particular, O-O spatial 
databases provide very efficient support for topology and 
spatial generalisation, and have very positive scalability 
characteristics. They can provide large area support from 
a common base for both a range of map scales and a 
diversity of data products (e.g. link-node transportation 
networks and polygonised boundary data). An overview 
of Object-Orientation in the context of Geographic 
Information is to be found in (Woodsford, 1995) and a 
more popular treatment from a general management 
perspective in (Taylor, 1990). 
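To make the modelling idea concrete, the following minimal sketch in Python (with a hypothetical Road class) shows a real-world feature represented as an object carrying its geometry, attributes and behaviours together, rather than as rows in relational tables.

from dataclasses import dataclass

@dataclass
class Road:
    road_id: str
    centreline: list          # geometry as a list of (x, y) coordinate pairs
    surface: str
    speed_limit: int

    def length(self):
        """A behaviour attached to the object: planimetric length of the centreline."""
        pts = self.centreline
        return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                   for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

    def resurface(self, new_surface):
        """A behaviour that encapsulates a legitimate real-world change."""
        self.surface = new_surface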
2.3 Versioning 
Database Versioning is a second crucial advance. It 
provides support for very large continuous spatial 
databases in an economical and manageable fashion. 
Versions are maintained in a tree structure, with only 
change information recorded, rather than complete 
copies. Multiuser update is supported, with each update 
process having its own logical copy of the complete 
dataset, without the overheads of providing a physical 
copy to each process. The concepts are illustrated in 
Figure 2, which also shows how versioning can be used 
in conjunction with long transaction support. An Update 
process (or user) has exclusive write access to a defined 
segment, which is simply a logical set of objects within 
the database. The set may be defined by a spatial extent, 
a set of object classes or any other logical rule. 
Segments have to be logically exclusive, although any 
process can have read access to the whole of the data. 
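The following minimal sketch in Python (the Version class and its state method are hypothetical) illustrates the principle: each version records only a delta relative to its parent, and the logical copy seen by an update process is reconstructed by replaying deltas from the root, so no physical copy of the full dataset is required.

class Version:
    """A node in the version tree; stores only changes relative to its parent."""
    def __init__(self, parent=None, delta=None):
        self.parent = parent                 # None for the master (root) version
        self.delta = delta or {}             # object_id -> new state, or None for a deletion
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def state(self):
        """Reconstruct the logical copy of the dataset seen from this version."""
        base = self.parent.state() if self.parent else {}
        for oid, obj in self.delta.items():
            if obj is None:
                base.pop(oid, None)
            else:
                base[oid] = obj
        return base

# Two update processes branch from the master without physical copies being made.
master = Version(delta={"b1": {"class": "building"}, "r7": {"class": "road"}})
edit_a = Version(parent=master, delta={"b1": {"class": "building", "storeys": 3}})
edit_b = Version(parent=master, delta={"r7": None})      # this branch deletes the road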
Updates can be validated for internal correctness as they 
are generated, and validated for consistency with the 
database as a whole prior to merging the updated 
version into the master data. Validation is carried out 
through methods, the general means in an O-O system of 
invoking behaviours. 
General and specific validation methods can be built into 
the object database schema (i.e. not at the application 
program level) and used to ensure integrity of the data 
across update processes. 
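A minimal sketch of this arrangement, assuming a hypothetical Parcel class and merge function, is given below: validation methods belong to the object classes in the schema and are applied both for internal correctness as edits are made and for consistency with the database as a whole before a version is merged.

class Parcel:
    def __init__(self, parcel_id, boundary):
        self.parcel_id = parcel_id
        self.boundary = boundary             # list of (x, y) vertices

    def validate(self):
        """Internal correctness: the boundary must form a closed ring."""
        return len(self.boundary) >= 4 and self.boundary[0] == self.boundary[-1]

    def validate_against(self, master):
        """Consistency with the database as a whole, e.g. no duplicate identifiers."""
        existing = master.get(self.parcel_id)
        return existing is None or existing is self

def merge(version_objects, master):
    """Merge an updated version only if every object passes both levels of validation."""
    if all(o.validate() and o.validate_against(master) for o in version_objects):
        for o in version_objects:
            master[o.parcel_id] = o
        return True
    return False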
A powerful concept is that of the Object Lifecycle, 
illustrated in Figure 3. This provides a framework for 
managing all phases of the life of an object, including 
updates over time, by defining methods to be invoked at 
each relevant stage in transactions involving objects, and 
ensuring they are invoked. The mechanism is analogous 
to that of triggers in relational databases. 
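The following minimal sketch in Python (the Lifecycle class and the stage names are hypothetical) shows the principle: methods are registered against lifecycle stages and are invoked at the relevant point in every transaction, in much the same way that triggers fire in a relational database.

class Lifecycle:
    """Registers methods against lifecycle stages and guarantees they are invoked."""
    def __init__(self):
        self.hooks = {"create": [], "modify": [], "retire": []}

    def on(self, stage, method):
        self.hooks[stage].append(method)

    def fire(self, stage, obj):
        for method in self.hooks[stage]:
            method(obj)

lifecycle = Lifecycle()
lifecycle.on("create", lambda o: setattr(o, "status", "proposed"))
lifecycle.on("modify", lambda o: setattr(o, "last_edit", "update-process-7"))
lifecycle.on("retire", lambda o: setattr(o, "status", "historical"))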