Statistically speaking, H(X) tells how much uncertainty the variable X has on average. When the value of X is certain, i.e. one message has probability $P = 1$, then H(X) = 0. H(X) reaches its maximum when all messages have equal probability.
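These two properties can be checked numerically. The following Python sketch (an illustration added here, not part of the original text) computes H(X) with base-2 logarithms, the base that reproduces the numerical results quoted later in this paper:

```python
import math

def entropy(probabilities):
    """Shannon entropy H(X) = -sum(P_i * log2(P_i)), skipping zero terms."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A certain outcome (a single message with P = 1) carries no uncertainty:
print(entropy([1.0]))                     # 0.0

# Equal probabilities maximise H; for M = 4 messages, H = log2(4) = 2 bits:
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```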
In communication theory, three types of information are identified, i.e. syntactic, semantic and pragmatic information. Indeed, most researchers in spatial information science try to follow these three types of information for a map.
2.2 Statistical information of a map: entropy of symbol types
Sukhov (1967, 1970) adopted the entropy concept for cartographic communication. In that work, only the number of symbols of each type represented on a map is taken into account. Let $N$ be the total number of symbols on a map, $M$ the number of symbol types and $K_i$ the number of symbols of type $i$, so that $N = K_1 + K_2 + \ldots + K_M$. The probability of each symbol type on the map is then as follows:

$$P_i = \frac{K_i}{N} \qquad (2)$$

where $P_i$ is the probability of the $i$th symbol type, $i = 1, 2, \ldots, M$.
The entropy of the map can then be calculated as follows:

$$H(X) = H(P_1, P_2, \ldots, P_M) = -\sum_{i=1}^{M} P_i \log_2 P_i \qquad (3)$$
The shortcomings of this measure of map information are revealed by Figure 1, which is modified from (Knopfli 1983). Both maps consist of three types of symbols, i.e. roads, buildings and trees, and have exactly the same number of symbols of each type: a total of 40 symbols, i.e. 7 for roads, 17 for buildings and 16 for trees. Therefore, according to the definitions in Equations (2) and (3), both maps shown in Figure 1 convey the same amount of information, i.e. H = 1.5. However, the reality is that the distributions of symbols on these two maps are very different. In Figure 1(a), the map symbols are mostly located on the right side of the diagonal running from lower left to upper right, the tree symbols are scattered among the buildings, and two rows of buildings lie along the main road. In Figure 1(b), however, there is an area of trees on the left side of that diagonal and an area of buildings on the opposite side, with the roads running almost along the diagonal. Indeed, the two maps represent spatially very different realities.
Figure 1. Two maps with the same number of symbols but with different distributions
In other words, the entropy computed in this way only takes into account the number of symbols of each type; the spatial arrangement of these symbols is completely neglected. Such a value is purely statistical and is thus termed "statistical information" in this paper. Indeed, it does not mean much in a spatial sense, so the usefulness of such a measure is doubtful.
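To make the point concrete, the following Python sketch (added for illustration, not from the original paper) computes the statistical information of the maps in Figure 1 from Equations (2) and (3); because the measure depends only on the symbol counts, maps 1(a) and 1(b) necessarily receive the same value:

```python
import math

def map_entropy(counts):
    """Statistical information of a map from its symbol counts K_i,
    using P_i = K_i / N (Equation 2) and H = -sum(P_i log2 P_i) (Equation 3)."""
    n = sum(counts)  # N, the total number of symbols on the map
    return sum(-(k / n) * math.log2(k / n) for k in counts if k > 0)

# Symbol counts shared by both maps in Figure 1:
# 7 roads, 17 buildings and 16 trees (N = 40).
print(f"{map_entropy([7, 17, 16]):.2f}")  # 1.49, rounded to H = 1.5 in the text
```

The spatial arrangement of the 40 symbols never enters the computation, which is exactly the shortcoming discussed above.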
2.3 Topological information of a map: entropy of neighbourhood
Neumann (1994) proposed a method to estimate the topological information of a map. The method consists of two steps: (a) classify the vertices according to some rules, such as their neighbouring relations, to form a dual graph, and (b) compute the entropy with Equations (2) and (3). The method for the generation of such a dual graph was put forward by Rashevsky (1955).
Figure 2(a) shows a dual graph which consists of seven vertices at three levels. There are three types of vertices if classified by the number of neighbours: four vertices with only one neighbour, one vertex with two neighbours and two vertices with three neighbours. Then N = 7 and M = 3, and thus the probabilities of these three types of vertices are $\frac{4}{7}$, $\frac{1}{7}$ and $\frac{2}{7}$. The entropy of this dual graph, computed using Equation (3), is 1.38.
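As an illustration of steps (a) and (b), the Python sketch below (not from the original paper; the adjacency list is an assumed three-level tree consistent with the vertex counts described above, since Figure 2(a) itself is not reproduced here) classifies the vertices of a seven-vertex dual graph by their number of neighbours and computes the entropy of the resulting classes:

```python
import math
from collections import Counter

# Assumed adjacency list of a seven-vertex, three-level dual graph:
# one root with two neighbours, two middle vertices with three neighbours,
# and four leaves with one neighbour each.
graph = {
    1: [2, 3],
    2: [1, 4, 5],
    3: [1, 6, 7],
    4: [2], 5: [2], 6: [3], 7: [3],
}

# Step (a): classify vertices by their number of neighbours.
class_sizes = Counter(len(nbrs) for nbrs in graph.values())
# class_sizes == {1: 4, 2: 1, 3: 2}, i.e. K_i = 4, 1, 2 with N = 7

# Step (b): apply Equations (2) and (3) to the class sizes.
n = sum(class_sizes.values())
h = sum(-(k / n) * math.log2(k / n) for k in class_sizes.values())
print(round(h, 2))  # 1.38
```

Any seven-vertex graph with the same degree profile would yield the same value, since only the sizes of the neighbourhood classes enter the entropy.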