Based on the discussions held within the community, the following definition is proposed: data fusion is a formal framework in which are expressed means and tools for the alliance of data originating from different sources; it aims at obtaining information of greater quality, the exact definition of 'greater quality' depending upon the application. The word 'quality' is not given a precise meaning here; it expresses that the resulting information is more
satisfactory for the « customer » when performing the fusion process than without it. For example, a better quality may be an increase in accuracy, or the production of more relevant information.
In this definition, the spectral channels of the same sensor are to be considered as different sources, as are images taken at different instants.
It has then been suggested to use the terms merging and combination in a much broader sense than fusion, with combination being even broader than merging. These two terms denote any process that implies a mathematical operation performed on at least two sets of information. These definitions are intentionally very loose and leave room for various interpretations. Merging and combination are not defined in opposition to fusion; they are simply more general, also because such terms are often needed to describe processes and methods in a general way, without entering into details. Integration may play a similar role, though it implicitly refers more to concatenation (i.e. increasing the state vector) than to the extraction of relevant information.
Another domain pertains to data fusion: data assimilation, or optimal control. Data assimilation deals with the inclusion of measured data into numerical models for the forecasting or analysis of the behaviour of a system. A well-known example of a mathematical technique used in data assimilation is Kalman filtering. Data assimilation is used daily for weather forecasting.
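As a minimal illustration (not part of the original text), the sketch below assimilates a single noisy observation into a scalar model forecast with a Kalman update; the state, variances and observed value are purely hypothetical.

def kalman_update(x_forecast, P_forecast, z, R, H=1.0):
    # x_forecast: model forecast of the state (e.g. a temperature)
    # P_forecast: variance of the forecast error
    # z, R: observed value and variance of the measurement error
    # H: observation operator mapping the state to measurement space
    K = P_forecast * H / (H * P_forecast * H + R)        # Kalman gain
    x_analysis = x_forecast + K * (z - H * x_forecast)   # corrected state
    P_analysis = (1.0 - K * H) * P_forecast              # reduced uncertainty
    return x_analysis, P_analysis

# Hypothetical numbers: the model predicts 21.0 degC, a sensor reports 23.0 degC.
x, P = kalman_update(x_forecast=21.0, P_forecast=4.0, z=23.0, R=1.0)
print(x, P)   # the analysis lies between forecast and observation, weighted by the variances

The analysis is pulled towards whichever of the forecast and the observation is the less uncertain, which is the essence of assimilating measured data into a numerical model.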
Fusion may be performed at different levels: at measurement level, at attribute level, and at rule or decision level. These terms, as well as others related to information, are defined in the following. These definitions are those used in information theory and can be found in several publications (e.g., Bijaoui 1981; Lillesand, Kiefer 1994; Kanal, Rosenfeld 1981; Tou, Gonzalez 1974).
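As an illustrative sketch (hypothetical values, not from the original text), the three levels can be contrasted for a single pixel observed by two co-registered sources: raw signals are combined at measurement level, derived features at attribute level, and final labels at decision level.

import numpy as np

signal_a, signal_b = 0.42, 0.47                       # raw measurements from two sources
feat_a = np.array([0.42, 0.10])                       # attributes of source A (value, texture)
feat_b = np.array([0.47, 0.80])                       # attributes of source B
label_a, label_b = "water", "water"                   # independent classifications

fused_measurement = (signal_a + signal_b) / 2.0       # measurement level: combine the signals
fused_attributes = np.concatenate([feat_a, feat_b])   # attribute level: one state vector from both
fused_decision = label_a if label_a == label_b else "unknown"   # decision level: combine labels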
A measurement is primarily the output of a sensor. It is also called a signal, or an image in the 2-D case. The elementary support of the measurement is a pixel in the case of an image, and is called a sample in the general case. By extension, measurement denotes the raw information. For example, a verbal report is a piece of raw information and may be considered as a signal. In remote sensing, in the visible range, the measurements are digital numbers that can be converted into radiances once the calibration operations have been performed. If corrections for the sun angle are applied, one may obtain reflectances, which are still considered as a signal.
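For a visible channel, this chain can be written explicitly. The sketch below applies the usual linear calibration and the standard top-of-atmosphere reflectance formula; the gain, offset, solar irradiance and sun zenith angle are hypothetical values chosen only for illustration.

import numpy as np

def dn_to_radiance(dn, gain, offset):
    # Linear calibration: radiance in W m-2 sr-1 um-1.
    return gain * dn + offset

def radiance_to_toa_reflectance(radiance, esun, sun_zenith_deg, d_au=1.0):
    # Top-of-atmosphere reflectance from radiance, solar irradiance and sun angle.
    return (np.pi * radiance * d_au ** 2) / (esun * np.cos(np.radians(sun_zenith_deg)))

dn = np.array([87, 112, 140])                         # raw digital numbers (hypothetical)
radiance = dn_to_radiance(dn, gain=0.6, offset=-1.5)  # hypothetical calibration coefficients
reflectance = radiance_to_toa_reflectance(radiance, esun=1550.0, sun_zenith_deg=35.0)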
An object is defined by its properties, e.g., its colour, its materials, its shape, its neighbourhood, etc. It can be a field, a building, the edge of a road, a cloud, an oceanic eddy, etc. For example, if a classification has been performed on a multispectral image, the pixels belonging to the same class can be spatially aggregated. This results in a map of objects having a spatial extension of several pixels. By extension, a pixel may be considered as an object.
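One common way to perform this spatial aggregation, sketched below on a small hypothetical classification map, is to group connected pixels carrying the same class label with scipy.ndimage.label.

import numpy as np
from scipy import ndimage

# Hypothetical classification of a 5 x 5 image (0 = background, 1 = water, 2 = built-up).
classes = np.array([[1, 1, 0, 2, 2],
                    [1, 1, 0, 2, 2],
                    [0, 0, 0, 0, 0],
                    [2, 2, 0, 1, 1],
                    [2, 2, 0, 1, 1]])

objects = {}
for c in np.unique(classes):
    if c == 0:
        continue                                   # background pixels do not form objects
    labelled, count = ndimage.label(classes == c)  # connected groups of pixels of class c
    objects[int(c)] = (labelled, count)            # each group is an object of several pixels

print({c: count for c, (lab, count) in objects.items()})   # number of objects per class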
An attribute is a property of an object. For example, the
classification of a multispectral image allocates a class to each
pixel; this class is an attribute of the pixel. The equivalent terms
label, category or taxon are also used in classification. Another
well-known example is the spatial context of a pixel, computed
by local variance, a structure function, or any spatial operator.
This operation can be extended to time context in the case of
time-series of measurements. Equivalent terms are local
variability, local fluctuations, spatial or time texture, or pattern.
By extension, any information extracted from an image (or
mono-dimensional signal) is an attribute for the pixel or the
object. The aggregation of measurements made for each of the
elements of the object (for example, the pixels or samples
constituting the object), such as the mean value, is an attribute.
Some authors call such an attribute, derived from statistical operations on measurements, a mathematical attribute. Feature is equivalent to attribute.
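The spatial-context attribute mentioned above can be computed, for instance, as the local variance in a moving window. The sketch below uses the identity var = E[x^2] - E[x]^2 with uniform filters; the image is randomly generated for illustration.

import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(image, size=3):
    # Local variance in a size x size window: a spatial-context attribute per pixel.
    mean = uniform_filter(image, size=size)
    mean_sq = uniform_filter(image * image, size=size)
    return mean_sq - mean * mean

image = np.random.default_rng(0).random((64, 64))   # hypothetical single-band image
texture = local_variance(image, size=5)             # one attribute value for each pixel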
The properties of an object constitute the state vector of this
object. This state vector describes the object, preferably in a
unique way. The state vector is also called feature vector, or
attribute vector. The common property of the elements of the
state vector is that they all describe the same object. If the
object is a pixel (or a sample), the state vector may contain the
measurements as well as the attributes extracted from the
processing of the measurements.
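For a single pixel, such a state vector may simply be the concatenation of its measurements and of the attributes derived from them; the band values and attribute values below are hypothetical.

import numpy as np

measurements = np.array([87.0, 112.0, 140.0])   # digital numbers in three spectral bands
attributes = np.array([4.2, 3.0])               # e.g. local variance and a class label code
state_vector = np.concatenate([measurements, attributes])   # describes this single pixel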
Works in pattern recognition have drawn an analogy with the
syntax of a language. Terms of higher semantic content have
been defined, such as rules and decisions. Rules, like the syntax
rules in language, define relationships between objects and their
state vectors, and also between attributes of the same state vector. Rules may be state equations, mathematical operations, or methods (that is, suites of operations, i.e. of elementary rules). They are often expressed in an elaborate language. Known examples of such rules are those used in artificial intelligence and expert systems. Decisions result from the application of rules to a set of rules, objects and state vectors. Fusion may
also be performed on decisions.
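A toy illustration (hypothetical thresholds and class names): each rule relates attributes of a state vector, and the decision results from applying the set of rules to that state vector.

def decide(state):
    # Apply simple if-then rules to a pixel's state vector (here a dict of attributes).
    rules = [
        (lambda s: s["ndvi"] > 0.5, "vegetation"),
        (lambda s: s["ndvi"] < 0.0 and s["nir"] < 0.1, "water"),
        (lambda s: s["texture"] > 0.3, "built-up"),
    ]
    for condition, label in rules:   # the decision is the first rule whose condition holds
        if condition(state):
            return label
    return "unknown"

print(decide({"ndvi": 0.62, "nir": 0.35, "texture": 0.05}))   # -> vegetation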
A fusion system can be a very complicated system. It is composed of sources of information, of means of acquisition of this information, of communications for the exchange of information, and of intelligence to process the information and to issue information of higher content. The issues involved may be separated into topological and processing issues. Despite the interconnection between both issues in an integrated fusion system design, they can be decoupled from each other in order to facilitate the development of a systematic methodology of analysis and synthesis of a fusion system (Thomopoulos 1990, 1991).
The topological issues address the problem of the spatial distribution of sensors, the communication network and issues for the exchange of information, and the availability and reliability of information at the time of the fusion. The cost of acquiring the information may also be relevant to the topological issues. In remote sensing, these issues are partly addressed by the space agencies and by the image vendors. They are also partly addressed by the customer, given their objectives and constraints, including the financial budget.
The processing issues address the question of how to fuse the data, i.e. how to select the proper measurements, determine the relevance of the data to the objectives, and select the fusion methods and architectures, once the data are available.
5. CONCLUSION
Needs expressed by the remote sensing community in Europe have led to the creation of a SIG on data fusion. This SIG has tackled the problem of terms of reference. A new definition of data fusion is now proposed which emphasises the concepts and the fundamentals in remote sensing.
Several other terms are also proposed, most of which are already widely used in the scientific community, especially the part dealing with information. These terms of reference will be published on the Web site (www-datafusion.cma.fr) of the SIG.
Besides ensuring the communication between its members and
the dissemination of information, the SIG is now undertaking
an inventory of methods and tools, and is also thinking about
instruments for the assessment of quality in data fusion.