
probabilities. Likewise, transitions of 
instantaneous physical measurements from 
one state to another across spectral 
channels may be described by a finite set 
of transition probabilities. This concept 
is fundamental to the application of 
information theory to multispectral 
imagery. 
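As a concrete illustration of this concept, the sketch below estimates such a transition-probability table from two registered spectral channels. It is a minimal example, not code from the paper; the array names, the two-channel restriction and the 8-bit quantization range are all assumptions.

import numpy as np

# Minimal sketch (assumed inputs): channel_a and channel_b are registered,
# 8-bit integer images of identical shape.
def transition_probabilities(channel_a, channel_b, levels=256):
    # Count how often quantization value i in channel_a co-occurs with
    # value j in channel_b at the same resolution element.
    counts = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(counts, (channel_a.ravel(), channel_b.ravel()), 1)
    # Normalize each row so that row i holds the transition probabilities
    # p(j | i) from state i in one channel to state j in the other.
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts), where=row_sums > 0)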
If we model a digital, multispectral image 
as a record of a discrete ergodic Markoff 
process, no other mathematical, geometric 
or qualitative assumptions of any kind are 
necessary and operations for visualizing 
the metrics of information theory are 
exhaustively defined. Spectral, spatial 
or other statistical components such as 
illumination intensity or angle which are 
integrated over the image field are 
treated identically. 
Uncertainty, Entropy and Information 
The occurrence of a quantization value or 
a sequence of quantization values in an 
image is an event with an associated 
probability. The probability of an event, 
and the uncertainty associated with that 
probability, are separate and distinct 
concepts. Information theory is founded 
upon uncertainty and its measurement. In 
information theory, the maximum 
uncertainty possible for any discrete 
event is log 1/N, where N is the number of 
possible discrete event states. For an 
image with a quantization range of 256, 
the maximum possible uncertainty 
measurable for any discrete quantization 
event is log 1/256. For a sequence of 
events in the same image, the maximum 
uncertainty possible for the sequence would be the sum of log 1/256 over the length of the sequence. 
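As a numeric illustration of these bounds (using the conventional base-2 logarithm and making the sign explicit, so the values come out in bits):

import math

N = 256                              # number of possible quantization values
single_event = -math.log2(1 / N)     # maximum uncertainty of one event: 8 bits
sequence_length = 4                  # an illustrative sequence length
sequence_total = sequence_length * single_event   # 32 bits for the sequence
print(single_event, sequence_total)  # 8.0 32.0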
The name given the measurement of 
uncertainty associated with a set of 
events is the "entropy." Since maximum 
uncertainty is conveyed by the least 
probable event, the least probable event 
is said to possess the "maximum entropy." 
Entropy and "information" are commonly 
confused, resulting in a famous paradox of 
information theory that only a perfectly 
random source possesses the maximum 
entropy or information content. This 
paradox arises from information theory's 
use of terminology arising from its 
mathematical roots in classical 
thermodynamics. The paradox disappears if 
we regard entropy as a measure of the 
information required to eliminate the 
uncertainty of an event rather than a 
measure of information, per se. In some 
texts, information is referred to as 
negative-entropy or "negentropy" to 
distinguish it from the Boltzmann entropy 
of thermodynamics. This paper adheres to 
Shannon's original terminology, with the 
above caveats in mind. Entropy is a 
relative abstraction devoid of any 
connection with absolute measurements or 
criteria. Though simple to compute, it 
possesses extraordinary conceptual 
subtlety. 
Computing Visualizations of the 
Information Content 
of Digital Multispectral Imagery 
The first step in the visualization 
process is to scan the multispectral image 
and compute two histograms: a raw 
occurrence histogram and a conditional 
occurrence histogram. The raw occurrence 
histogram is a count of the number of 
occurrences of each quantization value 
within each spectral channel. The 
conditional occurrence histogram is a 
count of co-occurrences of all 
quantization values across all spectral 
channels. 
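A minimal sketch of this first step is given below. It assumes the image is held as an integer array of shape (rows, columns, channels) with 8-bit quantization; the function names and the sparse storage of the co-occurrence table are illustrative choices, not prescriptions from the paper.

import numpy as np
from collections import Counter

def raw_histograms(image, levels=256):
    # Raw occurrence histogram: counts of each quantization value,
    # computed independently for every spectral channel.
    return np.stack([np.bincount(image[..., c].ravel(), minlength=levels)
                     for c in range(image.shape[-1])])

def conditional_histogram(image):
    # Conditional occurrence histogram: counts of co-occurring quantization
    # values across all spectral channels at each resolution element.
    # Stored sparsely, since the full table grows as levels ** channels.
    return Counter(map(tuple, image.reshape(-1, image.shape[-1])))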
The second step in the visualization 
process is the conversion of the raw and 
conditional histograms into tables of 
simple and conditional probabilities which 
represent the simple and conditional 
uncertainties associated with image 
quantization events. It is crucial to 
make the distinction between an absolute 
measure of probability and a measure of 
uncertainty. The maximum uncertainty possible for any quantization event in an image with 256 possible quantization values corresponds to the probability 1/256. Therefore, quantization probabilities differing from 1/256 must be converted into measurements of uncertainty relative to this reference probability. For 
example, probabilities of 1/128 and 1/512 
reflect identical uncertainties in an 
image with a quantization range of 256, 
i.e., each represents the same amount of 
information relative to the maximum 
possible uncertainty. The uncertainty 
associated with any quantization value is 
a direct function of the quantization 
range. 
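One reading of this second step is sketched below: counts become probabilities, and each probability is then expressed as an uncertainty in bits measured relative to the reference probability 1/256. Under this reading the probabilities 1/128 and 1/512 both yield one bit, matching the example above; the exact normalization used by the authors may differ.

import numpy as np

def probabilities(histogram):
    # Simple probabilities from a raw occurrence histogram.
    return histogram / histogram.sum()

def relative_uncertainty(p, levels=256):
    # Uncertainty of probability p, in bits, measured relative to the
    # reference probability 1/levels (the maximum uncertainty for the
    # quantization range): |log2(p) - log2(1/levels)|.
    p = np.asarray(p, dtype=np.float64)
    return np.abs(np.log2(p * levels))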
The third step in the visualization 
process is to re-scan the image and 
compute visualizations of the metrics of 
information theory across all spectral 
channels at each instantaneous resolution 
element. If spatial entropy is also to be 
visualized, the computation is summed over 
nearest neighbor, multispectral spatial 
samples.  Mathematically, visualizations 
of the entropy of any number of registered 
spectral channels may be computed but 
memory and display constraints create a 
practical limit of three or four channels 
on small computers. 
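A simplified sketch of this third step follows. It assumes joint_prob is a mapping from a tuple of co-occurring channel values to the co-occurrence probability built in the second step, and it omits the optional nearest-neighbor spatial summation; the names are illustrative only.

import numpy as np

def entropy_image(image, joint_prob):
    # Re-scan the image and assign every resolution element the
    # uncertainty, in bits, of its multispectral quantization event.
    rows, cols, _ = image.shape
    out = np.zeros((rows, cols), dtype=np.float64)
    for r in range(rows):
        for c in range(cols):
            p = joint_prob[tuple(image[r, c])]   # probability of this spectral vector
            out[r, c] = -np.log2(p)              # its self-information in bits
    return out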
Metrics of Information: Entropy, 
Redundancy and Equivocation 
Computation of the simple entropy across 
spectral channels is defined by Theorem 5 
(Shannon, 1949b). The simple entropy 
assumes that probabilities representing 
uncertainties are independent. 
THEOREM 5: Let p(B_i) be the probability of a sequence B_i of symbols from the source. Let

G_N = - (1/N) Σ_i p(B_i) log p(B_i)

where the sum is over all sequences B_i containing N symbols. Then G_N is a monotonic decreasing function of N and lim_{N→∞} G_N = H, the entropy of the source.
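A minimal computation of G_N, under the assumption that the probabilities p(B_i) of all N-symbol sequences are available, might look as follows (illustrative only):

import math

def G(N, sequence_probs):
    # G_N = -(1/N) * sum_i p(B_i) * log2 p(B_i), summed over all sequences
    # B_i of N symbols; zero-probability sequences contribute nothing.
    return -sum(p * math.log2(p) for p in sequence_probs if p > 0) / N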
 
	        