Full text: Proceedings, XXth congress (Part 5)
Istanbul 2004
  
SOLUTION TO JOINT ENTROPY AND ITS APPLICATIONS IN REMOTE SENSING 
Xiaokun Zhu, Yonghong Jia
School of Remote Sensing Information Engineering, Wuhan University, 430079 Wuhan, Hubei Province, P. R. China
icerain1213@yahoo.com.cn, yhjia2000@sina.com 
Commission V, WG V/4 
KEY WORDS: Statistics, Application, Multispectral, Quality, Three-dimensional, Method 
ABSTRACT: 
Based on Shannon's theory, joint entropy is a statistical mean of information content and can be employed to evaluate image
information. However, the computational complexity of joint entropy limits its applications in remote sensing. In this paper, a
method using an index data structure to solve this problem is introduced; in comparison with other methods, joint entropy
calculated by the new solution reaches consistent, or even better, results in application to optimum band selection and quality
assessment of image fusion.
1. INTRODUCTION 
Joint entropy is a definition coming from information theory 
and signal processing, and it is an objective assessment criterion 
of information content (Jiang, 2001). When it is employed in
remote sensing to assess the quality of images, a space-time
complexity problem may arise in the calculation of
three-dimensional or higher-dimensional joint entropy (Liu, et
al., 1999). That is, the straightforward computation by definition
takes up a great deal of storage, which increases space
complexity; in extreme cases the calculation cannot be carried out at all.
Therefore, a new method for computing joint entropy with 
index data structure is proposed to improve the efficiency, and 
some experiments are conducted to evaluate the performance of 
this algorithm. Then its applications in optimum band selection 
and quality assessment of image are discussed. 
2. SOLUTION TO JOINT ENTROPY 
2.1 Definition 
Shannon was the first person to introduce joint entropy in the 
quantification of information. In his theory, the probabilistic 
concept was employed in modelling message communication 
and he believed that a particular message is one element from a 
set of all possible messages (Shannon, 1948). Joint entropy is a 
statistical mean of the probabilities (uncertainties) of signal
sources. Based on Shannon's theory, discrete multidimensional
joint entropy is defined as:
$$H(X, Y, \ldots, Z) = -\sum_{i=1}^{r}\sum_{j=1}^{s}\cdots\sum_{n=1}^{k} P(x_i, y_j, \ldots, z_n)\,\log_2 P(x_i, y_j, \ldots, z_n) \qquad (1)$$

where $P(x_i, y_j, \ldots, z_n)$ represents the joint probability of
$x_i, y_j, \ldots, z_n$, and $r, s, \ldots, k$ are the upper limits of $i, j, \ldots, n$.
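As a small worked instance (mine, not from the paper), formula (1) in two dimensions reduces to $H(X,Y) = -\sum_{i}\sum_{j} P(x_i, y_j)\log_2 P(x_i, y_j)$; for a uniform joint distribution over four cases this gives exactly 2 bits:

```python
import numpy as np

# Toy two-dimensional joint distribution P(x_i, y_j), uniform over 4 cases.
# Formula (1) with two variables: H(X, Y) = -sum_ij P_ij * log2(P_ij).
P = np.array([[0.25, 0.25],
              [0.25, 0.25]])
H = -np.sum(P * np.log2(P))   # uniform over 4 cases -> 2 bits
```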
From this formula we can get three-dimensional joint entropy 
formula applied to remote sensing images: 
$$H(X, Y, Z) = -\sum_{i=0}^{255}\sum_{j=0}^{255}\sum_{k=0}^{255} P_{ijk}\,\log_2 P_{ijk} \qquad (2)$$
This expression could be employed to evaluate information 
content in application of three bands selection of multispectral 
images and also could be applied to evaluate the quality of the 
colour image that is combined by red, green and blue. In 
calculation of joint entropy, it is necessary to track the number
of occurrences of each possible case. According to formula (2),
the total number of possible combinatorial cases is
256x256x256 = 2^24, and an equivalent amount of space must be
reserved (Figure 1a), which leads to the problem of space-time
complexity. In fact, it is rare for all 2^24 cases to appear at
the same time, unless the image contains at least 2^24 pixels.
Generally, most combinatorial cases do not appear, so there is no
need to allocate memory space for each combinatorial case.
If only the existing cases used in the computation of
joint entropy are recorded, memory space is saved to some extent.
However, locating each case then takes much time.
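To make the space problem concrete, a minimal sketch of the definition-based computation (my illustration, not the paper's code) might look as follows; the single 256x256x256 counter array already occupies about 64 MB, before any entropy is computed:

```python
import numpy as np

def joint_entropy_dense(b1, b2, b3):
    """Three-dimensional joint entropy by direct definition:
    one counter is reserved for every possible (i, j, k) grey-value
    triple, i.e. 256**3 = 2**24 counters (~64 MB as uint32)."""
    counts = np.zeros((256, 256, 256), dtype=np.uint32)
    for i, j, k in zip(b1.ravel(), b2.ravel(), b3.ravel()):
        counts[i, j, k] += 1                      # tally each occurring case
    p = counts[counts > 0] / counts.sum()         # probabilities of occurring cases
    return float(-(p * np.log2(p)).sum())         # H(X, Y, Z) in bits, formula (2)
```

For typical images only a tiny fraction of the 2^24 counters is ever non-zero, which is the waste the index structure in the next section avoids.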
2.2 Solution to Joint Entropy 
Considering the characteristics of remote sensing images, the 
three bands in one colour image, are generally highly correlated. 
This means that many combinatorial cases are repeated, and it is
wasteful to allocate such a large space for every possible case. In order to
improve the efficiency of the original algorithm, we introduce
an index data structure (Figure 1b) that records every
occurring case efficiently. According to this method, an index
structure with principal indexes and subsidiary indexes is 
established for combinational cases (i, j, k). The principal 
indexes ranked from 0 to 255 in turn represent the grey values 
of the first band in the colour image. The subsidiary indexes 
include three parts: the first and the second parts are the grey
values of the second and third bands; the third part is the
number of occurrences of this combination. In this way, this algorithm
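The index structure described above can be sketched roughly as follows (a minimal illustration under my own naming, not the authors' implementation): the principal index is an array of 256 slots keyed by the first band's grey value, and each slot holds subsidiary entries mapping a (band-2 value, band-3 value) pair to its occurrence count, so only occurring cases consume memory:

```python
import numpy as np

def joint_entropy_indexed(b1, b2, b3):
    """Joint entropy via principal/subsidiary indexes:
    principal index = grey value of band 1 (0..255); each subsidiary
    entry maps (band-2 value, band-3 value) -> occurrence count."""
    principal = [dict() for _ in range(256)]      # 256 principal index slots
    total = 0
    for i, j, k in zip(b1.ravel(), b2.ravel(), b3.ravel()):
        sub = principal[i]
        sub[(j, k)] = sub.get((j, k), 0) + 1      # third part: occurrence count
        total += 1
    h = 0.0
    for sub in principal:                         # only existing cases are stored
        for n in sub.values():
            p = n / total
            h -= p * np.log2(p)                   # accumulate -p * log2(p)
    return h
```

Memory use is proportional to the number of distinct (i, j, k) triples actually present in the image rather than to the full 2^24 case space.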