
2.4 Near Lossless Compression of Edge Protected Images 
If the demand for lossless compression can be relaxed to near-lossless compression, the compression rate can be expected to improve further.
Stereo measurements in modern photogrammetry are essentially accomplished by automatic stereo image matching. The principal factor affecting the accuracy of stereo measurements is the edge features of the images. As long as the edge features are preserved, the accuracy can be expected to remain consistent. Therefore, an edge protection operator is used to filter out high-frequency noise while leaving the edges intact, which strengthens the signal and further improves the compression rate.
Figure 1. Edge Protection Smoothing Operator 
An edge protection smoothing operator is shown in Figure 1, in which the black solid circles represent pixels and the red and blue polygons represent directional pointers. In a window of 5 by 5 pixels, the pointers cover eight directions. The steps for executing this operator are: (1) compute the variance of the gray values of the pixels covered by each pointer; (2) treat the direction with the maximum variance as an edge and the direction with the minimum variance as a flat region; (3) substitute the average value of the minimum-variance direction for the central pixel value; (4) shift the window until the whole image has been traversed. After the application of this operator, the compression rate of an image is expected to improve further.
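The following Python sketch illustrates this procedure. Since Figure 1 itself is not reproduced here, the exact pointer layout is an assumption: each of the eight directional pointers is taken to be the central pixel plus the two pixels extending outward along one compass direction within the 5 by 5 window.

```python
import numpy as np

# Assumed pointer layout: the center pixel plus two pixels outward in each of
# the eight compass directions of the 5 x 5 window (the exact shapes shown in
# Figure 1 may differ).
OFFSETS = [
    [(0, 1), (0, 2)],   [(0, -1), (0, -2)],    # E, W
    [(1, 0), (2, 0)],   [(-1, 0), (-2, 0)],    # S, N
    [(1, 1), (2, 2)],   [(-1, -1), (-2, -2)],  # SE, NW
    [(1, -1), (2, -2)], [(-1, 1), (-2, 2)],    # SW, NE
]

def edge_protect_smooth(img: np.ndarray) -> np.ndarray:
    """Replace each interior pixel by the mean of its minimum-variance pointer,
    so smoothing happens along flat directions and edges are left untouched."""
    out = img.astype(np.float64).copy()
    rows, cols = img.shape
    for r in range(2, rows - 2):              # skip a 2-pixel border so every pointer fits
        for c in range(2, cols - 2):
            best_var, best_mean = np.inf, out[r, c]
            for offsets in OFFSETS:
                vals = [img[r, c]] + [img[r + dr, c + dc] for dr, dc in offsets]
                var = np.var(vals)
                if var < best_var:            # minimum variance = flattest direction
                    best_var, best_mean = var, np.mean(vals)
            out[r, c] = best_mean
    return np.rint(out).astype(img.dtype)
```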
3. EXPERIMENTS AND ANALYSIS 
3.1 Differential Coding 
Two remote sensing images, shown in Figure 2, are used to test the effect of differential encoding. Image (a) is a Landsat-5 TM image composed of Band 5, Band 4 and Band 3, resampled to a spatial resolution of 28.5 meters; its main land cover types include water bodies, farmland, built-up areas and mountainous areas. Image (b) is a SPOT-5 panchromatic image with a spatial resolution of 2.5 meters, in which built-up areas and farmland are the main land cover types. Both images are 500 by 500 pixels.
The following experimental steps are carried out for the 
aforementioned images: 
1. Compute the entropies, auto-correlation coefficients and information amounts of the images.
2. Apply Huffman coding to the images without any decorrelation process.
3. Utilize differential encoding to eliminate the correlation between pixels and encode the difference images.
4. Compute the average code lengths and compression rates of Huffman coding and differential encoding, respectively (a minimal sketch of this pipeline is given after Figure 2).
(a) Landsat-5 TM          (b) SPOT-5
Figure 2. Testing images for differential coding
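As a rough illustration of steps 2 to 4, the following Python sketch builds a Huffman code on the gray-value histogram, applies a first-order unidirectional (horizontal) differential transform, and compares the resulting average code lengths and compression rates. It is only a sketch of the pipeline: the bidirectional variant and the information-amount measure used in the paper are not reconstructed, and the synthetic gradient image is merely a stand-in for the scenes in Figure 2.

```python
import heapq
from collections import Counter
import numpy as np

def entropy(img: np.ndarray) -> float:
    """Shannon entropy of the gray-value histogram, in bits per pixel."""
    _, counts = np.unique(img, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def huffman_avg_length(img: np.ndarray) -> float:
    """Average code length (bits/pixel) of a Huffman code built on the histogram."""
    freq = Counter(img.ravel().tolist())
    if len(freq) == 1:
        return 1.0
    # Heap items: (frequency, tie-breaker, {symbol: current code length}).
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}  # every leaf gains one bit
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    lengths = heap[0][2]
    total = sum(freq.values())
    return sum(freq[s] * lengths[s] for s in freq) / total

def unidirectional_diff(img: np.ndarray) -> np.ndarray:
    """First-order horizontal differential transform (previous pixel as predictor)."""
    d = img.astype(np.int16)
    d[:, 1:] = d[:, 1:] - d[:, :-1]   # the first column keeps the original values
    return d

# Synthetic 8-bit test image: a smooth gradient with mild noise stands in for
# the remote sensing scenes of Figure 2, which are not available here.
x = np.linspace(0, 200, 500)
img = (x[None, :] + 0.5 * x[:, None]
       + np.random.normal(0, 3, (500, 500))).clip(0, 255).astype(np.uint8)

L_direct = huffman_avg_length(img)
L_diff = huffman_avg_length(unidirectional_diff(img))
print("entropy                    : %.2f bits/pixel" % entropy(img))
print("Huffman, no decorrelation  : rate %.2f" % (8.0 / L_direct))
print("Huffman after differencing : rate %.2f" % (8.0 / L_diff))
```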
The results are listed in Table 2. The average code lengths of the compressed images are very close to the entropies of the original images, which confirms that Huffman coding is a good entropy coding algorithm. Compared with Huffman coding, the average code lengths of the images compressed by differential encoding are much closer to the corresponding information amounts. In particular, the average code length of the SPOT panchromatic image compressed by bidirectional differential coding is 3.66, which is very close to its information amount of 3.44. Moreover, the compression rate is nearly twice that of Huffman coding. By means of information measures, analysis of the data characteristics and a sufficient consideration of the correlation within the data help to find effective data transforms that are capable of reducing the data redundancy and improving the compression rates of lossless compression.
Both the unidirectional and the bidirectional differential coding algorithms are first order. Although the average code lengths of the compressed images are closer to the information amounts of the original images, the remaining difference indicates an incomplete decorrelation. Hence, second-order or higher-order differential coding algorithms may be adopted to further reduce the variance and the correlation of the data.
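For illustration only (higher-order coding was not part of the reported experiments), a second-order difference can be obtained by applying the first-order horizontal difference twice; the residual at each interior pixel then equals p(i) - 2p(i-1) + p(i-2), i.e. the pixel is predicted from its two predecessors.

```python
import numpy as np

def horizontal_diff(img: np.ndarray, order: int = 2) -> np.ndarray:
    """Apply horizontal first-order differencing `order` times; each pass removes
    another layer of residual correlation between neighbouring pixels."""
    d = img.astype(np.int32)          # wider type so repeated differences cannot overflow
    for _ in range(order):
        d[:, 1:] = d[:, 1:] - d[:, :-1]
    return d
```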
The essence of the differential transform is a reduction of the variance of the gray values. The grayscale range after the transform is so remarkably narrowed that the standard deviation decreases from 23.05 (Figure 3) to 3.25, which greatly reduces the dispersion of the grayscale distribution. Moreover, the histogram takes on a clearly normal-like shape after the differential transform, as shown in Figure 4.
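The same check can be applied to any grayscale image by comparing the standard deviation of the gray values before and after first-order differencing; the figures 23.05 and 3.25 quoted above come from the paper's own test image and are not reproduced by this sketch.

```python
import numpy as np

def std_before_after(img: np.ndarray) -> tuple[float, float]:
    """Standard deviation of the gray values before and after horizontal differencing."""
    diff = np.diff(img.astype(np.int16), axis=1)
    return float(img.std()), float(diff.std())
```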
The purpose of differential encoding is to reduce the correlation among pixels to the greatest possible extent. The algorithm is simple to implement, which makes it worth applying in satellite Earth data transmission in order to save resources and improve transmission efficiency. Deng and Lin (2009) applied this algorithm to the remote sensing images of the “Beijing-1” micro-satellite; compared with DPCM experiments, its effect is more favorable, since its decorrelation process achieves a compression rate of more than two.