Supervised classification methods are trained on labeled data.
As the number of spectral bands increases, the amount of training
data required for classification also increases. As a rule of
thumb, the minimum number of training samples for each class is
10N, where N is the number of bands (Swain and Davis, 1978). The
number of training pixels used for each class is given in Table 2.
  
  
  
Class Name              Training data (No. of pixels)
Wood                    1290
Grass-pasture           467
Soybeans-notill         1108
Corn                    700
Corn-notill             1527
Hay-windrowed           630
Grass/trees             868
Alfalfa                 92
Oats                    303
Grass-pasture-mowed     2645
Soybeans-clean          710
Corn-min                921

Table 2. Number of training pixels for classification
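To make the 10N guideline concrete, the following is a minimal Python sketch (not part of the original paper) that checks the class sizes of Table 2 against the guideline for the original 195 bands and for the reduced dimensionalities considered later (101, 54, 30, 18 and 12 components).

```python
# Illustrative check of the 10N training-sample guideline (Swain and Davis, 1978)
# against the class sizes of Table 2. The reduced dimensionalities follow Table 3.
train_pixels = {
    "Wood": 1290, "Grass-pasture": 467, "Soybeans-notill": 1108, "Corn": 700,
    "Corn-notill": 1527, "Hay-windrowed": 630, "Grass/trees": 868,
    "Alfalfa": 92, "Oats": 303, "Grass-pasture-mowed": 2645,
    "Soybeans-clean": 710, "Corn-min": 921,
}

for n_bands in (195, 101, 54, 30, 18, 12):
    minimum = 10 * n_bands                      # 10N rule of thumb
    short = [c for c, n in train_pixels.items() if n < minimum]
    print(f"N={n_bands}: guideline is >= {minimum} pixels per class; "
          f"{len(short)} of {len(train_pixels)} classes fall below it")
```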
Four statistical supervised classification methods were selected to
test both PCA and the automatic wavelet reduction technique:
Maximum Likelihood (ML), Mahalanobis distance (MB),
Minimum Distance (MD) and Parallelepiped (PP).
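As a point of reference for how these decision rules work, here is a minimal NumPy sketch; it is not the implementation used in the paper, and the training-data layout (a dict of per-class pixel arrays), the equal class priors and the parallelepiped width k are illustrative assumptions.

```python
import numpy as np

def fit_class_stats(train):
    """Per-class mean, covariance and per-band std from training pixels.
    train: dict {class_name: array of shape (n_c, N)}."""
    return {name: {"mean": X.mean(axis=0),
                   "cov": np.cov(X, rowvar=False),
                   "std": X.std(axis=0)}
            for name, X in train.items()}

def classify(pixels, stats, method="ML", k=2.0):
    """pixels: (n, N) array; returns an array of predicted class names."""
    names = list(stats)
    scores = np.full((pixels.shape[0], len(names)), np.inf)
    for j, name in enumerate(names):
        m, C, s = stats[name]["mean"], stats[name]["cov"], stats[name]["std"]
        d = pixels - m
        if method in ("ML", "MB"):
            Ci = np.linalg.pinv(C)
            maha = np.einsum("ij,jk,ik->i", d, Ci, d)   # squared Mahalanobis distance
            if method == "ML":
                # Gaussian ML discriminant (equal priors): minimise maha + ln|C|
                maha = maha + np.linalg.slogdet(C)[1]
            scores[:, j] = maha
        elif method == "MD":
            scores[:, j] = (d ** 2).sum(axis=1)          # Euclidean distance to class mean
        elif method == "PP":
            # Parallelepiped: pixel must fall inside mean +/- k*std in every band;
            # pixels outside every box get a default assignment in this sketch.
            inside = (np.abs(d) <= k * s).all(axis=1)
            scores[:, j] = np.where(inside, (d ** 2).sum(axis=1), np.inf)
    return np.array(names)[scores.argmin(axis=1)]
```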
In this work we used a portion of a hyperspectral image acquired by
the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) over an
agricultural area of California, USA, in 1994 (Figure 4). The image
has 195 spectral bands, spaced about 10 nm apart, covering the
spectral region from 0.4 to 2.45 µm, with a spatial resolution of
20 m. The test image is 145 rows by 145 columns, and its
corresponding ground-truth map contains 12 classes. The number of
training pixels for each class is given in Table 2.
  
Figure 4. Test AVIRIS data, California, 1994
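For reference, a minimal sketch of the PCA reduction applied to such a cube is given below; the file name aviris_cube.npy and the use of scikit-learn's PCA are illustrative assumptions, not the processing chain of the paper.

```python
import numpy as np
from sklearn.decomposition import PCA

cube = np.load("aviris_cube.npy")            # assumed shape: (145, 145, 195)
rows, cols, bands = cube.shape
X = cube.reshape(-1, bands).astype(np.float64)

# Reduce to the same numbers of components as the wavelet levels in Table 3.
reduced = {n: PCA(n_components=n).fit_transform(X).reshape(rows, cols, n)
           for n in (101, 54, 30, 18, 12)}
```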
The overall classification accuracies obtained with the two
dimension reduction methods are listed in Table 3.
As shown in Table 3, for the ML algorithm the wavelet reduction
gives 95.73% overall accuracy at the first level of decomposition,
while PCA gives only 95.31%. The same trend is seen for the MB
classification method at all levels of decomposition. The two other
classification methods (MD and PP) are sometimes chosen over ML
classification because of their speed, yet they are known to be much
less accurate than ML classification. Some authors identify two main
factors that make automatic wavelet reduction outperform PCA
(Kaewpijit et al., 2003):
1) The nature of the classifiers, which are mostly pixel-based
techniques and are thus well suited to the wavelet reduction, which
is a pixel-based transformation (sketched below).
2) The low-pass and some of the high-pass portions of the remaining
information content, which are not included in the first PCs, are
still present in the wavelet-reduced data.
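The pixel-based character of the wavelet reduction referred to in point 1 can be illustrated with a short sketch: each pixel's 195-band spectrum is passed through a 1-D discrete wavelet transform along the spectral axis and only the low-pass (approximation) coefficients are kept. The use of PyWavelets and of the Daubechies-4 wavelet here is an assumption for illustration; with that choice and PyWavelets' default boundary handling, decomposition levels 1 to 5 give 101, 54, 30, 18 and 12 coefficients, matching the component counts in Table 3.

```python
import numpy as np
import pywt

def wavelet_reduce(cube, level, wavelet="db4"):
    """Spectral wavelet reduction: keep only the level-'level' approximation."""
    rows, cols, bands = cube.shape
    spectra = cube.reshape(-1, bands)
    # wavedec returns [cA_level, cD_level, ..., cD_1]; element 0 is the low-pass part.
    approx = np.array([pywt.wavedec(s, wavelet, level=level)[0] for s in spectra])
    return approx.reshape(rows, cols, -1)

# For a 195-band spectrum and the 'db4' wavelet, levels 1..5 yield
# 101, 54, 30, 18 and 12 coefficients per pixel, respectively.
```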
Classification        Reduction   No. of Components / Level of Decomposition
Method                Method      101/1     54/2      30/3      18/4      12/5
Maximum Likelihood    Wavelet     95.7356   92.5062   88.9342   84.6901   81.1451
                      PCA         95.3119   91.3712   87.9337   85.3308   82.4436
Mahalanobis Distance  Wavelet     58.9236   58.6642   58.2423   57.4795   58.4189
                      PCA         58.1104   56.9035   56.5256   33.3933   51.7098
Minimum Distance      Wavelet     40.6104   40.5617   40.4415   40.6796   39.6672
                      PCA         40.5239   40.5140   40.5140   40.4842   40.4148
Parallelepiped        Wavelet     27.4137   27.0976   26.8934   26.8224   25.1447
                      PCA         20.1573   21.1996   21.8945   22.1529   20.4247

Table 3. Overall classification accuracy (%) obtained with PCA and Wavelet Reduction
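The overall accuracy reported in Table 3 is simply the percentage of labeled test pixels assigned to their correct class; a minimal sketch, assuming arrays of predicted and ground-truth labels plus a mask marking the labeled pixels:

```python
import numpy as np

def overall_accuracy(pred, truth, labeled_mask):
    """Overall accuracy in percent over the labeled test pixels."""
    pred, truth = np.asarray(pred), np.asarray(truth)
    return 100.0 * np.mean(pred[labeled_mask] == truth[labeled_mask])
```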