The computation is carried out for several values of αi, and the value with the highest average log likelihood is selected. Once an appropriate value of αi is estimated, the proposed covariance matrix is substituted into the MLC. Evaluation of the probability density function requires the inverse of the covariance matrix, so the matrix needs to be non-singular.
The sample covariance estimate can be singular if fewer than n + 1 samples are available. The proposed covariance estimate, however, will be non-singular as long as the sample covariance matrix has non-zero diagonal elements, which is usually the case when more than one sample is available.
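The non-singularity point can be illustrated with a small pure-Python sketch. The mixture S(α) = (1 − α)S + α·diag(S) used below is one assumed form of a diagonally regularized estimator, and the two-sample data are illustrative: with only two samples in two dimensions the sample covariance has rank one and is singular, but blending it with its own diagonal restores a non-zero determinant whenever the diagonal elements are non-zero.

```python
def sample_cov_2d(samples):
    """Unbiased 2x2 sample covariance of a list of (x, y) pairs."""
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    sxx = sum((x - mx) ** 2 for x, _ in samples) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in samples) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in samples) / (n - 1)
    return [[sxx, sxy], [sxy, syy]]

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def blend_with_diagonal(m, alpha):
    """(1 - alpha) * m + alpha * diag(m): the diagonal is unchanged,
    off-diagonal terms shrink toward zero."""
    return [[m[0][0], (1 - alpha) * m[0][1]],
            [(1 - alpha) * m[1][0], m[1][1]]]
```

For example, the two samples (0, 0) and (1, 1) give a singular sample covariance, while any 0 < α ≤ 1 makes the blended matrix invertible.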
The only remaining quantity is the sample covariance matrix Σi/k of class i computed without sample k:

Σi/k = (1 / (Ni − 2)) Σ j≠k (xj − mi/k)(xj − mi/k)T    (5)

where mi/k is the mean of class i without sample k and the sum runs over the remaining Ni − 1 samples.
The term (Ni − 2) requires at least three samples in each class.
The covariance estimate will usually be non-singular with as
few as three training samples per class, regardless of the
dimension of the data.
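The leave-one-out selection procedure described above can be sketched in pure Python for the two-dimensional case. The mixture form (1 − α)S + α·diag(S), the helper names, the α grid and the sample values are illustrative assumptions, not taken from the paper; the normalisation by (Ni − 2) follows Eq. (5).

```python
import math

def mean2(pts):
    """Mean of a list of 2-D points."""
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def loo_cov2(pts, k):
    """Mean and 2x2 covariance of the class samples with sample k held out,
    normalised by (Ni - 2) as in Eq. (5)."""
    rest = [p for j, p in enumerate(pts) if j != k]
    m = mean2(rest)
    d = len(pts) - 2
    sxx = sum((x - m[0]) ** 2 for x, _ in rest) / d
    syy = sum((y - m[1]) ** 2 for _, y in rest) / d
    sxy = sum((x - m[0]) * (y - m[1]) for x, y in rest) / d
    return m, [[sxx, sxy], [sxy, syy]]

def log_gauss2(x, m, s):
    """Log density of a 2-D Gaussian with mean m and covariance s."""
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    dx, dy = x[0] - m[0], x[1] - m[1]
    quad = (dx * (inv[0][0] * dx + inv[0][1] * dy)
            + dy * (inv[1][0] * dx + inv[1][1] * dy))
    return -math.log(2 * math.pi) - 0.5 * math.log(det) - 0.5 * quad

def avg_loo_loglik(pts, alpha):
    """Average leave-one-out log likelihood for one candidate alpha."""
    total = 0.0
    for k, x in enumerate(pts):
        m, s = loo_cov2(pts, k)
        # assumed mixture: (1 - alpha) * S + alpha * diag(S)
        mixed = [[s[0][0], (1 - alpha) * s[0][1]],
                 [(1 - alpha) * s[1][0], s[1][1]]]
        total += log_gauss2(x, m, mixed)
    return total / len(pts)

def select_alpha(pts, grid):
    """Pick the candidate alpha with the highest average LOO log likelihood."""
    return max(grid, key=lambda a: avg_loo_loglik(pts, a))
```

For each class, `select_alpha` is called with that class's training samples and a grid of candidate values in [0, 1]; the winning mixture is then used as the class covariance in the MLC.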
13.5 Support Vector Machines: While data reduction was the focus of the earlier section, the SVM concept deals with situations where the number of features is small but the class boundaries are complex. As we have seen earlier, one way of handling complex boundaries is to use an ANN; an alternative approach is to look for simpler boundaries in a higher-dimensional space created for that specific purpose. The SVM is a concept well known in other areas such as character recognition. The original SVM is intended to solve two-class problems and has been extended to handle multi-class problems.
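The two ideas in this section, a two-class SVM and mapping the data into a higher-dimensional space where a simple boundary suffices, can be sketched in pure Python. The quadratic feature map, the hinge-loss sub-gradient training loop (a Pegasos-style scheme), and the helper names below are illustrative assumptions, not the formulations used in the papers cited here.

```python
def lift(p):
    """Illustrative quadratic feature map: a circular boundary in (x, y)
    becomes a linear one in the lifted space."""
    x, y = p
    return [x * x, y * y, 1.0]  # bias folded in as a constant feature

def train_linear_svm(points, labels, lam=0.01, epochs=2000):
    """Minimise the regularised hinge loss with sub-gradient steps;
    labels are +1 / -1."""
    w = [0.0] * len(points[0])
    t = 0
    for _ in range(epochs):
        for p, y in zip(points, labels):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y * sum(wi * pi for wi, pi in zip(w, p))
            w = [(1 - eta * lam) * wi for wi in w]  # shrink (regulariser)
            if margin < 1:  # inside the margin: push w toward this sample
                w = [wi + eta * y * pi for wi, pi in zip(w, p)]
    return w

def predict(w, p):
    """Sign of the linear decision function in the lifted space."""
    return 1 if sum(wi * pi for wi, pi in zip(w, p)) >= 0 else -1
```

Training on points inside a small circle (label +1) against points on a larger ring (label -1) yields a linear separator in the lifted space, which corresponds to a circular boundary in the original two features.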
Huang et al. [4] have given a comprehensive comparison of SVM, ANN and DTC and indicated that the performance of SVM improves as the number of input bands is increased. The SVM performed better than the NNC when seven bands were used (see Figure 9).
Gualtieri et al. [6] have applied the SVM method to AVIRIS data for four classes and for sixteen classes; accuracies of 96% and 87% respectively are reported.
Figure 9. Classification accuracy of SVM, NNC, DTC and MLC: (a) equal sample size, 7 variables; (b) equal sample rate, 7 variables; (c) equal sample size, 3 variables; (d) equal sample rate, 3 variables.
IAPRS & SIS, Vol. 34, Part 7, "Resource and Environmental Monitoring", Hyderabad, India, 2002
14. CONCLUSION
Although MLC and ISODATA are widely used for classification of remotely sensed data, NN and fuzzy classifiers have been of interest in the recent past and are found to be quite useful. With the advent of hyperspectral data, the need for modifying existing techniques or developing new ones is well recognized. Among the recent methods, SVM appears to hold a lot of promise.
It is pertinent to mention that although ANN models are useful, they are not available in many commercial image-processing packages. Hence, the need for incorporating ANN and other techniques into commercial IP packages cannot be over-emphasized. Availability of these methods in commercial off-the-shelf software would enable a cross-section of the user community to apply them to a variety of data sets.
15. REFERENCES
Reference from Books:
1. Brandt Tso and Paul M. Mather, Classification Methods for Remotely Sensed Data.
References from other literature:
2. Augusteijn et al., Neural network classification and novelty detection, IJRS, Vol. 23, No. 14, July 2002, pp. 2891-2902.
3. Cortijo et al., RDA versus non-parametric classifiers applied to high dimensional images, IJRS, Vol. 20, pp. 3345-3365, 1999.
4. Huang, C., et al., An assessment of support vector machines for land cover classification, IJRS, Vol. 23, No. 4, 2002, pp. 725-749.
5. Joseph P. Hoffbeck and Landgrebe, Covariance matrix estimation and classification with limited training data, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 18, No. 7, pp. 763-767, July 1996.
References from websites:
6. J. Gualtieri et al., Support Vector Machines for Hyperspectral Remote Sensing Classification, available at http://code935.gsfc.nasa.gov/code935/Hyperspectral/Svm u.pdf
7. Luis O. Jimenez and Landgrebe, High Dimensional Projection Pursuit, available at http://www.ece.purdue.edu/~landgreb/JimenezTR.pdf
8. Qiong Jackson and Landgrebe, Design of an Adaptive Classification Procedure for the Analysis of High Dimensional Data with Limited Training Samples, available at http://www.ece.purdue.edu/~landgreb/JacksonTR.pdf
9. Shailesh Kumar et al., A Hierarchical Multiclassifier System for Hyperspectral Data Analysis, available at http://citeseer.nj.nec.com/396983.html
10. www.vertice.org/yearbook/yb2000/Jasani.pdf
11. http://www.maths.uwa.edu.au/~rkealley/ann_all/node136.html