In this implementation, the outputs demonstrate that the Gabor wavelet can generate textures in different directions according to different research needs.
3. TEXTURE TRANSFORM BASED ON GAUSSIAN MIXTURE MODELS (GMM)
3.1 GMM Theory
Recently, GMM, which has been widely used in speech recognition, has been applied in many domains of image processing. Permuter et al. (2006) proposed a method combining multi-scale wavelets with GMM to segment color textures.
In probability theory, according to the central limit theorem, the sum of a large number of independent random variables obeying the same distribution approximately obeys a single Gaussian distribution. Consequently, the single Gaussian is a widespread and effective probability model in one-dimensional space. In fact, any probability density function can be approximated by a linear combination of multiple single Gaussian models, which is usually called a GMM.
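As a minimal 1-D illustration of this linear combination (all component weights, means and variances below are assumed example values, not values from the paper), a weighted sum of single Gaussians realizes a multimodal density that no single Gaussian can match:

import numpy as np

def gaussian_1d(x, mu, sigma):
    # Single Gaussian density N(x; mu, sigma^2)
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-4.0, 8.0, 200)
weights = [0.5, 0.3, 0.2]    # mixture weights, must sum to 1
means = [0.0, 2.0, 4.5]
sigmas = [1.0, 0.6, 1.5]

# Linear combination of three single Gaussians: a simple 1-D GMM density
gmm_density = sum(w * gaussian_1d(x, m, s)
                  for w, m, s in zip(weights, means, sigmas))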
In the description of texture features, a GMM is used to represent the probability distribution of the feature vector under different imaging conditions. The probability density of an image with N pixels is
$$p(X_i) = \sum_{j=1}^{m} P(\omega_j)\, N(X_i;\, \mu_j, \Sigma_j) \qquad (6)$$

where $X_i$ = the gray value of the image under one imaging condition ($i = 1, 2, \ldots, N$)
$m$ = the number of single Gaussians
$P(\omega_j)$ = prior probability, i.e. the weight of the texture in class $j$, which meets Eq. (7)

$$\sum_{j=1}^{m} P(\omega_j) = 1, \qquad P(\omega_j) > 0 \qquad (7)$$
$N(X_i; \mu_j, \Sigma_j)$ is a Gaussian with mean $\mu_j$ and covariance $\Sigma_j$, given in Eq. (8)

$$N(X_i; \mu_j, \Sigma_j) = \frac{1}{(2\pi)^{d/2}\, |\Sigma_j|^{1/2}} \exp\!\left[ -\frac{1}{2} (X_i - \mu_j)^{T} \Sigma_j^{-1} (X_i - \mu_j) \right] \qquad (8)$$
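As a hedged illustration of Eqs. (6)-(8), the Python sketch below evaluates the mixture density for one feature vector; the function and variable names are assumptions made for this example only:

import numpy as np

def gaussian(x, mu, cov):
    # Multivariate Gaussian N(x; mu_j, Sigma_j) of Eq. (8)
    d = len(mu)
    diff = x - mu
    norm = np.sqrt((2.0 * np.pi) ** d * np.linalg.det(cov))
    return np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff) / norm

def mixture_density(x, weights, means, covs):
    # p(X_i) = sum_j P(omega_j) N(X_i; mu_j, Sigma_j), Eq. (6);
    # the weights must satisfy Eq. (7), i.e. sum to one
    return sum(w * gaussian(x, mu, cov)
               for w, mu, cov in zip(weights, means, covs))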
A GMM is fully specified once the parameters of mean $\mu_j$, covariance $\Sigma_j$ and weight $P(\omega_j)$ are obtained.
3.2 Parameter Solution in GMM
The EM (Expectation Maximization) algorithm is a widely used maximum likelihood algorithm for solving the parameters of a GMM. When solving the parameters with EM, the following two steps are carried out. First, the expectation of the likelihood function is calculated (the E step). Second, the parameter values are updated by maximizing this expectation (the M step).
(1) Initialize the Gaussian parameters ($\mu_j$, $\Sigma_j$) and weight $P(\omega_j)$. The posterior probability that texture feature vector $X_i$ belongs to class $j$ can be represented as Eq. (9) according to Bayesian theory

$$P(\omega_j \mid X_i) = \frac{P(\omega_j)\, N(X_i; \mu_j, \Sigma_j)}{\sum_{k=1}^{m} P(\omega_k)\, N(X_i; \mu_k, \Sigma_k)} \qquad (9)$$
Combined with Eq. (6), Eq. (9) can also be represented as

$$P(\omega_j \mid X_i) = \frac{P(\omega_j)\, N(X_i; \mu_j, \Sigma_j)}{p(X_i)} \qquad (10)$$
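A minimal sketch of this E step, reusing the gaussian() helper from the sketch above, computes the posterior of every class for every sample:

import numpy as np

def e_step(X, weights, means, covs):
    # Posteriors P(omega_j | X_i) of Eqs. (9)-(10) for all N samples
    N, m = len(X), len(weights)
    post = np.empty((N, m))
    for i, x in enumerate(X):
        joint = np.array([weights[j] * gaussian(x, means[j], covs[j])
                          for j in range(m)])
        post[i] = joint / joint.sum()  # denominator is p(X_i), Eq. (6)
    return post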
(2) Update the Gaussian parameters ($\mu_j$, $\Sigma_j$) and weight $P(\omega_j)$

$$P(\omega_j) = \frac{1}{N} \sum_{i=1}^{N} P(\omega_j \mid X_i) \qquad (11)$$

$$\mu_j = \frac{\sum_{i=1}^{N} P(\omega_j \mid X_i)\, X_i}{N\, P(\omega_j)} \qquad (12)$$

$$\Sigma_j = \frac{\sum_{i=1}^{N} P(\omega_j \mid X_i)\, (X_i - \mu_j)(X_i - \mu_j)^{T}}{N\, P(\omega_j)} \qquad (13)$$
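The corresponding M step, again as a hedged sketch with assumed array shapes (X is N x d, post is the N x m output of e_step above), implements the three updates directly:

import numpy as np

def m_step(X, post):
    # Update weights, means and covariances, Eqs. (11)-(13)
    N, m = post.shape
    Nj = post.sum(axis=0)               # effective counts N * P(omega_j)
    weights = Nj / N                    # Eq. (11)
    means = (post.T @ X) / Nj[:, None]  # Eq. (12)
    covs = []
    for j in range(m):
        diff = X - means[j]
        covs.append((post[:, j, None] * diff).T @ diff / Nj[j])  # Eq. (13)
    return weights, means, covs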
(3) Iterate Eq. (10) to Eq. (13) until $p(X)$ converges. The optimum estimates of the Gaussian parameters ($\mu_j$, $\Sigma_j$) and weight $P(\omega_j)$ are then obtained.
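Putting the two steps together, a minimal sketch of the whole iteration (the tolerance and iteration cap are illustrative choices, not values from the paper) monitors the log-likelihood of the data for convergence:

import numpy as np

def fit_gmm(X, weights, means, covs, tol=1e-6, max_iter=200):
    # Alternate E and M steps until log p(X) stops improving
    prev_ll = -np.inf
    for _ in range(max_iter):
        post = e_step(X, weights, means, covs)    # Eqs. (9)-(10)
        weights, means, covs = m_step(X, post)    # Eqs. (11)-(13)
        ll = sum(np.log(mixture_density(x, weights, means, covs)) for x in X)
        if abs(ll - prev_ll) < tol:               # p(X) has converged
            break
        prev_ll = ll
    return weights, means, covs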
3.3 Transformation Function
In essence, the transformation of texture feature descriptions under different imaging conditions can be implemented as a process of establishing a texture mapping. Each class of texture features can be represented by a single Gaussian, so that the transformation function F can be established on the basis of the GMM by training on texture feature vectors $Y_s$ under one condition and $Y_t$ under another. F is piecewise linear and given in Eq. (14)
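Eq. (14) itself is not reproduced above, so the sketch below only illustrates one common piecewise-linear GMM mapping of this kind, in which each class shifts the source feature toward the corresponding target mean, weighted by the class posterior; all symbols in it are assumptions made for the example, not the paper's definitions:

import numpy as np

def transform(y, weights, mu_s, cov_s, mu_t):
    # Hypothetical piecewise-linear mapping F: posterior-weighted sum of
    # per-class linear terms (here, simple mean shifts source -> target)
    m = len(weights)
    joint = np.array([weights[j] * gaussian(y, mu_s[j], cov_s[j])
                      for j in range(m)])
    post = joint / joint.sum()            # P(omega_j | y)
    return y + sum(post[j] * (mu_t[j] - mu_s[j]) for j in range(m))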