In a square raster digital image, each pixel is surrounded by
eight neighbouring pixels. The local texture information for a
pixel can be extracted from a neighbourhood of 3 x 3 pixels,
which represents the smallest complete unit (in the sense of
having eight directions surrounding the pixel).
Given a neighbourhood of 3 x 3 pixels, denoted by a set containing nine elements, $V = \{V_0, V_1, \ldots, V_8\}$, where $V_0$ represents the intensity value of the central pixel and $V_i$ ($i = 1, 2, \ldots, 8$) is the intensity value of the neighbouring pixel $i$, we define the corresponding texture unit by a set containing eight elements, $TU = \{E_1, E_2, \ldots, E_8\}$, where $E_i$ ($i = 1, 2, \ldots, 8$) is determined by the formula:

$$
E_i =
\begin{cases}
0 & \text{if } V_i < V_0 \\
1 & \text{if } V_i = V_0 \\
2 & \text{if } V_i > V_0
\end{cases}
\qquad \text{for } i = 1, 2, \ldots, 8,
$$

and the element $E_i$ occupies the same position as the pixel $i$.
As each element of $TU$ has one of three possible values, the combination of all eight elements results in $3^8 = 6561$ possible texture units in total.
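As an illustration, a minimal Python sketch of this definition is given below; the clockwise ordering of the eight neighbours, starting at the top-left corner, is an assumption, since the definition only requires that each element occupy the position of the corresponding pixel.

import numpy as np

def texture_unit(neigh):
    """Texture unit elements E_1..E_8 of a 3 x 3 neighbourhood (numpy array).

    The eight neighbours are visited clockwise from the top-left corner;
    this ordering is an assumption, as the definition only requires that
    E_i occupy the same position as pixel i.
    """
    v0 = neigh[1, 1]                        # central pixel V_0
    positions = [(0, 0), (0, 1), (0, 2),    # neighbour positions 1..8
                 (1, 2), (2, 2), (2, 1),
                 (2, 0), (1, 0)]
    e = []
    for r, c in positions:
        vi = neigh[r, c]
        e.append(0 if vi < v0 else (1 if vi == v0 else 2))
    return e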
There is no unique way to label and order the 6561 texture
units. In our study, the 6561 texture units are labeled by
using the following formula:
$$
N_{TU} = \sum_{i=1}^{8} E_i \cdot 3^{\,i-1}
$$

where $N_{TU}$ represents the texture unit number and $E_i$ is the $i$th element of the texture unit set $TU = \{E_1, E_2, \ldots, E_8\}$.
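Under the same assumptions as the sketch above, the labelling formula translates directly into code:

def texture_unit_number(e):
    """Label a texture unit: N_TU = sum over i of E_i * 3**(i-1).

    With each E_i in {0, 1, 2}, the result lies in the range 0..6560.
    """
    return sum(ei * 3 ** i for i, ei in enumerate(e))  # enumerate index = i - 1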
The previously defined set of 6561 texture units describes the
local texture aspect of a given pixel, that is, the relative grey
level relationships between the central pixel and its
neighbours. Thus, the statistics on frequency of occurrence of
all the texture units over a large region of an image should
reveal texture information. We term the frequency distribution of all the texture units the texture spectrum, with the abscissa indicating the texture unit number $N_{TU}$ and the ordinate representing its occurrence frequency.
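As a sketch of how the texture spectrum could be accumulated over a region, reusing the hypothetical texture_unit and texture_unit_number helpers above (border pixels, which lack a complete 3 x 3 neighbourhood, are simply skipped here):

def texture_spectrum(region):
    """Occurrence frequency of each texture unit number over a region.

    Returns a histogram with 6561 bins, one per possible texture unit.
    """
    rows, cols = region.shape
    spectrum = np.zeros(6561, dtype=np.int64)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            e = texture_unit(region[r - 1:r + 2, c - 1:c + 2])
            spectrum[texture_unit_number(e)] += 1
    return spectrum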
It should be noted that the labeling method chosen may affect
the relative positions of texture units in the texture spectrum,
but will not change their frequency values in the texture
spectrum.
It should also be noted that the local texture of a given pixel is characterized by the corresponding texture unit, while the texture aspect of a uniform texture image is revealed by its texture spectrum calculated within an appropriate window.
The size of the window depends on the nature of the texture
image.
3 TEXTURAL EDGE DETECTION
The principal idea of textural edge detection is to use the texture spectrum as the texture feature and to combine it with
traditional operators. That is, when applying a traditional edge
detection operator over an image, the grey level of each
element of the operator will be replaced by the texture
spectrum calculated from the corresponding neighbourhood.
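As a sketch of this replacement, the scalar grey-level difference of a conventional operator becomes a distance between two texture spectra; the integrated absolute difference used below could be written as:

def spectrum_difference(s_a, s_b):
    """Integrated absolute difference between two texture spectra,
    standing in for the grey-level difference of a conventional edge
    operator (s_a and s_b are 6561-bin histograms)."""
    return int(np.abs(np.asarray(s_a) - np.asarray(s_b)).sum())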
Some texture images have been chosen to evaluate the
method. They are illustrated in Figures 1 and 2. These images
are represented by 512x512 pixels with 64 normalized grey
levels and are composed of six different textured areas. These
six texture images are extracted from Brodatz' album
(Brodatz, 1968). They are respectively the images D4:
pressed cork; D24: pressed calf leather; D29: beach sand;
D38: water; D57: handmade paper and D93: fur hide of
unborn calf. These natural texture images have been chosen
because they are broadly similar to one another, and are
similar to parts of digital images usually encountered in
practice, for example, landscape scenes provided by earth
observation satellites.
The six textures of Figure 1 constitute one vertical boundary
and four diagonal ones, while Figure 2 presents four circles
with two straight lines (horizontal and vertical). This may
represent most of the complex situations encountered in
natural images.
In our study, the Roberts operator was used as the edge
detection operator. Texture spectra were calculated using a
moving window of 30 x 30 pixels. The integrated absolute
difference between two texture spectra has been taken as the
difference between two elements of the edge detection
operator:
$$
R = \sqrt{D_1^2 + D_2^2}
$$
$$
D_1 = \sum_{k=1}^{6561} \left| S_{i,j}(k) - S_{i+1,j+1}(k) \right|
$$
$$
D_2 = \sum_{k=1}^{6561} \left| S_{i,j+1}(k) - S_{i+1,j}(k) \right|
$$

where $S_{i,j}(k)$ denotes the $k$th element of the texture spectrum calculated from the window located at position $(i, j)$. Thus, $D_1$ and $D_2$ give the absolute difference between two texture spectra located in diagonal positions.
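A minimal sketch of this textural Roberts response, assuming the texture_spectrum helper sketched earlier and taking the window's top-left corner at the stated position (the corner convention is an assumption):

def textural_roberts(image, i, j, window=30):
    """Textural Roberts response R = sqrt(D1**2 + D2**2) at position (i, j).

    S(r, c) is the texture spectrum of the window x window patch whose
    top-left corner is at (r, c); this corner convention is an assumption.
    """
    def S(r, c):
        return texture_spectrum(image[r:r + window, c:c + window])

    d1 = np.abs(S(i, j) - S(i + 1, j + 1)).sum()   # diagonal difference D1
    d2 = np.abs(S(i, j + 1) - S(i + 1, j)).sum()   # anti-diagonal difference D2
    return float(np.hypot(d1, d2))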
The convolution using the above operator was carried out over the whole of the images in Figures 1 and 2, with a step of one pixel in both line and column. The results are illustrated respectively in
Figures 3 and 4, where the grey levels of the images are
linearly stretched from 0 to 250.
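The linear stretch of the resulting response map to grey levels 0 to 250 could be implemented, for instance, as follows (a sketch; the exact normalization used for the figures is not specified in the text):

def stretch_to_grey(responses, new_max=250):
    """Linearly stretch an array of edge responses to the range 0..new_max."""
    v = np.asarray(responses, dtype=np.float64)
    span = v.max() - v.min()
    if span == 0.0:
        return np.zeros(v.shape, dtype=np.uint8)
    return np.round((v - v.min()) / span * new_max).astype(np.uint8)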