IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, VOL. 35, NO. 2, MAY 1988

Abstract-In this paper a family of texture features is presented that has the ability to discriminate different textures in a 3-D scene as well as the ability to recover the range and orientation of the surfaces of the scene. These texture features are derived from the gray level run length matrices (GLRLM's) of an image. The GLRLM's are first normalized so that they all have equal average gray level run length. Features extracted from the normalized GLRLM's are independent of the surface geometry. Experiments for the discrimination of natural textures have been conducted. The results demonstrated that these features have the ability to discriminate different textures in a nontrivial 3-D scene.

I. INTRODUCTION

[...] pattern classification techniques for the purposes of three-dimensional scene analysis. [...] An evaluation of the entire system and the techniques used is presented in Section IV.

II. GLRLM NORMALIZATION

A. Gray Level Run Length Approach

A wide variety of features have been used for visual texture analysis. Some of these feature sets have included features based on gray level run lengths, but these features have not been used extensively. A gray level run is a set of consecutive, collinear picture points having the same gray level.
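The run-counting idea above can be sketched as follows. This is a minimal illustration, not the authors' code: the function name `glrlm_horizontal`, the toy image, and the restriction to horizontal (0-degree) runs are all choices made for the example.

```python
import numpy as np

def glrlm_horizontal(img, n_gray):
    """Gray level run length matrix for horizontal (0-degree) runs.

    P[i, j-1] counts the runs of gray level i that have length j.
    A run is a maximal set of consecutive, collinear pixels with
    the same gray level (here, consecutive along an image row).
    """
    rows, cols = img.shape
    P = np.zeros((n_gray, cols), dtype=np.int64)  # longest run <= row width
    for r in range(rows):
        run_len = 1
        for c in range(1, cols + 1):
            if c < cols and img[r, c] == img[r, c - 1]:
                run_len += 1            # run continues
            else:
                P[img[r, c - 1], run_len - 1] += 1  # run ends; record it
                run_len = 1
    return P

img = np.array([[0, 0, 1],
                [2, 2, 2],
                [1, 0, 0]])
P = glrlm_horizontal(img, n_gray=3)
# row 0 yields runs (0, len 2) and (1, len 1); row 1 yields (2, len 3);
# row 2 yields (1, len 1) and (0, len 2) -- five runs in total.
```

The same loop applied along columns or diagonals gives the matrices for the other three principal directions.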
B. Preprocessing

Fig. 1 illustrates the procedures we used for extracting the features. First, video images are digitized from a camera, and those digitized images are preprocessed. The computation of GLRLMs from the images follows.

In our experiment, the only preprocessing employed was the nonhomogeneous illumination adjustment, or background subtraction. For various reasons, most natural texture images were not under a homogeneous or uniform illumination condition when the image was taken. Since our GLRLMs are based on the pixels' gray levels, it was necessary to make such adjustments to our images before computing our GLRLMs. Details of the preprocessing procedure can be found in [5].

Fig. 1. Flow chart of the feature extraction.

C. Constructing GLRLMs

After adjusting the nonhomogeneous illumination, the GLRLMs were computed. Computation of these matrices is relatively simple. First, we reduced the image size (e.g., 132 x 132 in our experiment) so that all texture features were preserved and the computation was simplified, because the number of calculations is in direct proportion to the number of points in the image.

D. Computing ARL and Features from GLRLMs

In order to make the extracted textural features independent of the surface range and orientation, the GLRLMs need to be normalized. To do so, we must first compute the average gray level run length (ARL) of the GLRLMs. As we shall see later, the ARL was one of the most important properties of the GLRLMs used. Let P(i,j) be the (i,j)th entry of a given GLRLM, Ng be the number of gray levels in the image, and Nr be the length of the longest run considered. The ARL can be computed as

    ARL = [ Σ_i Σ_j j P(i,j) ] / [ Σ_i Σ_j P(i,j) ].

The normalizing factors (one for each direction considered) are stored for use in the recovery of surface range and orientation.

To obtain numerical texture measures from the GLRLMs, we compute functions analogous to those used by Haralick [6] for gray level cooccurrence matrices. Some are similar to those proposed by Galloway [7]. The following six features are widely used in our experiments; these features are all extracted from our GLRLMs.

Some of the elements of our features are defined as follows:

P(i,j): the (i,j)th entry in the given run length matrix.
Ng: the number of gray levels in the image.
Nr: the number of different run lengths that occur (so that the matrix is Ng by Nr).

The following are all the features we used, shown in order:

1) Second moment with respect to the run length (long run emphasis)

    F1 = [ Σ_i Σ_j j² P(i,j) ] / [ Σ_i Σ_j P(i,j) ]    (3)
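Under the definitions above, the ARL and feature F1 can be sketched as follows. This is illustrative only: the names `avg_run_length` and `long_run_emphasis` and the toy matrix are invented, and F1 is taken in the run-length second-moment form that its name describes.

```python
import numpy as np

def avg_run_length(P):
    """Average gray level run length (ARL) of a GLRLM.

    Column j-1 of P holds the counts for run length j, so
    ARL = sum_ij j * P(i,j) / sum_ij P(i,j).
    """
    j = np.arange(1, P.shape[1] + 1)       # run lengths 1..Nr
    return (P * j).sum() / P.sum()

def long_run_emphasis(P):
    """F1: second moment with respect to the run length."""
    j = np.arange(1, P.shape[1] + 1)
    return (P * j**2).sum() / P.sum()

# toy GLRLM: 2 gray levels, run lengths 1..3
P = np.array([[2, 1, 0],
              [0, 0, 1]])
arl = avg_run_length(P)     # (2*1 + 1*2 + 1*3) / 4 = 1.75
f1 = long_run_emphasis(P)   # (2*1 + 1*4 + 1*9) / 4 = 3.75
```

Because long runs are weighted by j², F1 grows quickly for coarse textures, which is the emphasis the feature's name promises.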
3) Second moment with respect to the gray level (bright color emphasis)

    F3 = [ Σ_i Σ_j i² P(i,j) ] / [ Σ_i Σ_j P(i,j) ]

This function multiplies each run length value by the gray level squared. This should emphasize the bright colors.

4) Gray level nonuniformity

    F4 = [ Σ_i ( Σ_j P(i,j) )² ] / [ Σ_i Σ_j P(i,j) ]

This function squares the number of run lengths for each gray level. The sum of the squares is then divided by the normalization factor of the total number of runs in the image. This measures the gray level nonuniformity of the image. When runs are equally distributed throughout the gray levels, the function takes on its lowest values. High run length values contribute most to the function.

5) Run length nonuniformity

    F5 = [ Σ_j ( Σ_i P(i,j) )² ] / [ Σ_i Σ_j P(i,j) ]    (7)

This function squares the number of runs for each length. The sum of the squares is then divided by the normalizing factor. This function measures the nonuniformity of the run lengths. If the runs are equally distributed throughout the lengths, the function will have a low value. Large run counts contribute most to the function.

6) Sum of variance

    F6 = Σ_{i=1..Ng} Σ_{j=1..Nr} ( j - u_i )² P(i,j)    (8)

where u_i is the average run length of gray level i.

This function multiplies each run length value by the square of the difference between its run length and the average run length of its gray level. This function emphasizes the run length nonuniformity on a per-gray-level basis. When all runs of the same gray level have the same length, the function takes its lowest values.

Note that when computing those features, all diagonal run lengths should be multiplied by the square root of 2, in order to account for the difference in distance per run length between the diagonal runs and the horizontal or vertical runs. Some details of our experiments and the analysis of our experimental results are presented in the next section.

III. EXPERIMENTATION AND RESULTS

In this section we describe the details of our experiment. The design of the experiments is overviewed, and the results, including the basic analysis, are presented.

A. Experimental Setup

The experiments can be divided into four parts: perception distance, rotation, tilt and slant, and image texture classification. The first three parts used the natural textures mentioned in Section II. Those textures, shown in Fig. 2, were selected from Brodatz's album [8]. The fourth part, image texture classification, uses the features derived from those natural textures.

In the perception distance part, the coarseness property is well captured by the average run length of a picture: the ARLs in the pictures decrease proportionally as the distance between the camera and the surface becomes longer. The relationship between the value of the ARLs and the perception distance between camera and surface is demonstrated in the experimental results later in this section.

In our experiment we used several pictures enlarged from an original picture in order to simulate the effect of varying distance; the original picture together with the other pictures simulated from it became a set of testing pictures. We may then compute and normalize GLRLMs from those picture sets.

Using the line between the camera and the surface as an axis, we can rotate the textural surface to a certain angle to get pictures with different orientations. In the experiment we used pictures with the same texture but with different rotations with respect to the camera line of sight, and computed their GLRLMs. We then extracted the features of each GLRLM. Comparing these features, we found that the pictures' directional property is preserved within the GLRLMs. The angle rotated between these pictures can be determined by analyzing their coordinate features.

Recovering the effect caused by surface tilt and slant is probably the hardest part of our experiment. The tilted and slanted pictures are more complicated than the pictures with different perception distances and orientations. All dimensions of the projection of a texture element scale inversely with distance. However, only the dimension in the direction of greatest distance gradient (known as the direction of tilt) is foreshortened when a surface is viewed obliquely; that dimension is compressed by an amount equal to the cosine of the angle between the surface normal and the line of sight (known as the slant angle). Since the direction of tilt is unknown, we need to normalize the tilted surface from all four directions. The features obtained from the normalized GLRLMs provide a very good basis for texture classification. As shown later, the normalization greatly improves the quality of the features compared to the features extracted before normalization.
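The exact normalization procedure is not preserved in this copy; the sketch below shows one plausible reading, in which every run length is scaled by a stored factor s so that all GLRLMs share a target ARL. All names are invented, and the nearest-integer redistribution is an assumption rather than the authors' procedure.

```python
import numpy as np

def avg_run_length(P):
    """ARL = sum_ij j * P(i,j) / sum_ij P(i,j)."""
    j = np.arange(1, P.shape[1] + 1)
    return (P * j).sum() / P.sum()

def normalize_glrlm(P, target_arl):
    """Rescale run lengths so the matrix's ARL matches target_arl.

    The factor s is the per-direction normalizing factor that
    Section II-D stores for later range/orientation recovery.
    """
    s = target_arl / avg_run_length(P)
    n_gray, n_len = P.shape
    out = np.zeros_like(P, dtype=float)
    for i in range(n_gray):
        for j in range(1, n_len + 1):
            jj = min(max(int(round(j * s)), 1), n_len)  # remap length j -> j*s
            out[i, jj - 1] += P[i, j - 1]
    return out, s

# texture seen close up: every run twice as long as when seen far away
far  = np.array([[4, 0, 0, 0]])   # four runs of length 1
near = np.array([[0, 4, 0, 0]])   # four runs of length 2
Pn, s = normalize_glrlm(near, target_arl=avg_run_length(far))
# s = 0.5: run lengths are halved, recovering the far-away statistics,
# so features computed from Pn and from `far` agree.
```

The stored factor s itself is what carries the range information: under this reading, a smaller s means the surface was closer to the camera.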
Fig. 2. Image textures used in the experiments.

Fig. 4. A typical feature value (at 0°) versus perception distance curve. (a) Before normalization. (b) After normalization.

Fig. 5. A typical feature value (at 0°) versus slant angle curve.