
Signal Processing 82 (2002) 461–472

www.elsevier.com/locate/sigpro
Color image segmentation using fuzzy C-means and
eigenspace projections

Jar-Ferr Yang, Shu-Sheng Hao, Pau-Choo Chung
Department of Electrical Engineering, National Cheng Kung University, 1, University Road, Tainan, Taiwan, Republic of China
Received 7 September 2000; received in revised form 26 June 2001; accepted 16 November 2001
Abstract
In this paper, we propose two eigen-based fuzzy C-means (FCM) clustering algorithms to accurately segment the desired images, which have the same color as the pre-selected pixels. From the selected color pixels, we first divide the color space into principal and residual eigenspaces. Combining the eigenspace transform and the FCM method, we can effectively achieve color image segmentation. The separate eigenspace FCM (SEFCM) algorithm independently applies the FCM method to the principal and residual projections to obtain two intermediate segmented images and combines them by logically selecting their common pixels. Jointly considering the principal and residual eigenspace projections, we then suggest the coupled eigen-based FCM (CEFCM) algorithm, which uses an eigen-based membership function in the clustering procedure. Simulations show that the proposed SEFCM and CEFCM algorithms can successfully segment the desired color image with substantial accuracy.
© 2002 Elsevier Science B.V. All rights reserved.
Keywords: Color image segmentation; Fuzzy C-means; Principal component transformation
1. Introduction
Wireless and Internet communications have become increasingly popular for multimedia information retrieval. However, the bandwidth of wireless and Internet networks varies with the number of users, access facilities, media types, and the amount of data. In order to efficiently use the available bandwidth, the MPEG-4 video and audio compression

This research was partially supported by the National Science Council under Contract #NSC-88-2213-E-006-104 and by the Image/Graphics Technology Research and Application Development Project of the Institute for Information Industry, sponsored by MOEA, Taiwan, Republic of China.
Corresponding author. Tel.: +886-6-2763874; fax: +886-6-2345482.
E-mail address: jfyang@ee.ncku.edu.tw (J.-F. Yang).
standards, with their flexible, interactive, and scalable advantages, have been proposed for multimedia transmission and storage [14]. Instead of frame-based coding, MPEG-4 video can be designed with video object planes, which are encoded or decoded separately according to the user's selection. In order to fully utilize the multiple-object and interactivity features of the MPEG-4 standards, the most important and difficult task is how to effectively and efficiently segment the desired images so that we can encode, edit, index, and manipulate them. Hence, a good image segmentation algorithm will greatly advance the applications of the MPEG-4 standards. Furthermore, the color information as well as the image shape in MPEG-7 are also important query parameters for image or video content retrieval [15]. If we provide more accurate features such as segment, size, shape, and number of pixels
0165-1684/02/$ - see front matter © 2002 Elsevier Science B.V. All rights reserved.
PII: S0165-1684(01)00196-7
of the desired color image, we might help the search engine to retrieve the desired picture or video sequence correctly.
Recently, many image segmentation algorithms have been proposed based on motion [2], optic flow [13], region edges [4], and color information [3,6,9,11,19–21,23,25,27,28]. Generally, the color information appearing in an image provides an important feature for humans to cluster the desired objects. Based on color information, many techniques have been proposed, including fuzzy C-means (FCM) [19,29], neural networks [9,20], region growing and merging [21,25], edge enhancement [6,23], color normalization [11], and color histograms and clustering [3,27,28].
Recently, several researchers have utilized the principal component transformation (PCT) to successfully extract the desired parameters [5,8,24,30]. Color image segmentation using the PCT approach requires a manual selection of desired color samples. Although the PCT methods exhibit good performance in color image segmentation, we still need to face the threshold and data smoothing problems. By using an iterative clustering technique, the fuzzy C-means (FCM) methods [19,29,32] can automatically cluster the color images without requiring any prior threshold setting.
The PCT-based segmentation approaches detect the pixels that have large projections along the principal eigenvector of the color components. Generally, the PCT concept gives a detected component a higher preference once it has a larger projection onto the
principal eigenvector. However, a strong projection outside the normal range does not always imply that the tested pixels have the same color as the desired one. To achieve better segmentation, we should use all the eigenspace projections. In this paper, we propose two algorithms, which combine eigenspace projections and the FCM concept for color image segmentation. In Section 2, we review the general eigenspace projections and introduce the conventional FCM algorithm for data clustering. In Section 3, we first propose the separate eigenspace FCM (SEFCM) algorithm, which applies the FCM algorithm separately to the principal and residual eigenspace projections. To achieve better performance, we then suggest the coupled eigen-based FCM (CEFCM) method by introducing an eigen-based membership function directly embedded in the FCM algorithm. In
Section 4, some simulation results are shown to verify the effectiveness of the proposed methods for color image segmentation. Finally, a brief conclusion is given in Section 5.
2. PCT and FCM
The principal component transformation (PCT), also known as the Karhunen–Loève (KL) transformation, achieves the optimal energy compaction for data representation [7]. For image segmentation, the PCT can help to identify the most likely components [5,8,24,30,32]. First, the user is required to manually select, with mouse clicks, several blocks of the desired color image. Then, the PCT-based segmentation algorithms will select the color image that possesses the same characteristics as the selected block pixels. Let the kth sample pixel of the selected blocks in a color space be represented as

$x_k = [x_{k,1}\; x_{k,2}\; x_{k,3}]^T,$   (1)

where $x_{k,1}$, $x_{k,2}$, and $x_{k,3}$ are the color components. If we choose the RGB color space, then $x_{k,1}$, $x_{k,2}$, and $x_{k,3}$ represent the red, green, and blue gray levels of the kth sample pixel, respectively. From the selected M samples, we first compute the correlation matrix $R_x$ as

$R_x = \frac{1}{M}\sum_{k=1}^{M} x_k x_k^T.$   (2)

After the eigen-decomposition, the correlation matrix stated in (2) can be expressed as

$R_x = \sum_{i=1}^{3} \lambda_i w_i w_i^T,$   (3)

where $\lambda_1 \geq \lambda_2 \geq \lambda_3$ are the eigenvalues listed in descending order and $w_i$, for i = 1, 2, and 3, are the corresponding eigenvectors. Hence, the eigenvector $w_1$ corresponding to the largest eigenvalue is called the principal eigenspace. Statistically, the projection of the desired color pixels onto $w_1$ will be the largest due to the energy compaction of the KL transform.
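The construction in (2)–(3) can be sketched in NumPy. The function name, array shapes, and toy samples below are our own illustration, not part of the paper:

```python
import numpy as np

def color_eigenspace(samples):
    """Eigenspace of the correlation matrix R_x = (1/M) sum x_k x_k^T.

    samples: (M, 3) array of color pixels picked from the desired object.
    Returns the eigenvalues in descending order and the matching
    eigenvectors as columns (w_1 is the principal eigenspace; w_2 and
    w_3 span the residual eigenspaces).
    """
    X = np.asarray(samples, dtype=float)   # M x 3 sample matrix
    R = X.T @ X / len(X)                   # correlation matrix, Eq. (2)
    lam, W = np.linalg.eigh(R)             # eigh returns ascending order
    order = np.argsort(lam)[::-1]          # re-sort descending, Eq. (3)
    return lam[order], W[:, order]

# toy usage: samples clustered along a reddish direction
rng = np.random.default_rng(0)
samples = rng.normal([200, 40, 30], [10, 5, 5], size=(500, 3))
lam, W = color_eigenspace(samples)
# lam[0] dominates, and W[:, 0] points roughly along the mean color
```

Because the mean color dominates the correlation matrix, the principal eigenvector of such a sample set aligns closely with the (normalized) mean color direction.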
Since the desired color pixels of the image are mostly concentrated along the direction of the principal eigenspace, the PCT methods test all the color pixels by projecting them onto the principal component to identify the desired color image. The larger the projection, the closer the tested color pixel is to the principal component. Thus, in the PCT approach, we should find a threshold on the projection to determine whether the tested pixel is the desired one or not. The performance of the PCT-based approach will be degraded if the threshold is not properly selected. Thus, for the PCT-based segmentation algorithms, we need to further determine a threshold through statistical analysis of the eigen-structures [5,8,24,30,32]. The KL transformation, which computes all the projections onto the eigenvectors, provides the optimal data representation in the mean-square sense [7]. Due to the orthogonality of the eigenvectors, a maximal projection of a pixel onto the principal eigenvector is physically equivalent to a minimal projection onto the least eigenvectors. Hence, the eigenvectors associated with the least eigenvalues also provide useful information for signal modeling and parameter estimation [12,17,18,22]. In this paper, we jointly consider the projections onto $w_1$, $w_2$, and $w_3$ for color image segmentation. Hereafter, we define the residual eigenspaces, which are orthogonal to the principal eigenspace $w_1$, as the spaces spanned by the eigenvectors $w_2$ and $w_3$.
The fuzzy C-means (FCM) algorithm is an iterative unsupervised clustering algorithm, which adjusts group representative centers to best partition the image into several distinct classes [1,10,31]. The clustering process is accomplished by minimizing an objective function, which is defined by some similarity measure on the data samples. The objective function can be expressed as follows:

$J_m(U, V, X) = \sum_{q=1}^{N}\sum_{j=1}^{c} u_{jq}^m \,\mathrm{dist}^2(x_q, C_j),$   (4)

where N is the number of data samples, c is the number of clusters, and m is the arbitrarily chosen FCM weighting exponent, which must be greater than one. In (4), $X = \{x_1, x_2, \ldots, x_N\}$ denotes a set of unlabeled vectors and $V = \{C_1, C_2, \ldots, C_c\}$ represents the unknown prototypes, which are known as the cluster centers. The vectors $x_q$ and $C_j$ are both in the k-dimensional real Euclidean space $\mathbb{R}^k$. Hence, the similarity measure $\mathrm{dist}(x_q, C_j)$ in (4) can be specified by either the Euclidean distance or the Mahalanobis distance. The fuzzy c-partition matrix $U = \{u_{jq}\}$ is of size $c \times N$. The fuzzy membership value $u_{jq}$, which defines the degree to which $x_q$ belongs to the jth cluster, satisfies $0 \leq u_{jq} \leq 1$ for all j and q.
Unlike traditional classification algorithms, the FCM algorithm assigns all test samples to each cluster in a fuzzy fashion. The fuzzy membership value describes how closely a sample resembles an ideal element of a population. The imprecision caused by vagueness or ambiguity is characterized by the membership value. With this concept of fuzziness, the FCM algorithm computes each class center more precisely and with higher robustness to noise. To normalize the membership function, we require $\sum_{j=1}^{c} u_{jq} = 1$ for all q and $0 < \sum_{q=1}^{N} u_{jq} < N$ for all j. With the Euclidean distance, the similarity measure $\mathrm{dist}(x_q, C_j)$ can be expressed as

$\mathrm{dist}_{jq} = \mathrm{dist}(x_q, C_j) = \left[\sum_{\ell=1}^{k} (x_{q,\ell} - C_{j,\ell})^2\right]^{1/2},$   (5)
where k is the number of feature parameters. With the Mahalanobis distance, the similarity measure can be expressed by an inner-product norm as

$\mathrm{dist}_{jq}^2 = \mathrm{dist}^2(x_q, C_j) = (x_q - C_j)^T A_j (x_q - C_j) = Q_j^T A_j Q_j,$   (6)

where $Q_j = x_q - C_j$. In (6), $A_j$ is a $k \times k$ positive definite matrix derived from the jth cluster. When $A_j = I$, the Mahalanobis distance becomes the Euclidean norm. For $m > 1$ and $x_q \neq C_j$, the minimization of the objective function defined in (4) leads to

$u_{jq} = \frac{(\mathrm{dist}_{jq})^{-2/(m-1)}}{\sum_{i=1}^{c} (\mathrm{dist}_{iq})^{-2/(m-1)}} \quad \text{for all } j, q$   (7)

and

$C_j = \frac{\sum_{q=1}^{N} u_{jq}^m x_q}{\sum_{q=1}^{N} u_{jq}^m} \quad \text{for all } j.$   (8)
By iteratively updating the fuzzy memberships with (7) and the centroids with (8), the algorithm converges to a local minimum of $J_m(U, V, X)$ [1]. The procedures of the FCM algorithm [1,10] are listed as follows:
Step 1: Initialization: Determine the number of clusters c and set the iteration loop index t = 1.
We then randomly select the cluster centers $C_j^{(0)}$, for $j = 1, 2, \ldots, c$, from the component space.
Step 2: Sampling: Select N data samples $x_q$, for $q = 1, \ldots, N$, from the image and compute the initial membership function $U^{(0)}$ by using (5) and (7).
Step 3: Calculate new fuzzy cluster centers: Compute all new cluster centers $\{C^{(t)}\}$ with $U^{(t-1)}$ as stated in (8).
Step 4: Update the membership function $U^{(t)}$: Compute the new similarity measures of $x_q$, for $q = 1, \ldots, N$, with respect to the new cluster centers $\{C^{(t)}\}$ by (5) and update $U^{(t)}$ based on the new similarity measures with (7).
Step 5: Check convergence: Compute

$\Delta = |C^{(t)} - C^{(t-1)}|.$   (9)

If $\Delta < \varepsilon$, terminate; otherwise set $t = t + 1$ and go to Step 3, where $\varepsilon$ is the preset terminating criterion.
In the above procedures, the superscript t in $U^{(t)}$ and $C^{(t)}$ denotes the iteration number. Once the change of the class centers between iterations is less than the predefined criterion $\varepsilon$, the objective function $J_m(U, V, X)$ is treated as having converged, and the final segmentation result is taken at that point.
Schmid [26] suggested an orientation-sensitive FCM algorithm (OS-FCM) by modifying the $A_j$ described in (6) as

$A_j = V_j^T \Lambda_j V_j,$   (10)

where $\Lambda_j$ denotes the diagonal matrix containing the inverses of the eigenvalues and $V_j$ represents the unitary matrix lining up the corresponding eigenvectors of the fuzzy covariance matrix $C_{x,j}$ of the jth cluster. The fuzzy covariance matrix for the jth cluster is given by

$C_{x,j} = \frac{1}{N}\sum_{q=1}^{N} u_{jq}^m \left(x_q x_q^T - C_j C_j^T\right).$   (11)

From simple matrix derivations, it is obvious that $A_j = (C_{x,j})^{-1}$.
3. Eigenspace FCM algorithms
The projection onto the principal eigenspace exhibits the likeness of the detected pixel to the selected color space, while the projections onto the residual eigenspaces reveal the difference from the desired color.
Fig. 1. The proposed eigenspace FCM algorithms.
To achieve effective color segmentation, we can combine the FCM classification with the principal and residual eigenspace projections. Similar to the PCT approaches, we first use the eigenvectors of the correlation matrix of the desired color samples to transform the original color space as

$z_q = V x_q = [z_{q,1}\; z_{q,2}\; z_{q,3}]^T,$   (12)

where $V = [w_1\; w_2\; w_3]^T$ is formed by stacking the eigenvectors. The first element $z_{q,1}$, for $q = 1, \ldots, N$, specifies the projection of the qth sample onto the principal eigenspace, while the second and third elements $z_{q,2}$ and $z_{q,3}$ represent the projections onto the residual eigenspaces. With the transformed color samples at hand, we then develop the separate eigenspace FCM (SEFCM) and the coupled eigen-based FCM (CEFCM) methods. As shown in Fig. 1, we first compute the eigenvalues and eigenvectors of the correlation matrix $R_x$ from the manually selected color samples. After the eigenspace transformation, the iterative FCM segmentation process with updated membership functions is applied to the transformed data. From the selected color samples, we only need to perform the eigenspace transformation once for the FCM processes.
If no desired colors are pre-selected, the correlation matrix has to be estimated from the membership values $u_{jq}$ as

$R_{x,j} = \left(\frac{1}{\sum_{q=1}^{N} u_{jq}^m}\right) \sum_{q=1}^{N} u_{jq}^m x_q x_q^T$   (13)
and the new eigenspaces for the jth class should be computed within the FCM processes. Detailed descriptions of the SEFCM and the CEFCM are given in the following two subsections.
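As a small sketch (our own, with assumed array shapes), the class-conditional correlation matrix in (13) is just a membership-weighted average of outer products:

```python
import numpy as np

def fuzzy_correlation(X, u, m=2.0):
    """Membership-weighted correlation matrix of Eq. (13) for one class.

    X: (N, 3) color pixels; u: (N,) memberships u_{jq} of the jth class.
    """
    w = u ** m                                # u_{jq}^m weights
    return (w[:, None] * X).T @ X / w.sum()   # (1 / sum w) * sum w x x^T

# with uniform memberships it reduces to the plain correlation of Eq. (2)
X = np.arange(12.0).reshape(4, 3)
R_uniform = fuzzy_correlation(X, np.ones(4))
R_plain = X.T @ X / 4
```

The uniform-membership check is a convenient sanity test: when every pixel belongs fully to the class, (13) collapses to (2).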
3.1. Separate eigenspace FCM (SEFCM) method
In the SEFCM, we propose two separate transforms, which partition the tested pixels into the principal and residual eigenspaces by further considering their corresponding eigenvalues. After the eigenspace transformation stated in (12), we do not directly pick $z_{q,1}$ as the principal component. To extract the principal eigenspace, we suggest the principal weighting matrix $\Phi_{P,j}$ as

$\Phi_{P,j} = \begin{bmatrix} \left(\frac{\lambda_{j,2}+\lambda_{j,3}}{2}\right)^{-1} & 0 & 0 \\ 0 & \frac{1}{\lambda_{j,1}} & 0 \\ 0 & 0 & \frac{1}{\lambda_{j,1}} \end{bmatrix}.$   (14)
Statistically, each eigenvalue represents the energy distribution of the corresponding eigenvector in the selected color samples. If $\lambda_{j,1}$ is much larger than $\lambda_{j,2}$ and $\lambda_{j,3}$, the selected pixels mostly come from the principal component only. In this case, we can use $\Phi_{P,j}$ to further weight the transformed pixel so as to intensively raise the value of $z_{q,1}$ and suppress the values of $z_{q,2}$ and $z_{q,3}$. If $\lambda_{j,1}$ is close to $\lambda_{j,2}$ and $\lambda_{j,3}$, the selected pixels do not exhibit any significant component; in this case, the principal weighting matrix will not enhance $z_{q,1}$ as radically as in the previous case. Thus, depending on the distribution of the eigenvalues, we can use (14) to extract the principal eigenspace robustly. Conversely, we can extract the residual eigenspaces by using the residual weighting matrix

$\Phi_{R,j} = \begin{bmatrix} \frac{1}{\lambda_{j,1}} & 0 & 0 \\ 0 & \left(\frac{\lambda_{j,2}+\lambda_{j,3}}{2}\right)^{-1} & 0 \\ 0 & 0 & \left(\frac{\lambda_{j,2}+\lambda_{j,3}}{2}\right)^{-1} \end{bmatrix}.$   (15)

Similarly, since the eigenvalue $\lambda_{j,1}$ is larger than $\lambda_{j,2}$ and $\lambda_{j,3}$, we can robustly select the residual components $z_{q,2}$ and $z_{q,3}$ by the weighting matrix $\Phi_{R,j}$.
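A small sketch of the weighting matrices in (14)–(15) and the weighted transforms (17)–(18); the function name and the toy eigenvalues are our own illustration:

```python
import numpy as np

def sefcm_weights(lam):
    """Principal and residual weighting matrices of Eqs. (14) and (15).

    lam: eigenvalues (lam1 >= lam2 >= lam3) of the class correlation matrix.
    """
    lam1, lam2, lam3 = lam
    r = 2.0 / (lam2 + lam3)        # inverse of the averaged residual eigenvalues
    p = 1.0 / lam1                 # inverse of the principal eigenvalue
    Phi_P = np.diag([r, p, p])     # Eq. (14): boosts z_{q,1}, damps z_{q,2}, z_{q,3}
    Phi_R = np.diag([p, r, r])     # Eq. (15): the converse weighting
    return Phi_P, Phi_R

# a dominant principal eigenvalue makes the principal weight on z_{q,1} large
lam = np.array([900.0, 25.0, 16.0])
Phi_P, Phi_R = sefcm_weights(lam)
z = np.array([30.0, 4.0, 3.0])
zP = np.sqrt(Phi_P) @ z            # Eq. (17): principal-weighted transform
zR = np.sqrt(Phi_R) @ z            # Eq. (18): residual-weighted transform
```

With these toy eigenvalues, the ratio zP[0]/zP[1] is far larger than z[0]/z[1], illustrating how (17) relatively amplifies the principal projection before the FCM runs.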
By referring to (6) and (10), the SEFCM membership function for the principal component depicted in (7)
Fig. 2. Signal flow diagram of the proposed SEFCM algorithm.
becomes

$u_{jq} = \frac{\left[(x_q - C'_j)^T A_{P,j} (x_q - C'_j)\right]^{-2/(m-1)}}{\sum_{i=1}^{c} \left[(x_q - C'_i)^T A_{P,i} (x_q - C'_i)\right]^{-2/(m-1)}} = \frac{\left[(z_q - C_j)^T \Phi_{P,j} (z_q - C_j)\right]^{-2/(m-1)}}{\sum_{i=1}^{c} \left[(z_q - C_i)^T \Phi_{P,i} (z_q - C_i)\right]^{-2/(m-1)}},$   (16)

where $A_{P,j} = V_j^T \Phi_{P,j} V_j$ and $A_{P,i} = V_i^T \Phi_{P,i} V_i$, as in (6), are the transform-and-weighting matrices of classes j and i, respectively. In (16), we have $z_q = V_i x_q$ and $C_j = V_j C'_j$, where $C'_j$ and $C_j$ denote the jth cluster centers in the original and transform domains, respectively. To achieve (16), we should first perform the PCT as $z_q = V_i x_q$. By using the principal weighting matrix, we can compute the principal-weighted transform as

$z_{P,q} = \Phi_{P,j}^{1/2} z_q, \quad q = 1, \ldots, N.$   (17)

Secondly, the traditional FCM algorithm is applied to the principal-weighted samples $\{z_{P,q}\}$ to obtain a segmented image. Similarly, we can achieve the residual-weighted transform as

$z_{R,q} = \Phi_{R,j}^{1/2} z_q, \quad q = 1, \ldots, N.$   (18)

Then, the traditional FCM algorithm is applied to the residual-weighted samples $\{z_{R,q}\}$ to acquire another segmented image. Finally, we apply a simple logical AND operation to these two extracted images to obtain the SEFCM result. As shown in Fig. 2, the detailed procedures of the SEFCM are
stated as follows:
Step 1: Manually select a few desired color object blocks from the image.
Step 2: Compute the correlation matrix according to (2) and obtain its eigenvectors.
Step 3: Transform the color image into the eigenspaces by using (12) to obtain $Z = \{z_q,\; q = 1, \ldots, N\}$.
Step 4: Compute the principal-weighted transformed samples by using (17) to achieve $Z_P = \{z_{P,q},\; q = 1, \ldots, N\}$, and perform the traditional FCM procedure to minimize $J_m(U_P, V_P, Z_P)$ to segment the desired image.
Step 5: Compute the residual-weighted transformed samples by using (18) to acquire $Z_R = \{z_{R,q},\; q = 1, \ldots, N\}$, and perform the traditional FCM procedure to minimize $J_m(U_R, V_R, Z_R)$ to segment the desired image.
Step 6: Perform a logical AND operation to extract the coexisting pixels from the segmented images obtained in Steps 4 and 5. The result after the logical AND is the final segmented image.
After the FCM classification in Steps 4 and 5, each pixel is marked with a specific class number in the segmented image. If the segmentation process starts from a manual selection of the desired pixels in the image, we can easily identify the desired image by checking whether a segmented image contains the selected pixels. If the segmentation starts with the selected color information but without the locations of the desired pixels, we can pick the segmented image whose cluster center $C'_j$ is closest to the principal component $\sqrt{\lambda_1}\, w_1$. In other words, we choose the jth class as the desired image if $\mathrm{dist}(\sqrt{\lambda_1}\, w_1, C'_j) \leq \mathrm{dist}(\sqrt{\lambda_1}\, w_1, C'_i)$ for all i. Once the pixels of the desired image are identified, we mark the selected pixels in the jth class with 1s and the remaining pixels with 0s. Finally, in Step 6, we take the logical AND of the binary images obtained in Steps 4 and 5 to acquire the final segmented result, where 1s and 0s denote the detected and unwanted pixels, respectively.
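Step 6 and the cluster-identification rule above can be sketched as follows; the labels, centers, and principal vector are made-up stand-ins for the outputs of Steps 4 and 5, not data from the paper:

```python
import numpy as np

def desired_mask(labels, centers, principal):
    """Binarize: pick the cluster whose center is closest to sqrt(lam1)*w1."""
    j = np.argmin(np.linalg.norm(centers - principal, axis=1))
    return labels == j

# hypothetical outputs of the two FCM runs on a 4-pixel image
labels_P = np.array([0, 0, 1, 1])
centers_P = np.array([[9.0, 0.5, 0.4], [2.0, 3.0, 3.0]])
labels_R = np.array([0, 1, 1, 0])
centers_R = np.array([[9.2, 0.3, 0.2], [1.0, 4.0, 2.0]])
principal = np.array([9.0, 0.0, 0.0])   # sqrt(lam1) * w1 in the transform domain

# Step 6: only pixels flagged by BOTH runs survive the logical AND
mask = (desired_mask(labels_P, centers_P, principal)
        & desired_mask(labels_R, centers_R, principal))
# mask -> [True, False, False, False]
```

Only the first pixel is assigned to the desired cluster by both the principal-weighted and the residual-weighted run, so only it survives the AND.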
3.2. Coupled eigen-based FCM (CEFCM) method
In order to precisely segment the desired images,
we further propose the coupled eigen-based FCM
Fig. 3. Signal flow diagram of the proposed CEFCM algorithm.
(CEFCM) algorithm. Considering the principal and residual subspaces together, the CEFCM directly adopts the three-dimensional eigenspaces for classification. The functional block diagram of the CEFCM is shown in Fig. 3. During the iterative FCM procedure, we also adopt the updated correlation matrix as in (13). For the jth class correlation matrix $R_{x,j}$, we can obtain the eigenvectors $w_{j,i}$ and the corresponding eigenvalues $\lambda_{j,i}$, in descending order as

$\lambda_{j,1} \geq \lambda_{j,2} \geq \lambda_{j,3}.$   (19)
Statistically, the jth cluster center can be expressed by the principal component as

$C_j = \sqrt{\lambda_{j,1}}\, w_{j,1}.$   (20)
Since the principal eigenvector is the best projection vector and its corresponding eigenvalue is the average projection length in the mean-square sense, the second and third components can be treated as the spreading terms within the class. To design a new similarity measure to the jth cluster center, we can use all three eigenspace projections together. For any sample $x_q$ whose three eigenspace projections satisfy $\|x_q^T w_{j,1}\| \approx \sqrt{\lambda_{j,1}}$, $\|x_q^T w_{j,2}\| \approx 0$, and $\|x_q^T w_{j,3}\| \approx 0$, we can say that this sample is very close to the jth cluster center, i.e., $\sqrt{\lambda_{j,1}}\, w_{j,1}$. Considering the normalization, we can design an eigen-based distance measure as
$\mathrm{dist}_{jq} = \mathrm{dist}_3(x_q, C_j) = \frac{1}{\lambda_{j,2}}\|x_q^T w_{j,2}\|^2 + \frac{1}{\lambda_{j,3}}\|x_q^T w_{j,3}\|^2 + \frac{1}{\lambda_{j,1}}\left(\|x_q^T w_{j,1}\|^2 - \lambda_{j,1}\right) = \frac{1}{\lambda_{j,2}}\|z_{q,j,2}\|^2 + \frac{1}{\lambda_{j,3}}\|z_{q,j,3}\|^2 + \frac{1}{\lambda_{j,1}}\left(\|z_{q,j,1}\|^2 - \lambda_{j,1}\right),$   (21)
where $z_{q,j,i} = x_q^T w_{j,i}$ denotes the projection of $x_q$ onto the ith eigenvector of the jth class. With the above eigen-based distance measure, we can use (7) to compute the eigen-based membership function easily. The detailed procedures for the CEFCM are listed as follows:
Step 1: Manually sample a few desired color samples from the image.
Step 2: Compute the correlation matrix and obtain its eigenvectors and eigenvalues.
Step 3: Initialize the class center $C_j = \sqrt{\lambda_{j,1}}\, w_{j,1}$ for the selected color samples, and initialize the membership function by using the eigen-based distance measure stated in (21). (Repeat Steps 1–3 if more than one color is desired.)
Step 4: Randomly select the remaining class centers $C_i$, which should make $\frac{1}{\lambda_{j,1}}\left(\|C_i^T w_{j,1}\|^2 - \lambda_{j,1}\right)$ large, so that the dummy color centers stay away from the class center(s) initialized in Step 3. Initialize their membership function by using the Euclidean distance measure stated in (5).
Step 5: Update the new correlation matrix (13) and compute the eigen-based membership function until the FCM procedure converges.
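The eigen-based measure in (21) can be sketched directly; the toy eigenvectors and eigenvalues below are our own illustration, not values from the paper:

```python
import numpy as np

def eigen_distance(x, W, lam):
    """Eigen-based distance of Eq. (21) to the cluster modeled by (W, lam).

    W: columns w_{j,1}, w_{j,2}, w_{j,3}; lam: the matching eigenvalues.
    A sample projecting ~sqrt(lam1) onto w_1 and ~0 onto w_2, w_3 scores ~0.
    """
    z = W.T @ x                            # z_{q,j,i} = x_q^T w_{j,i}
    return (z[1] ** 2 / lam[1]             # residual term on w_2
            + z[2] ** 2 / lam[2]           # residual term on w_3
            + (z[0] ** 2 - lam[0]) / lam[0])   # principal deviation term

W = np.eye(3)                              # toy eigenvectors: coordinate axes
lam = np.array([100.0, 4.0, 1.0])
near = eigen_distance(np.array([10.0, 0.0, 0.0]), W, lam)  # on the center
far = eigen_distance(np.array([10.0, 4.0, 2.0]), W, lam)   # residual energy
```

On the cluster center the measure is 0, and residual-space energy raises it in units of the residual eigenvalues, which is exactly the normalization (21) is designed to provide.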
4. Simulation results and discussion
In order to verify the effectiveness of the proposed algorithms, we adopt four test images, Mosaic, Ball, Akiyo, and News, for the simulations. As shown in Fig. 4, we put a triangular mark on each image to indicate the desired color object we want to extract. In particular, the Mosaic test image with isolated color tiles can be treated as a synthetic image, which helps us to verify the accuracy of the segmentation. It is noted that the desired color tile in the Mosaic test image contains a dark spot and a light scratched line near the upper-right corner. Because we do not introduce any spatial or temporal information in our algorithms, all the other objects with the same color will appear in the segmented results. According to the FCM property, the other color objects will be automatically classified into different clusters as well. To clearly exhibit the desired color in the segmented images, we show the desired-color and undesired-color objects as bright pixels with gray level 255 and dark pixels with gray level 0, respectively.
First, we apply the PCT to transform each image from the original color space into the eigenspace of the selected pixels. By using the threshold determination method suggested in [16], we obtain the four segmented images shown in Fig. 5. Although the main parts of the desired object are extracted, some missing or extra pixels unexpectedly appear due to threshold sensitivity.
After we apply the conventional FCM to these four images in the RGB color space, the segmented results are as shown in Fig. 6. Without any threshold determination, the desired objects obtained by the traditional FCM are better than those obtained by the PCT method. However, the segmented results still contain many undesired pixels. For the Mosaic image, the FCM method gives a correct detection of the desired color. However, the FCM also incorrectly detects two extra tiles, at the upper-right and lower-left corners.
Fig. 7 shows the simulation results of applying the conventional FCM in the transformed eigenspace. Compared to Fig. 6, some improvements are achieved because the projected data have been re-distributed after the PCT projections. For the Mosaic image, the lower-left tile has been correctly removed but the upper-right tile still remains.
For comparison, we also apply the OS-FCM algorithm [26] to extract the desired images. Fig. 8 shows the segmented images obtained with the OS-FCM method. Although all the main objects can be detected, many pixels of similar color are also extracted. For the Mosaic image, the lower-left tile has been correctly removed but the upper-right tile still remains.
By combining eigenspace projections and the FCM algorithm, the SEFCM obtains the segmented results shown in Fig. 9. We find that the undesired pixels have been mostly removed. The SEFCM can correctly segment the desired color tile from
Fig. 4. Test images: (a) Mosaic; (b) Ball; (c) Akiyo; (d) News.
Fig. 5. Segmented images obtained by the PCT method using threshold and logical operation: (a) Mosaic; (b) Ball; (c) Akiyo;
(d) News images.
Fig. 6. Segmented images obtained by the conventional FCM on RGB planes: (a) Mosaic; (b) Ball; (c) Akiyo; (d) News images.
Fig. 7. Segmented images obtained by the conventional FCM on eigenspaces: (a) Mosaic; (b) Ball; (c) Akiyo; (d) News images.
Fig. 8. Segmented images obtained by the OS-FCM method on eigenspaces: (a) Mosaic; (b) Ball; (c) Akiyo; (d) News images.
Fig. 9. Segmented images obtained by the SEFCM: (a) Mosaic; (b) Ball; (c) Akiyo; (d) News images.
Fig. 10. Segmented images obtained by the CEFCM: (a) Mosaic; (b) Ball; (c) Akiyo; (d) News images.
the Mosaic test image. However, the SEFCM still cannot correctly segment the skin color from the Akiyo image.
In order to improve the performance, the CEFCM algorithm jointly adopts the principal and residual subspace projections. Fig. 10 shows the segmented images obtained with the CEFCM method. It is obvious that the CEFCM method outperforms the other color image segmentation algorithms; it achieves more accurate segmentation for all test images.
5. Conclusions
In this paper, we proposed two color segmentation algorithms that combine the PCT and FCM methods to extract the desired color images. By considering both the principal and residual projections, the SEFCM and CEFCM methods show better performance for color object segmentation than the PCT-based methods or the FCM approaches. The proposed SEFCM and CEFCM methods are more robust and less susceptible to noise, but require more computation than the existing algorithms. The SEFCM method, which requires less computation than the CEFCM, separately applies the FCM method to the principal-weighted and residual-weighted eigenspaces and extracts the final desired color image by a logical AND of the common pixels. The CEFCM approach, with the suggested eigen-based membership function, achieves more precise color image extraction than the SEFCM. The proposed methods only consider color information. By further considering spatial and temporal information, we believe that our proposed algorithms can be improved for color image segmentation.
References
[1] J.C. Bezdek, Pattern Recognition with Fuzzy Objective Function Algorithms, Plenum, New York, 1981.
[2] M.M. Chang, A.M. Tekalp, M.I. Sezan, Simultaneous motion estimation and segmentation, IEEE Trans. Image Process. 6 (9) (September 1997) 1326–1333.
[3] D. Chai, K.N. Ngan, Face segmentation using skin-color map in videophone applications, IEEE Trans. Circuits Systems Video Technol. 9 (4) (June 1999) 551–564.
[4] C.-C. Chu, J.K. Aggarwal, The integration of image segmentation maps using region and edge information, IEEE Trans. Pattern Anal. Mach. Intell. 15 (12) (December 1993) 1241–1252.
[5] R.D. Dony, S. Haykin, Image segmentation using a mixture of principal components representation, IEE Proc. Vision, Image Signal Process. 144 (April 1997) 73–80.
[6] C. Garcia, C. Georgios, Face detection using quantized skin color regions merging and wavelet packet analysis, IEEE Trans. Multimedia 1 (3) (September 1999) 264–277.
[7] R.C. Gonzalez, P.A. Wintz, Digital Image Processing, Addison-Wesley, Reading, MA, 1992, p. 153.
[8] A. Guzman de Leon, J.F. Lerallut, J.C. Boulanger, Application of the principal components transform to colposcopic color images, in: Proceedings of the IEEE 17th Annual Conference on Engineering in Medicine and Biology Society, Vol. 1, 20–23 September 1995, pp. 511–512.
[9] G.A. Hance, S.E. Umbaugh, R.H. Moss, W.V. Stoecker, Unsupervised color image segmentation: with application to skin tumor borders, IEEE Eng. Med. Biol. Mag. 15 (1) (January–February 1996) 104–111.
[10] S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd Edition, Prentice-Hall, Englewood Cliffs, NJ, 1999.
[11] G. Healey, Segmenting images using normalized color, IEEE Trans. Systems, Man, Cybern. 22 (1) (1992) 64–73.
[12] B. Hu, R.G. Gosine, A new eigenstructure method for sinusoidal signal retrieval in white noise: estimation and pattern recognition, IEEE Trans. Signal Process. 45 (12) (December 1997) 3073–3083.
[13] Y. Huang, K. Palaniappan, X. Zhuang, J.E. Cavanaugh, Optic flow field segmentation and motion estimation using a robust genetic partitioning algorithm, IEEE Trans. Pattern Anal. Mach. Intell. 17 (12) (December 1995) 1177–1190.
[14] ISO/IEC JTC1/SC29/WG11, Information Technology: Coding of Audio-Visual Objects, Part 2: MPEG-4 Visual, 14496-1, October 1998.
[15] ISO/IEC JTC1/SC29/WG11, Multimedia Content Description Interface, Part 3: MPEG-7 Visual, 15938-1, March 2001.
[16] M. Kaveh, A.J. Barabell, The statistical performance of the MUSIC and the minimum-norm algorithms in resolving plane waves in noise, IEEE Trans. Acoustics, Speech, Signal Process. ASSP-34 (2) (April 1986) 331–341.
[17] I.B. Kerfoot, Y. Bresler, Theoretical analysis of multispectral image segmentation criteria, IEEE Trans. Image Process. 8 (6) (December 1997) 798–820.
[18] J.H. Lee, B.H. Chang, S.D. Kim, Comparison of colour transformation for image segmentation, Electron. Lett. 30 (20) (September 1994) 1660–1661.
[19] Y.W. Lim, S.U. Lee, On the color image segmentation algorithm based on the thresholding and fuzzy C-means techniques, Pattern Recognition 23 (9) (1990) 935–952.
[20] E. Littmann, H. Ritter, Adaptive color segmentation: a comparison of neural and statistical methods, IEEE Trans. Neural Networks 8 (1) (January 1997) 175–185.
[21] A. Moghaddamzadeh, N. Bourbakis, A fuzzy region growing approach for segmentation of color images, Pattern Recognition 30 (6) (1997) 867–881.
[22] H. Murase, S.K. Nayar, Illumination planning for object recognition using parametric eigenspaces, IEEE Trans. Pattern Anal. Mach. Intell. 16 (12) (December 1994) 1219–1227.
[23] E. Saber, A.M. Tekalp, G. Bozdagi, Fusion of color and edge information for improved segmentation, Image Vision Comput. 15 (1997) 680–769.
[24] E. Sahouria, A. Zakhor, Content analysis of video using principal components, IEEE Trans. Circuits Systems Video Technol. 9 (8) (December 1999) 1290–1298.
[25] R. Schettini, A segmentation algorithm for color images, Pattern Recognition Lett. 14 (1993) 499–506.
[26] P. Schmid, Segmentation of digitized dermatoscopic images by two-dimensional color clustering, IEEE Trans. Med. Imaging 18 (2) (February 1999) 164–171.
[27] L. Shafarenko, H. Petrou, J. Kittler, Histogram-based segmentation in a perceptually uniform color space, IEEE Trans. Image Process. 7 (9) (September 1998) 1354–1358.
[28] X. Wan, C.J. Kuo, A new approach to image retrieval with hierarchical color clustering, IEEE Trans. Circuits Systems Video Technol. 8 (5) (September 1998) 628–643.
[29] H. Wu, Q. Chen, M. Yachida, Face detection from color images using a fuzzy pattern matching method, IEEE Trans. Pattern Anal. Mach. Intell. 21 (6) (June 1999) 557–563.
[30] J. Xiuping, J.A. Richards, Segmented principal components transformation for efficient hyperspectral remote-sensing image display and classification, IEEE Trans. Geosci. Remote Sensing, Part 2, 37 (1) (January 1999) 538–542.
[31] X.L. Xie, G. Beni, A validity measure for fuzzy clustering, IEEE Trans. Pattern Anal. Mach. Intell. 13 (8) (August 1991) 841–847.
[32] J.F. Yang, S.S. Hao, P.C. Chung, C.L. Huang, Color object segmentation with eigenspace fuzzy C-means, in: Proceedings of the IEEE International Symposium on Circuits and Systems, Geneva, Switzerland, May 2000, pp. V.25–V.28.
