I. INTRODUCTION
Gait, referring to the pattern of walking or locomotion,
has been used as an efficient biometric feature in human
identification [1]. Recently, with the growing demand for
recognition and classification in side-view and long-range
surveillance video, gait has become a hot research topic.
Gait information has advantages over the face in these
cases. X. Li et al. [2] summarized the limitations of other
biometric features, such as face, fingerprint, iris, and
handwriting: the distance between the camera or scanner
and the subject, the need for the subject's cooperation, and
the subject's attention all impede the acquisition of these
traditional features for identification or classification.
Gait is considered to carry gender information, as the
face does. X. Li et al. [2] analyzed the effectiveness of
seven human gait components for gender recognition,
showing that all body segments (head, arm, trunk, thigh,
front leg, back leg, and feet) contribute to gender
classification. In [3], [4], L. Lee et al. proposed a gait
representation that fits an ellipse to each of seven parts of
the silhouette, the features being the parameters of these
seven ellipses. In [5], J. H. Yoo et al. used a sequential set
of 2D stick figures to represent the gait signature and then
employed an SVM classifier to classify gender on a
considerably large database; however, the number of males
was much larger than the number of females, as is the case
in many large gait databases. We have collected a gait
database with similar numbers of men and women, named
the IRIP Gait Database [6], and used it for research on
recognition and classification.
S. Lee et al. [7] define one gait cycle as the period
starting from a double-support stance frame with the left
foot forward to the next such frame. As most feature
extraction methods used in gait recognition and gender classification
Figure 1.
Figure 2. Silhouette frames and corresponding numbers of pixels (6th
person in the IRIP Gait Database) and LLE coefficients: (a) the 1st, 9th,
17th, 25th, 33rd, 41st, 49th, 57th, and 65th silhouette frames; (b) number
of pixels in the lower 53% portion, plotted against frame index; (c) the
one-dimensional LLE coefficients, plotted against frame index.
ε(W) = Σ_{i=1}^{M} | X_i − Σ_{j=1}^{K} W_{ij} X_j |²   (2)

Φ(Y) = Σ_{i=1}^{M} | Y_i − Σ_{j=1}^{K} W_{ij} Y_j |²   (3)
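Equations (2) and (3) are the reconstruction costs of locally linear embedding (LLE): weights W are fit so that each input X_i is reconstructed from its K nearest neighbours, and the low-dimensional outputs Y_i then minimise the same cost. A minimal sketch using scikit-learn's LLE implementation (not the paper's own code); the frame count and image size below are stand-in assumptions:

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

# Hypothetical input: 80 silhouette frames, each flattened to a 64 x 44 image.
rng = np.random.default_rng(0)
frames = rng.random((80, 64 * 44))

# Reduce each frame to a single LLE coefficient with K = 7 neighbours,
# mirroring equations (2)-(3): fit weights W on the inputs X, then find
# outputs Y that minimise the same reconstruction cost.
lle = LocallyLinearEmbedding(n_neighbors=7, n_components=1)
coeffs = lle.fit_transform(frames)

print(coeffs.shape)  # (80, 1)
```

The resulting one-dimensional coefficient sequence is what Figure 2(c) plots against the frame index.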
Figure 3 shows the auto-correlation sequence of the LLE
coefficients Y_i from Figure 2(c), from which the period
can be read off clearly from the local maxima. Because the
half cycle with the left leg forward and the half cycle with
the right leg forward are almost identical in the database,
the span between two local maxima is the number of
frames in half a period. The second local maximum to the
right of the zero-lag peak therefore gives the number of
frames in one period.
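The period detection just described can be sketched as follows: compute the autocorrelation of the one-dimensional coefficient sequence, find its local maxima at positive lags, and take the second one as the frames-per-period count. The signal below is a synthetic stand-in with a half-period of 14 frames:

```python
import numpy as np

def estimate_period(signal):
    """Estimate the gait period (in frames) from a 1-D coefficient
    sequence: the second local maximum of the autocorrelation to the
    right of the zero-lag peak marks one full period."""
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    ac = ac / ac[0]
    # local maxima at positive lags
    peaks = [i for i in range(1, len(ac) - 1)
             if ac[i - 1] < ac[i] >= ac[i + 1]]
    if len(peaks) > 1:
        return peaks[1]
    return peaks[0] if peaks else None

# Synthetic example: half-period of 14 frames -> full period of 28 frames.
t = np.arange(84)
sig = np.sin(2 * np.pi * t / 14)
print(estimate_period(sig))  # 28
```

Because the left-forward and right-forward half cycles are nearly identical, the coefficient sequence repeats every half period, so consecutive autocorrelation peaks are half a period apart and the second one gives the full period.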
To test this algorithm, we apply it to the IRIP Gait
Database, in which the number of frames in one period
ranges from 18 to 34. We test it on the 0-degree, 30-degree,
60-degree, 90-degree-1, 90-degree-2, 120-degree, 150-degree,
and 180-degree gait silhouettes; the experimental
results are shown in Section V.
The proposed gait representation method does not need
to assign the full-stride stance or the heels-together stance
as the starting point, whereas some other gait representations
require exactly the same starting and ending stance, maxima or
Figure 4. Gait silhouette images and GPCI: first row, the 1st, 5th, 9th,
and 13th frames; second row, the 17th, 21st, and 25th frames and the
GPCI of the 28 frames in one period.
(4)

(5)

(6)

Ŷ_i = (Y_i − min(Y_i)) / (max(Y_i) − min(Y_i))   (7)
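Equation (7) is a min-max rescaling of the LLE coefficient sequence Y_i into the range [0, 1]. A minimal sketch:

```python
import numpy as np

def minmax_normalize(y):
    """Rescale a coefficient sequence to [0, 1], as in equation (7)."""
    y = np.asarray(y, dtype=float)
    return (y - y.min()) / (y.max() - y.min())

print(minmax_normalize([2.0, 4.0, 6.0]))  # [0.  0.5 1. ]
```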
Figure 6.

Table I
SUMMARY OF THE PROPOSED LLE-BASED PERIOD EXTRACTION ALGORITHM

Camera No.         C1     C2      C3      C4     C5     C6     C7     C8
Correct Rate       82%    99.33%  99.67%  100%   100%   100%   100%   91%
Incorrect Number   54     2       1       0      0      0      0      27

Table II
PERFORMANCE OF THE PROPOSED GENDER CLASSIFICATION ALGORITHM

          K=7      K=9      K=11
GEI       92.00%   90.00%   90.67%
GPCI      92.33%   92.33%   91.00%

Table III
TEST RESULTS BY DIFFERENT GENDER CLASSIFICATION ALGORITHMS ON IRIP GAIT DATABASE

Period det.    LLE*      LLE*     Pixels num    Pixels num
Feature        GPCI*     GEI      Ellipse Fit   H&V Proj.
Classifier     7NN       7NN      PCA+SVM       PCA+SVM
Correct rate   92.33%    92.00%   90.00% [6]    90.3% [19]
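The "7NN" entries in Table III can be read as a 7-nearest-neighbour majority vote over GPCI feature vectors. A sketch under that reading, with random stand-in data (the real features come from the IRIP Gait Database, and the feature size here is assumed):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Stand-in data: flattened GPCI images with gender labels
# (0 = female, 1 = male). Values are random placeholders.
rng = np.random.default_rng(0)
X_train = rng.random((50, 64 * 44))
y_train = rng.integers(0, 2, size=50)
X_test = rng.random((10, 64 * 44))

# 7-nearest-neighbour majority vote, the assumed meaning of "7NN".
clf = KNeighborsClassifier(n_neighbors=7).fit(X_train, y_train)
pred = clf.predict(X_test)
print(pred.shape)  # (10,)
```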
In this paper, we propose a gait period extraction
approach and a new representation for gender classification.
LLE and PCA were employed as the main methods in the
process. We used the Gait Principal Component Image
(GPCI) as the gait appearance feature for gender
classification. The experimental results show that GPCI is
capable of capturing spatio-temporal information and that
the proposed method achieves high classification accuracy.
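The GPCI construction itself is defined in the paper's earlier sections, which are not part of this excerpt. Assuming it is derived from the principal component of the vectorized silhouette frames of one gait period (consistent with "Principal Component Image" and the use of PCA), a sketch might look like:

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in period: 28 binary silhouette frames of size 64 x 44
# (sizes and the construction itself are assumptions, not the
# paper's exact definition).
rng = np.random.default_rng(0)
period = (rng.random((28, 64, 44)) > 0.5).astype(float)

# Vectorize the frames, take the first principal component across
# the period, and reshape it back into an image.
pca = PCA(n_components=1)
pca.fit(period.reshape(28, -1))
gpci = pca.components_[0].reshape(64, 44)

print(gpci.shape)  # (64, 44)
```

A single image summarising how pixel values co-vary over the period is one way such a feature could capture spatio-temporal information.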
In future work, we will design other gait representations
for gender classification, and other dimensionality reduction
techniques and classifiers will be employed to improve
classification accuracy.
VII. ACKNOWLEDGEMENT
This work was supported by the opening funding of the
State Key Laboratory of Virtual Reality Technology and
Systems (Beihang University).
REFERENCES
[1] A. Kale, A. Sundaresan, A. N. Rajagopalan, N. P. Cuntoor,
A. K. Roy-Chowdhury, V. Kruger, and R. Chellappa,
"Identification of humans using gait," IEEE Transactions
on Image Processing, vol. 13, no. 9, September 2004.
[2] X. Li, S. J. Maybank, S. Yan, D. Tao, and D. Xu, "Gait
components and their application to gender recognition,"
IEEE Transactions on Systems, Man, and Cybernetics, Part
C, vol. 38, no. 2, March 2008.
[3] L. Lee and W. E. L. Grimson, "Gait analysis for recognition
and classification," in Proceedings of the Fifth IEEE
International Conference on Automatic Face and Gesture
Recognition, 2002.
[4] L. Lee, "Gait dynamics for recognition and classification,"
Technical Report AIM-2001-019, September 2001.
[5] J. H. Yoo, D. Hwang, and M. S. Nixon, "Gender classification
in human gait with support vector machine," Advanced
Concepts for Intelligent Vision Systems, vol. 3708, pp. 138-145,
2005.
[6] D. Zhang and Y. Wang, "Gender recognition based on
fusion of face and gait information," in IEEE Proceedings
of the International Conference on Machine Learning and
Cybernetics, 2008.
[7] S. Lee, Y. Liu, and R. Collins, "Shape variation-based frieze
pattern for robust gait recognition," in IEEE Proceedings
of CVPR, 2007.