Abstract
Moving object tracking in video scenes has attracted many researchers in
computer vision. Object tracking plays a significant role in several important applications
such as security and surveillance. This paper proposes an approach to object
tracking in video scenes. The proposed method handles partial as well as full occlusion of
the object. First, the user specifies a rectangle around the object's boundary in the reference
frame. Then a feature vector (FV) is constructed for each pixel in the rectangle using
the coefficients of the complex wavelet transform. The search window is updated at each
frame based on inter-frame texture analysis.
reduced by using complex wavelet transforms. There are different types of complex
wavelet transforms that are useful for tracking non-rigid objects in video data,
such as the Dual Tree Complex Wavelet Transform [6], the Projection based Complex Wavelet
Transform [7] and the Steerable Pyramid Complex Wavelet Transform [8]. The latter
transforms suffer from high computational cost; we have therefore chosen the DTCxWT
and used it for tracking objects in video sequences.
The DTCxWT possesses shift-invariance [9] and rotation-invariance
[10] properties, which are useful for segmenting and tracking objects in different
scenes. Most researchers have used the DTCxWT for tracking rigid objects, although in
some cases the Daubechies Complex Wavelet Transform has also been used [11]. For
many applications it is important that the transform be invertible. A few authors, Lawton
(1993) and Belzier, Lina & Villasenor (1995), experimented with
standard polynomials and obtained perfect reconstruction (PR) complex filters, which
still suffer from poor frequency selectivity. In 1998, N. G. Kingsbury introduced a
complex wavelet transform called the DTCxWT, which gives perfect reconstruction along with
other properties like shift-invariance and directional selectivity.
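As an illustration of why complex wavelet magnitudes are far less shift-sensitive than real wavelet coefficients, the following sketch filters a signal with a complex (approximately analytic) band-pass filter and compares the subsampled coefficients before and after a one-sample shift of the input. The Gabor-style filter and all parameters are illustrative assumptions, not the filters used by the DTCxWT:

```python
import numpy as np

def circ_conv(x, h):
    """Circular convolution via the FFT (exact shift equivariance,
    no boundary effects)."""
    return np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, len(x)))

def rel_change(a, b):
    """Relative L2 change between two coefficient sets."""
    return np.linalg.norm(a - b) / np.linalg.norm(a)

# Complex (approximately analytic) band-pass filter: a Gaussian window
# modulated by a complex exponential.  This Gabor-style filter is an
# illustrative assumption, not a DTCxWT filter.
t = np.arange(64) - 32
h = np.exp(-0.5 * (t / 6.0) ** 2) * np.exp(2j * np.pi * 0.2 * t)

rng = np.random.default_rng(0)
x = rng.standard_normal(256)
x_shift = np.roll(x, 1)                 # the same signal, shifted one sample

# Subband coefficients after filtering and 2:1 subsampling.
real_coeffs  = np.real(circ_conv(x, h))[::2]        # real-wavelet analogue
real_shifted = np.real(circ_conv(x_shift, h))[::2]
cplx_coeffs  = np.abs(circ_conv(x, h))[::2]         # complex magnitude
cplx_shifted = np.abs(circ_conv(x_shift, h))[::2]

# The real coefficients change drastically under the shift, while the
# complex magnitudes are nearly unchanged.
print(rel_change(real_coeffs, real_shifted))
print(rel_change(cplx_coeffs, cplx_shifted))
```

After subsampling, the real-part coefficients redistribute their energy under a one-sample shift, while the magnitude envelope merely translates, which is the sense in which the complex transform is approximately shift-invariant.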
The rest of the paper is divided into three sections: Section 2 gives an
introduction to complex wavelets, Section 3 describes the proposed method and its
implementation, and Section 4 describes the conclusions and
future scope.
T(a, b) = (1/√a) ∫ x(t) ψ*((t − b)/a) dt

where ψ*(t) is the complex conjugate of the mother wavelet function ψ(t), a is the
dilation parameter and b is the location parameter of the wavelet. For complex wavelets
the Fourier transform must be real and vanish for negative frequencies. Kingsbury
described an approximately shift-invariant wavelet transform with smaller redundancy;
subsequently, a more accurate shift-invariant transform with improved
orthogonality and symmetry properties was proposed. An example of
this decomposition in the 1-D case is shown in Figure 1. After the subsampling operation, the
filters of the first decomposition level correspond to biorthogonal filter banks that
retain the even components for tree a and the odd components for tree b. All filters from
the second level onwards are of even length. The two trees correspond to the real and
imaginary components of the complex wavelet transform.
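The continuous transform T(a, b) defined above can be evaluated numerically. The sketch below uses a complex Morlet-style mother wavelet as an illustrative assumption (any admissible complex wavelet could be substituted) and approximates the integral by a Riemann sum:

```python
import numpy as np

def morlet(t, f0=1.0):
    """Complex Morlet-style mother wavelet psi(t) (an illustrative
    choice; any admissible complex wavelet could be used)."""
    return np.exp(2j * np.pi * f0 * t) * np.exp(-0.5 * t ** 2)

def cwt_point(x, t, a, b):
    """T(a, b) = (1/sqrt(a)) * integral of x(t) psi*((t - b)/a) dt,
    approximated by a Riemann sum on the sample grid t."""
    dt = t[1] - t[0]
    psi = morlet((t - b) / a)
    return np.sum(x * np.conj(psi)) * dt / np.sqrt(a)

t = np.linspace(-10.0, 10.0, 4001)
x = np.cos(2 * np.pi * t)               # a 1 Hz test signal

# |T(a, b)| is large when the dilated wavelet matches the signal's
# scale (a = 1 tunes the wavelet to 1 Hz) and small otherwise.
resp_match = abs(cwt_point(x, t, a=1.0, b=0.0))
resp_off   = abs(cwt_point(x, t, a=4.0, b=0.0))
print(resp_match > resp_off)            # True
```

The dilation parameter a thus selects the analysed scale, and b slides the wavelet along the signal, exactly as in the definition of T(a, b).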
Figure 1: The dual-tree wavelet transform is implemented using two filter banks on the same
data. The upper tree a represents the real part and tree b the imaginary part of the
complex wavelet transform.
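The dual-tree structure of Figure 1 can be sketched in code: both trees filter the same input, tree a retains the even-indexed samples and tree b the odd-indexed samples, and the two subsampled outputs are combined as the real and imaginary parts of one complex subband. The filters below are toy placeholders, not Kingsbury's actual biorthogonal or q-shift designs:

```python
import numpy as np

def dual_tree_level1(x, h_a, h_b):
    """One decomposition level of the dual-tree structure: both trees
    filter the SAME input; tree a retains the even-indexed samples and
    tree b the odd-indexed samples, and the two subsampled outputs act
    as the real and imaginary parts of one complex subband."""
    y_a = np.convolve(x, h_a, mode='full')[0::2]   # tree a -> real part
    y_b = np.convolve(x, h_b, mode='full')[1::2]   # tree b -> imaginary part
    n = min(len(y_a), len(y_b))
    return y_a[:n] + 1j * y_b[:n]

# Toy high-pass filter pair (placeholders only; the actual transform
# uses carefully designed biorthogonal / q-shift filter pairs with a
# half-sample delay between the trees).
h_a = np.array([1.0, -1.0]) / np.sqrt(2.0)
h_b = np.array([1.0, -1.0]) / np.sqrt(2.0)

x = np.sin(2 * np.pi * 0.3 * np.arange(32))
w = dual_tree_level1(x, h_a, h_b)
print(w.dtype, w.shape)                 # complex subband at half rate
```

The even/odd split is what gives the two trees an effective half-sample offset, which the real filter designs then maintain at deeper levels.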
3. THE PROPOSED TRACKING METHOD:
3.1 OVERVIEW: The general overview of the proposed method is shown in the
following block diagram:
where
w_i^H, w_i^V, w_i^D and w_i^A represent the complex wavelet coefficients of the horizontal, vertical, diagonal and approximation subbands at the i-th level.
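To make the roles of the four subbands concrete, the following sketch computes one level of a separable 2-D decomposition using plain real Haar filters (an illustrative stand-in for the complex transform), labelling the outputs to match the w_i^H, w_i^V, w_i^D and w_i^A notation:

```python
import numpy as np

def haar_subbands(img):
    """One level of a separable 2-D decomposition with plain real Haar
    filters (normalization simplified), returning the approximation A
    and the horizontal (H), vertical (V) and diagonal (D) detail
    subbands."""
    lo = img[0::2, :] + img[1::2, :]        # low-pass along rows
    hi = img[0::2, :] - img[1::2, :]        # high-pass along rows
    A = (lo[:, 0::2] + lo[:, 1::2]) / 2.0   # low/low   -> approximation
    V = (lo[:, 0::2] - lo[:, 1::2]) / 2.0   # low/high  -> vertical edges
    H = (hi[:, 0::2] + hi[:, 1::2]) / 2.0   # high/low  -> horizontal edges
    D = (hi[:, 0::2] - hi[:, 1::2]) / 2.0   # high/high -> diagonal detail
    return A, H, V, D

img = np.zeros((8, 8))
img[3:, :] = 1.0                            # a horizontal step edge
A, H, V, D = haar_subbands(img)
# The edge energy lands in the horizontal-detail subband only.
print(np.abs(H).sum(), np.abs(V).sum(), np.abs(D).sum())   # 4.0 0.0 0.0
```

A horizontal edge therefore shows up in the H subband and vanishes in V and D, which is what lets per-subband coefficients serve as orientation-sensitive texture features for tracking.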
References:
1.
2.
3.
4.
5.
6.
7.
8. A. A. Bharath and J. Ng, "A steerable complex wavelet construction and its applications to image denoising," IEEE Transactions on Image Processing, vol. 14, no. 7, pp. 948-959, 2005.
9. N. G. Kingsbury, "Shift invariant properties of the dual-tree complex wavelet transform," Proc. IEEE Conf. on Acoustics, Speech and Signal Processing, Phoenix, AZ, paper SPTM 3.6, March 16-19, 1999.
10. N. Kingsbury, "Rotation-invariant local feature matching with complex wavelets," Proc. European Conference on Signal Processing (EUSIPCO), Florence, 4-8 Sept 2006, paper #1568982135.
11. A. Khare and U. S. Tiwary, "Daubechies complex wavelet transform based moving object tracking," IEEE Symposium on Computational Intelligence in Image and Signal Processing (CIISP 2007), pp. 36-40, 2007.
12. D. P. Huttenlocher, J. J. Noh, W. J. Rucklidge, "Tracking non-rigid objects in complex scenes," Proc. Fourth International Conference on Computer Vision, 11-14 May 1993, pp. 93-101.
13. D. Xu, J. Hwang, J. Yu, "An accurate region based object tracking for video sequences," IEEE 3rd Workshop on Multimedia Signal Processing, pp. 271-276, 1999.
14. C. E. Erdem, A. M. Tekalp, B. Sankur, "Video object tracking with feedback of performance measures," IEEE Transactions on Circuits and Systems for Video Technology, vol. 13, no. 4, pp. 310-324, 2003.
15. Y. Chen, Y. Rui, and T. S. Huang, "JPDAF based HMM for real-time contour tracking," Proc. IEEE Int. Conf. Computer Vision and Pattern Recognition, pp. 543-550, 2001.
16. M. Khansari, H. R. Rabiee, M. Asadi, and M. Ghanbari, "Object tracking in crowded video scenes based on the undecimated wavelet features and texture analysis," EURASIP Journal on Advances in Signal Processing, vol. 2008, Article ID 243534, 18 pages, 2008. doi:10.1155/2008/243534