A More Robust Mean Shift Tracker on Joint Color-CLTP Histogram

Full Text (PDF, 1327KB), pp. 34-42


Author(s)

Pu Xiaorong 1,*, Zhou Zhihu 1

1. School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China

* Corresponding author.

DOI: https://doi.org/10.5815/ijigsp.2012.12.05

Received: 28 Jul. 2012 / Revised: 9 Sep. 2012 / Accepted: 15 Oct. 2012 / Published: 8 Nov. 2012

Index Terms

Mean Shift, Object Tracking, Completed Local Ternary Pattern, Joint Color-CLTP Histogram

Abstract

A more robust mean shift tracker based on the joint color and Completed Local Ternary Pattern (CLTP) histogram is proposed. CLTP is a generalization of the Local Binary Pattern (LBP) that yields texture features which are more discriminative and less sensitive to noise. The target representation based on the joint color-CLTP histogram exploits the target's structural information efficiently. To reduce background interference in target localization, a corrected background-weighted histogram and a background update mechanism are adopted to decrease the weights of prominent background color and texture features that resemble the target object. Comparative experiments on various challenging videos demonstrate that the proposed tracker performs favorably against several state-of-the-art variants of the mean shift tracker under heavy occlusion and complex background changes.
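The joint color-texture representation described above can be sketched in code. Since the paper's exact CLTP formulation is not reproduced on this page, the minimal Python sketch below substitutes a simplified local ternary pattern (upper sign pattern only, as in Tan and Triggs' LTP) over the 8-neighborhood, and bins it jointly with quantized RGB color; the threshold, bin counts, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ltp_codes(gray, t=5):
    """Local ternary pattern sign codes over the 8-neighborhood.

    A simplified stand-in for the paper's CLTP operator: each neighbor is
    compared to the center pixel with tolerance t, producing an 8-bit
    "upper" pattern (+1 states) and "lower" pattern (-1 states).
    """
    H, W = gray.shape
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]                       # center pixels (border dropped)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    upper = np.zeros_like(c)
    lower = np.zeros_like(c)
    for k, (dy, dx) in enumerate(offsets):
        n = g[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx]
        upper |= ((n - c) > t).astype(np.int32) << k    # neighbor clearly brighter
        lower |= ((n - c) < -t).astype(np.int32) << k   # neighbor clearly darker
    return upper, lower

def joint_color_ltp_hist(rgb, t=5, color_bins=8, tex_bins=16):
    """Joint color-texture histogram: quantized (R, G, B) x coarse LTP bin.

    This is the kind of target model a mean shift tracker would compare
    against candidate regions via the Bhattacharyya coefficient.
    """
    gray = rgb.mean(axis=2)
    upper, _ = ltp_codes(gray, t)
    inner = rgb[1:-1, 1:-1]                              # align with LTP map
    q = (inner // (256 // color_bins)).astype(np.int32)  # per-channel color bins
    tex = (upper * tex_bins // 256).astype(np.int32)     # coarse texture bin
    idx = ((q[..., 0] * color_bins + q[..., 1]) * color_bins
           + q[..., 2]) * tex_bins + tex
    hist = np.bincount(idx.ravel(),
                       minlength=color_bins ** 3 * tex_bins).astype(float)
    return hist / hist.sum()
```

In a full tracker, this histogram would be kernel-weighted and the corrected background-weighted histogram would down-weight bins that are also prominent in the surrounding background region.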

Cite This Paper

Pu Xiaorong, Zhou Zhihu, "A More Robust Mean Shift Tracker on Joint Color-CLTP Histogram", IJIGSP, vol. 4, no. 12, pp. 34-42, 2012. DOI: 10.5815/ijigsp.2012.12.05

References

[1] A. Yilmaz, O. Javed, and M. Shah, "Object Tracking: a Survey," ACM Computing Surveys, vol. 38, no. 4, Article 13, 2006.

[2] D. Comaniciu, V. Ramesh, and P. Meer, "Real-Time Tracking of Non-Rigid Objects Using Mean Shift," Proc. IEEE Conf. Computer Vision and Pattern Recognition, pp. 142-149, 2000.

[3] D. Comaniciu, V. Ramesh, and P. Meer, "Kernel-Based Object Tracking," IEEE Trans. Pattern Anal. Machine Intell., vol. 25, no. 5, pp. 564-577, 2003.

[4] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Robust Mean Shift Tracking with Corrected Background-Weighted Histogram," IET Computer Vision, 2010.

[5] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Robust Object Tracking using Joint Color-Texture Histogram," International Journal of Pattern Recognition and Artificial Intelligence, vol. 23, no. 7, pp. 1245-1263, 2009.

[6] G. Bradski, "Computer vision face tracking for use in a perceptual user interface," Intel Technology Journal, vol. 2, no. Q2, 1998.

[7] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and Orientation Adaptive Mean Shift Tracking," IET Computer Vision, 2011.

[8] Q. A. Nguyen, A. Robles-Kelly, and C. Shen, "Enhanced kernel-based tracking for monochromatic and thermographic video," Proc. IEEE Conf. Video and Signal Based Surveillance, pp. 28-33, 2006.

[9] C. Yang, R. Duraiswami, and L. Davis, "Efficient mean-shift tracking via a new similarity measure," Proc. IEEE Conf. Computer Vision and Pattern Recognition, pp. 176-183, 2005.

[10] I. Haritaoglu and M. Flickner, "Detection and tracking of shopping groups in stores," Proc. IEEE Conf. Computer Vision and Pattern Recognition, Kauai, Hawaii, pp. 431-438, 2001.

[11] T. Ojala, M. Pietikäinen, and T. Mäenpää, "Multiresolution gray-scale and rotation invariant texture classification with Local Binary Patterns," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 24, no. 7, pp. 971-987, 2002.

[12] T. Ahonen, A. Hadid, and M. Pietikäinen, "Face Description with Local Binary Patterns: Application to Face Recognition," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 28, no. 12, pp. 2037-2041, 2006.

[13] T. Ojala, M. Pietikäinen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51-59, 1996.

[14] T. Ojala, T. Mäenpää, M. Pietikäinen, J. Viertola, J. Kyllönen, and S. Huovinen, "Outex - new framework for empirical evaluation of texture analysis algorithms," Proc. Int'l Conf. on Pattern Recognition, pp. 701-706, 2002.

[15] T. Ojala and M. Pietikäinen, "Unsupervised texture segmentation using feature distributions," Pattern Recognition, vol. 32, pp. 477-486, 1999.

[16] G. Zhao and M. Pietikäinen, "Dynamic texture recognition using Local Binary Patterns with an application to facial expressions," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 29, no. 6, pp. 915-928, 2007.

[17] X. Tan and B. Triggs, "Enhanced Local Texture Feature Sets for Face Recognition Under Difficult Lighting Conditions," IEEE Trans. on Image Processing, vol. 19, no. 6, pp. 1635-1650, 2010.

[18] Z. Guo, L. Zhang, and D. Zhang, "A Completed Modeling of Local Binary Pattern Operator for Texture Classification," IEEE Trans. on Image Processing, vol. 19, no. 6, pp. 1657-1663, June 2010.

[19] S. Liao, M. K. Law, and A. S. Chung, "Dominant Local Binary Patterns for Texture Classification," IEEE Trans. on Image Processing, vol. 18, no. 5, pp. 1107-1118, May 2009.

[20] L. Zhang, L. Zhang, Z. Guo, and D. Zhang, "Monogenic-LBP: A new approach for rotation invariant texture classification," Int'l Conf. on Image Processing, pp. 2677-2680, 2010.

[21] R. M. N. Sadat and S. W. Teng, "Texture Classification Using Multimodal Invariant Local Binary Pattern," IEEE Workshop on Applications of Computer Vision, pp. 315-320, 2011.

[22] S. Zhao, Y. Gao, and B. Zhang, "Sobel-LBP," 15th IEEE International Conference on Image Processing, pp. 2144-2147, 2008.

[23] PETS2001 dataset: http://www.cvg.rdg.ac.uk/pets2001/

[24] S. Wang, H. Lu, F. Yang, and M.-H. Yang, "Superpixel Tracking," 13th International Conference on Computer Vision, pp. 1323-1330, 2011.

[25] A. Adam, E. Rivlin, and I. Shimshoni, "Robust fragments-based tracking using the integral histogram," IEEE Conf. on Computer Vision and Pattern Recognition, pp. 798-805, 2006.

[26] W. Zhong, H. Lu, and M.-H. Yang, "Robust Object Tracking via Sparsity-based Collaborative Model," IEEE Conf. on Computer Vision and Pattern Recognition, 2012.