Robust Real-time Tracking Guided by Reliable Local Features

Marcos D. Zuniga, Cristian M. Orellana


This work presents a new light-weight approach for robust real-time tracking in difficult environments, including situations with occlusion and varying illumination. The method increases tracking robustness by exploiting reliability measures from the segmentation phase to improve the selection and tracking of reliable local features for overall object tracking. The local descriptors are characterised by colour, structural, and segmentation features to provide robust detection, while their reliability is characterised by descriptor distance, spatio-temporal coherence, contrast, and illumination criteria. These reliability measures weight the contribution of the local features in the decision process for estimating the real position of the object. The proposed method can be adapted to any visual system that performs an initial segmentation phase based on background subtraction, followed by multi-target tracking using dynamic models. First, we present how to extract pixel-level reliability measures from algorithms based on background modelling. Then, we show how to use these measures to derive feature-level reliability measures for mobile objects. Finally, we describe how this information is used to track an object under different environmental conditions. Preliminary results show that the approach improves object localisation in the presence of low illumination.
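The core idea of the abstract, weighting each local feature's vote by a reliability score when estimating the object's position, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the particular criteria combined (descriptor distance, coherence, contrast), their weights, and the function names are all assumptions for the sketch.

```python
import numpy as np

def feature_reliability(descriptor_dist, coherence, contrast,
                        w=(0.4, 0.4, 0.2)):
    """Combine per-feature criteria (each in [0, 1], higher is better)
    into one reliability score via a weighted sum. The criteria and
    weights here are illustrative, not the paper's exact ones."""
    return w[0] * (1.0 - descriptor_dist) + w[1] * coherence + w[2] * contrast

def estimate_position(feature_positions, reliabilities):
    """Estimate the object centre as the reliability-weighted mean of
    the centre positions voted by the matched local features."""
    pos = np.asarray(feature_positions, dtype=float)   # shape (N, 2)
    rel = np.asarray(reliabilities, dtype=float)       # shape (N,)
    if rel.sum() == 0.0:
        return pos.mean(axis=0)                        # unweighted fallback
    return (pos * rel[:, None]).sum(axis=0) / rel.sum()

# Three features vote for the object centre; the third is a poor match
# (large descriptor distance, low coherence), so its outlying vote
# barely shifts the estimate away from the two reliable features.
rels = [feature_reliability(0.1, 0.9, 0.8),
        feature_reliability(0.2, 0.8, 0.7),
        feature_reliability(0.9, 0.1, 0.2)]
centre = estimate_position([[100, 50], [102, 52], [140, 90]], rels)
```

With these numbers the unreliable third feature receives a score of 0.12 against 0.88 and 0.78 for the first two, so the weighted estimate stays close to (101, 51) rather than being dragged toward (140, 90) as an unweighted mean would be.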



Paper Citation

in Harvard Style

Zuniga M. and Orellana C. (2016). Robust Real-time Tracking Guided by Reliable Local Features. In Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016), ISBN 978-989-758-175-5, pages 59-69. DOI: 10.5220/0005727600590069

in Bibtex Style

@conference{visapp16,
author={Marcos D. Zuniga and Cristian M. Orellana},
title={Robust Real-time Tracking Guided by Reliable Local Features},
booktitle={Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016)},
year={2016},
pages={59-69},
doi={10.5220/0005727600590069},
isbn={978-989-758-175-5},
}

in EndNote Style

JO - Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016)
TI - Robust Real-time Tracking Guided by Reliable Local Features
SN - 978-989-758-175-5
AU - Zuniga M.
AU - Orellana C.
PY - 2016
SP - 59
EP - 69
DO - 10.5220/0005727600590069