User-driven Nearest Neighbour Exploration of Image Archives

Luca Piras, Deiv Furcas, Giorgio Giacinto

2015

Abstract

Learning what a specific user is looking for during an image search and retrieval session is a problem that has mainly been approached with “classification” or “exploration” techniques. Classification techniques assume that the images in the archive are statically subdivided into classes, whereas exploration approaches focus on following the varying needs of the user. As a consequence, retrieval techniques based on classification, though often performing well, adapt poorly to different users’ goals. In this paper we propose a relevance feedback mechanism that drives the search into promising regions of the feature space according to the Nearest Neighbour paradigm. In particular, each image labelled as relevant by the user is used as a “seed” for a Nearest Neighbour exploration of the feature space. Reported results show that this technique attains higher recall and average precision than other state-of-the-art relevance feedback approaches.
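The abstract only outlines the mechanism, so the following minimal Python sketch illustrates the general Nearest Neighbour exploration idea: each image marked as relevant acts as a "seed" whose nearest neighbours in feature space are pooled and ranked as candidates for the next feedback round. The function name, the Euclidean metric, and the parameter k are illustrative assumptions, not the authors' actual formulation.

```python
import numpy as np

def nn_exploration(features, relevant_idx, k=5):
    """Sketch of seed-based Nearest Neighbour exploration (assumed formulation).

    features:     (N, d) array of image feature vectors for the archive
    relevant_idx: indices of images the user marked as relevant ("seeds")
    k:            number of neighbours explored around each seed
    """
    candidates = set()
    for seed in relevant_idx:
        # Euclidean distance from this seed to every image in the archive
        dists = np.linalg.norm(features - features[seed], axis=1)
        dists[seed] = np.inf                    # exclude the seed itself
        candidates.update(np.argsort(dists)[:k].tolist())

    # Rank pooled candidates by their distance to the closest relevant seed
    ranked = sorted(
        candidates,
        key=lambda i: min(
            np.linalg.norm(features[i] - features[s]) for s in relevant_idx
        ),
    )
    return ranked
```

In a relevance feedback loop, the ranked candidates would be shown to the user, newly labelled relevant images would be added to the seed set, and the exploration repeated.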



Paper Citation


in Harvard Style

Piras L., Furcas D. and Giacinto G. (2015). User-driven Nearest Neighbour Exploration of Image Archives. In Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-758-076-5, pages 181-189. DOI: 10.5220/0005183401810189


in Bibtex Style

@conference{icpram15,
author={Luca Piras and Deiv Furcas and Giorgio Giacinto},
title={User-driven Nearest Neighbour Exploration of Image Archives},
booktitle={Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2015},
pages={181-189},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005183401810189},
isbn={978-989-758-076-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - User-driven Nearest Neighbour Exploration of Image Archives
SN - 978-989-758-076-5
AU - Piras L.
AU - Furcas D.
AU - Giacinto G.
PY - 2015
SP - 181
EP - 189
DO - 10.5220/0005183401810189