Automatic View Finding for Drone Photography based on Image Aesthetic Evaluation

Xiaoliang Xiong, Jie Feng, Bingfeng Zhou

2017

Abstract

Consumer-level remotely controlled smart drones are usually equipped with high-resolution cameras, which makes it possible for them to serve as unmanned "flying cameras". For this purpose, in this paper we propose an automatic view finding scheme that can autonomously navigate a drone to a proper position in space where a photo with an optimal composition can be taken. In this scheme, an automatic aesthetic evaluation of image composition is introduced to navigate the flying drone. It is accomplished by applying commonly used composition guidelines to the image transmitted from the drone at its current view. The evaluation result is then fed back to control the flight and determine the drone's next movement. In flight control, we adopt a downhill simplex strategy to search for the optimal position and viewing direction of the drone in its flying space. When the search converges, the drone stops and takes the optimal image at its current position.


Paper Citation


in Harvard Style

Xiong X., Feng J. and Zhou B. (2017). Automatic View Finding for Drone Photography based on Image Aesthetic Evaluation. In Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 1: GRAPP (VISIGRAPP 2017), ISBN 978-989-758-224-0, pages 282-289. DOI: 10.5220/0006255402820289


in Bibtex Style

@conference{grapp17,
author={Xiaoliang Xiong and Jie Feng and Bingfeng Zhou},
title={Automatic View Finding for Drone Photography based on Image Aesthetic Evaluation},
booktitle={Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2017)},
year={2017},
pages={282-289},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006255402820289},
isbn={978-989-758-224-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2017)
TI - Automatic View Finding for Drone Photography based on Image Aesthetic Evaluation
SN - 978-989-758-224-0
AU - Xiong X.
AU - Feng J.
AU - Zhou B.
PY - 2017
SP - 282
EP - 289
DO - 10.5220/0006255402820289