Dance Motion Segmentation Method based on Choreographic Primitives

Narumi Okada, Naoya Iwamoto, Tsukasa Fukusato, Shigeo Morishima

2015

Abstract

Data-driven animation using a large human motion database enables the programming of various natural human motions. While motion capture systems make it possible to acquire realistic human motion, the captured motion must be segmented into a series of primitive motions before a motion database can be constructed. Most existing segmentation methods focus on periodic motion, e.g., walking and jogging; segmenting non-periodic and asymmetrical motions such as dance performance remains a challenging problem. In this paper, we present a segmentation approach specialized for human dance motion. Our approach consists of three steps and is based on the assumption that human dance motion is composed of consecutive choreographic primitives. First, we conduct an investigation based on dancer perception to determine the segmentation components, in which professional dancers select segmentation sequences. Second, we use the selected sequences to define rules for segmenting choreographic primitives. Finally, we verify the accuracy of our approach through a user study, showing that it outperforms existing segmentation methods. Using the choreographic primitives obtained through these three steps, we also demonstrate automatic dance motion synthesis.
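To make the segmentation task concrete, the sketch below shows a generic heuristic for splitting a motion-capture sequence into candidate primitives: cutting at sustained low-speed frames (pauses) in the joint trajectories. This is an illustrative assumption, not the paper's perception-based rule set; the function name, threshold, and array layout are all hypothetical.

```python
# Generic motion-segmentation sketch (NOT the paper's method): split a
# captured sequence at runs of frames where total joint speed is near zero,
# a common heuristic for locating pauses between primitive motions.
import numpy as np

def segment_at_pauses(positions, rel_threshold=0.05, min_len=5):
    """positions: (frames, joints, 3) array of joint positions.
    Returns a list of (start, end) frame ranges for candidate primitives."""
    velocity = np.diff(positions, axis=0)                  # per-frame displacement
    speed = np.linalg.norm(velocity, axis=2).sum(axis=1)   # total joint speed
    low = speed < rel_threshold * speed.max()              # near-stationary frames
    # Collapse each run of consecutive low-speed frames into one boundary.
    boundaries = [0]
    t = 0
    while t < len(low):
        if low[t]:
            start = t
            while t < len(low) and low[t]:
                t += 1
            mid = (start + t) // 2
            if mid - boundaries[-1] >= min_len:            # skip tiny segments
                boundaries.append(mid)
        else:
            t += 1
    if len(positions) - 1 - boundaries[-1] >= min_len:
        boundaries.append(len(positions) - 1)
    else:
        boundaries[-1] = len(positions) - 1
    return list(zip(boundaries[:-1], boundaries[1:]))
```

In the paper's setting, such velocity-based cuts would be only a starting point; the segmentation rules are instead derived from sequences selected by professional dancers.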



Paper Citation


in Harvard Style

Okada N., Iwamoto N., Fukusato T. and Morishima S. (2015). Dance Motion Segmentation Method based on Choreographic Primitives. In Proceedings of the 10th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2015), ISBN 978-989-758-087-1, pages 332-339. DOI: 10.5220/0005304303320339


in Bibtex Style

@conference{grapp15,
author={Narumi Okada and Naoya Iwamoto and Tsukasa Fukusato and Shigeo Morishima},
title={Dance Motion Segmentation Method based on Choreographic Primitives},
booktitle={Proceedings of the 10th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2015)},
year={2015},
pages={332-339},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005304303320339},
isbn={978-989-758-087-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 10th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2015)
TI - Dance Motion Segmentation Method based on Choreographic Primitives
SN - 978-989-758-087-1
AU - Okada N.
AU - Iwamoto N.
AU - Fukusato T.
AU - Morishima S.
PY - 2015
SP - 332
EP - 339
DO - 10.5220/0005304303320339