Classification and Regression of Music Lyrics: Emotionally-Significant Features

Ricardo Malheiro, Renato Panda, Paulo Gomes, Rui Pedro Paiva

2016

Abstract

This research addresses the role of lyrics in the music emotion recognition process. Our approach is based on several state-of-the-art features, complemented by novel stylistic, structural and semantic features. To evaluate the approach, we created a ground-truth dataset of 180 song lyrics annotated according to Russell's emotion model. We conducted four types of experiments: regression, and classification by quadrant, by arousal and by valence categories. Compared to the state-of-the-art features alone (n-grams, the baseline), adding the other features, including the novel ones, improved the F-measure from 68.2%, 79.6% and 84.2% to 77.1%, 86.3% and 89.2%, respectively, in the three classification experiments. To study the relation between features and emotions (quadrants), we performed experiments to identify the best features for describing and discriminating between arousal hemispheres and valence meridians. To further validate these experiments, we built a validation set of 771 lyrics extracted from the AllMusic platform, achieving 73.6% F-measure in classification by quadrants. Regarding regression, the results show that, compared to similar studies on audio, we achieve similar performance for arousal and much better performance for valence.
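
For readers outside the area: the quadrant, arousal and valence categories above all derive from Russell's two-dimensional valence-arousal plane, and results are reported as (macro-averaged) F-measure. The minimal Python sketch below illustrates both ideas; the zero-centred thresholds, function names and toy values are assumptions made for illustration, not the authors' implementation.

# Minimal sketch (not the paper's code): deriving Russell-quadrant labels
# from valence/arousal annotations and computing a macro-averaged F-measure.

def russell_quadrant(valence, arousal):
    """Map a zero-centred (valence, arousal) pair to a Russell quadrant:
    Q1 = +valence/+arousal, Q2 = -valence/+arousal,
    Q3 = -valence/-arousal, Q4 = +valence/-arousal."""
    if arousal >= 0:
        return 1 if valence >= 0 else 2
    return 3 if valence < 0 else 4

def macro_f_measure(y_true, y_pred, labels=(1, 2, 3, 4)):
    """Unweighted mean of per-class F1 scores over the given labels."""
    scores = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * precision * recall / (precision + recall)
                      if precision + recall else 0.0)
    return sum(scores) / len(scores)

if __name__ == "__main__":
    # Toy (valence, arousal) annotations for four lyrics, plus mock predictions.
    annotations = [(0.7, 0.6), (-0.4, 0.8), (-0.6, -0.5), (0.5, -0.3)]
    truth = [russell_quadrant(v, a) for v, a in annotations]  # -> [1, 2, 3, 4]
    preds = [1, 2, 3, 1]
    print(round(macro_f_measure(truth, preds), 3))            # -> 0.667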

References

  1. Besson, M., Faita, F., Peretz, I., Bonnel, A., Requin, J. 1998. Singing in the brain: Independence of lyrics and tunes. Psychological Science, 9.
  2. Boser, B., Guyon, I., Vapnik, V. 1992. A training algorithm for optimal margin classifiers. Proc. of the Fifth Ann. Workshop on Computational Learning Theory, pages 144-152.
  3. Bradley, M., Lang, P. 1999. Affective Norms for English Words (ANEW): Stimuli, Instruction Manual and Affective Ratings. Technical report C-1, The Center for Research in Psychophysiology, University of Florida.
  4. Downie, J. 2008. The music information retrieval evaluation exchange (2005-2007): A window into music information retrieval research. Acoustical Science and Technology, vol. 29, no. 4, pp. 247-255.
  5. Duda, R., Hart, P., Stork, D. 2000. Pattern Classification. New York, John Wiley & Sons, Inc.
  6. Fontaine, J., Scherer, K., Soriano, C. 2013. Components of Emotional Meaning: A Sourcebook. Oxford University Press.
  7. Hu, X., Downie, J., Laurier, C., Bay, M., Ehmann, A. 2008. The 2007 MIREX audio mood classification task: Lessons learned. In Proc. of the Intl. Conf. on Music Information Retrieval, Philadelphia, PA.
  8. Hu, Y., Chen, X., Yang, D. 2009. Lyric-Based Song Emotion Detection with Affective Lexicon and Fuzzy Clustering Method. Tenth Int. Society for Music Information Retrieval Conference.
  9. Hu, X., Downie, J., Ehmann, A. 2009. Lyric text mining in music mood classification. Proc. of the Tenth Int. Society for Music Information Retrieval Conference (ISMIR), Kobe, Japan, pages 411-416.
  10. Hu, X., Downie, J. 2010. Improving mood classification in music digital libraries by combining lyrics and audio. Proc. of the Tenth Ann. Joint Conf. on Digital Libraries, pp. 159-168.
  11. Juslin, P., Laukka, P. 2004. Expression, Perception, and Induction of Musical Emotions: A Review and a Questionnaire Study of Everyday Listening. Journal of New Music Research, 33 (3), 217-238.
  12. Keerthi, S., Lin, C. 2003. Asymptotic behaviors of support vector machines with Gaussian kernel. Neural Computation, 15(7):1667-1689.
  13. Krippendorff, K. 2004. Content Analysis: An Introduction to its Methodology. 2nd edition, chapter 11. Sage, Thousand Oaks, CA.
  14. Laurier, C., Grivolla, J., Herrera, P. 2008. Multimodal music mood classification using audio and lyrics. Proc. of the Int. Conf. on Machine Learning and Applications.
  15. Lu, C., Hong, J-S., Cruz-Lara, S. 2006. Emotion Detection in Textual Information by Semantic Role Labeling and Web Mining Techniques. Third Taiwanese-French Conf. on Information Technology.
  16. Mayer, R., Neumayer, R., Rauber, A. 2008. Rhyme and Style Features for Musical Genre Categorization by Song Lyrics. Proc. of the Int. Conf. on Music Information Retrieval (ISMIR), pp. 337-342.
  17. Montgomery, D., Runger, G., Hubele, N. 1998. Engineering Statistics. Wiley.
  18. Robnik-Šikonja, M., Kononenko, I. 2003. Theoretical and Empirical Analysis of ReliefF and RReliefF. Machine Learning, vol. 53, no. 1-2, pp. 23-69.
  19. Russell, J. 1980. A circumplex model of affect. Journal of Personality and Social Psychology, vol. 39, no. 6, pp. 1161-1178.
  20. Russell, J. 2003. Core affect and the psychological construction of emotion. Psychol. Review, 110, 1, 145-172.
  21. Sebastiani, F. 2002. Machine learning in automated text categorization. ACM Computing Surveys, 34(1):1-47.
  22. Strapparava, C., Valitutti, A. 2004. WordNet-Affect: an affective extension of WordNet. In Proceedings of the 4th International Conference on Language Resources and Evaluation, pp. 1083-1086, Lisbon.
  23. Taylor, A., Marcus, M., Santorini, B. 2003. The Penn Treebank: an overview. In Treebanks, Text, Speech and Language Technology, vol. 20, ch. 1, pp. 5-22.
  24. Vignoli, F. 2004. Digital Music Interaction concepts: a user study. Proc. of the 5th Int. Conference on Music Information Retrieval.
  25. Whissell, C. 1989. Dictionary of Affect in Language. In Plutchik, R., Kellerman, H. (eds.), Emotion: Theory, Research and Experience, vol. 4, pp. 113-131, Academic Press, NY.
  26. Yang, Y., Lin, Y., Su, Y., Chen, H. 2008. A regression approach to music emotion recognition. IEEE Transactions on Audio, Speech, and Language Processing, vol. 16, no. 2, pp. 448-457.
  27. Yang, D., Lee, W-S. 2009. Music Emotion Identification from Lyrics. Eleventh IEEE Int. Symposium on Multimedia.
  28. Zaanen, M., Kanters, P. 2010. Automatic Mood Classification using tf*idf based on Lyrics. In J. Stephen Downie and Remco C. Veltkamp, editors, 11th International Society for Music Information Retrieval Conference.


Paper Citation


in Harvard Style

Malheiro R., Panda R., Gomes P. and Paiva R. (2016). Classification and Regression of Music Lyrics: Emotionally-Significant Features. In Proceedings of the 8th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management - Volume 1: KDIR, (IC3K 2016). ISBN 978-989-758-203-5, pages 45-55. DOI: 10.5220/0006037400450055


in Bibtex Style

@conference{kdir16,
author={Ricardo Malheiro and Renato Panda and Paulo Gomes and Rui Pedro Paiva},
title={Classification and Regression of Music Lyrics: Emotionally-Significant Features},
booktitle={Proceedings of the 8th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management - Volume 1: KDIR, (IC3K 2016)},
year={2016},
pages={45-55},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006037400450055},
isbn={978-989-758-203-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 8th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management - Volume 1: KDIR, (IC3K 2016)
TI - Classification and Regression of Music Lyrics: Emotionally-Significant Features
SN - 978-989-758-203-5
AU - Malheiro R.
AU - Panda R.
AU - Gomes P.
AU - Paiva R.
PY - 2016
SP - 45
EP - 55
DO - 10.5220/0006037400450055
ER -