Cochlea-based Features for Music Emotion Classification

Luka Kraljević, Mladen Russo, Mia Mlikota, Matko Šarić

2017

Abstract

Listening to music often evokes strong emotions. With the rapid growth of easily accessible digital music libraries, there is an increasing need for reliable music emotion recognition systems. Common musical features such as tempo, mode, pitch, and clarity, which can be easily calculated from the audio signal, are associated with particular emotions and are often used in emotion detection systems. Based on the idea that humans do not perceive emotions from the raw audio signal but from a signal that has first been processed by the cochlea, in this work we propose new cochlea-based features for music emotion recognition. The features are calculated from the output of a gammatone filterbank model, and emotion classification is then performed using Support Vector Machine (SVM) and TreeBagger classifiers. The proposed features are evaluated on the publicly available 1000 songs database and compared to other commonly used features. The results show that our approach is effective and outperforms other commonly used features. With the combined feature set we achieved accuracies of 83.88% and 75.12% for arousal and valence, respectively.
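As a rough illustration of the pipeline described in the abstract, the sketch below extracts simple statistics from a gammatone filterbank output (an approximation of cochlear filtering) and feeds them to an SVM and a bagged-tree classifier. This is a minimal sketch, not the authors' implementation: the ERB channel spacing, the crude all-pole gammatone approximation, the per-channel log-energy mean/std features, and the use of scikit-learn's SVC and RandomForestClassifier (as a stand-in for MATLAB's TreeBagger) are all assumptions made here for illustration.

import numpy as np
from scipy.signal import lfilter
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

def erb_space(low_hz=50.0, high_hz=8000.0, n=32):
    """Center frequencies equally spaced on the ERB-rate scale (Glasberg & Moore)."""
    ear_q, min_bw = 9.26449, 24.7
    i = np.arange(1, n + 1)
    return -(ear_q * min_bw) + np.exp(
        i * (-np.log(high_hz + ear_q * min_bw) + np.log(low_hz + ear_q * min_bw)) / n
    ) * (high_hz + ear_q * min_bw)

def gammatone_envelope(x, fs, cf):
    """Crude 4th-order gammatone approximation: shift the center frequency to DC,
    apply four cascaded one-pole low-pass stages, and return the envelope."""
    erb = 24.7 + cf / 9.26449
    b = 1.019 * 2 * np.pi * erb            # bandwidth parameter
    a = np.exp(-b / fs)                     # pole radius
    y = x * np.exp(-1j * 2 * np.pi * cf * np.arange(len(x)) / fs)
    for _ in range(4):                      # 4 cascaded one-pole stages
        y = lfilter([1 - a], [1, -a], y)
    return np.abs(y)

def cochlear_features(x, fs, n_channels=32):
    """Per-channel log-energy mean and std as a simple feature vector."""
    feats = []
    for cf in erb_space(n=n_channels):
        env = gammatone_envelope(x, fs, cf)
        log_e = np.log(env ** 2 + 1e-12)
        feats.extend([log_e.mean(), log_e.std()])
    return np.array(feats)

# Hypothetical usage, assuming clips are already loaded and labeled for arousal:
# X = np.vstack([cochlear_features(clip, fs) for clip in clips])
# svm = SVC(kernel="rbf").fit(X, arousal_labels)
# bag = RandomForestClassifier(n_estimators=100).fit(X, arousal_labels)  # TreeBagger analogue

In the paper itself the features and classifier settings differ; this sketch only shows how gammatone-filterbank outputs can be turned into fixed-length feature vectors suitable for SVM or bagged-tree classification.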


Paper Citation


in Harvard Style

Kraljević L., Russo M., Mlikota M. and Šarić M. (2017). Cochlea-based Features for Music Emotion Classification. In Proceedings of the 14th International Joint Conference on e-Business and Telecommunications - Volume 1: SIGMAP, (ICETE 2017) ISBN 978-989-758-260-8, pages 64-68. DOI: 10.5220/0006466900640068


in Bibtex Style

@conference{sigmap17,
author={Luka Kraljević and Mladen Russo and Mia Mlikota and Matko Šarić},
title={Cochlea-based Features for Music Emotion Classification},
booktitle={Proceedings of the 14th International Joint Conference on e-Business and Telecommunications - Volume 1: SIGMAP, (ICETE 2017)},
year={2017},
pages={64-68},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006466900640068},
isbn={978-989-758-260-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 14th International Joint Conference on e-Business and Telecommunications - Volume 1: SIGMAP, (ICETE 2017)
TI - Cochlea-based Features for Music Emotion Classification
SN - 978-989-758-260-8
AU - Kraljević L.
AU - Russo M.
AU - Mlikota M.
AU - Šarić M.
PY - 2017
SP - 64
EP - 68
DO - 10.5220/0006466900640068