Keyword-Based Approach for Lyrics Emotion Variation Detection



This research addresses the role of lyrics in music emotion variation detection. To accomplish this task, we created a system that detects the predominant emotion expressed by each sentence (verse) of the lyrics. The system employs Russell's emotion model and contains four sets of emotions, one associated with each quadrant. To detect the predominant emotion in each verse, we propose a novel keyword-based approach, which receives a sentence (verse) and classifies it into the appropriate quadrant. To tune the system parameters, we created a 129-sentence training dataset from 68 songs. To validate the system, we created a separate ground truth containing 239 sentences (verses) from 44 songs, annotated manually with an average of 7 annotations per sentence. The system attains an F-measure of 67.4%.
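As a rough illustration of the kind of keyword-based quadrant classification described above, the following minimal Python sketch matches verse tokens against one keyword set per Russell quadrant and returns the quadrant with the most hits. The keyword lists, function name, and tie-breaking rule are illustrative assumptions, not the lexicon or scoring used in the paper.

```python
import re
from collections import Counter

# One keyword set per Russell quadrant (hypothetical examples only).
QUADRANT_KEYWORDS = {
    "Q1 (high valence, high arousal)": {"joy", "happy", "excited", "dance", "love"},
    "Q2 (low valence, high arousal)":  {"anger", "hate", "rage", "fear", "fight"},
    "Q3 (low valence, low arousal)":   {"sad", "lonely", "cry", "gloom", "lost"},
    "Q4 (high valence, low arousal)":  {"calm", "peace", "gentle", "serene", "tender"},
}

def classify_verse(verse: str) -> str:
    """Return the quadrant whose keyword set matches the verse most often."""
    tokens = re.findall(r"[a-z']+", verse.lower())
    counts = Counter()
    for quadrant, keywords in QUADRANT_KEYWORDS.items():
        counts[quadrant] = sum(1 for t in tokens if t in keywords)
    quadrant, score = counts.most_common(1)[0]
    # If no keyword is found, report that explicitly instead of guessing.
    return quadrant if score > 0 else "no keyword match"

if __name__ == "__main__":
    print(classify_verse("I cry alone, so sad and lonely tonight"))
```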


Keywords: Music Emotion Recognition, Music Information Retrieval, Natural Language Processing

Related Project

MOODetector: A System for Mood-based Classification and Retrieval of Audio Music


8th International Conference on Knowledge Discovery and Information Retrieval – KDIR’2016, October 2016

