Towards E-Motion Based Music Retrieval: A study of Affective Gesture Recognition
The widespread availability of digitised music collections and mobile music players allows us to listen to music during many of our daily activities, such as exercise, commuting, and relaxation, and many people enjoy that opportunity. A practical problem that comes with the wish to listen to music is that of music retrieval: the selection of desired music from a music collection. In this
paper we propose a new approach to facilitate music retrieval. Modern smartphones are commonly used as music players and are
already equipped with inertial sensors that are suitable for capturing motion information. In the proposed approach, emotion is derived
automatically from arm gestures and is used to query a music collection. We set up predictive models for valence and arousal from
empirical data, gathered in an experimental setup in which inertial data recorded from arm movements was coupled to musical emotion.
Part of the experiment is a preliminary study confirming that human subjects are generally capable of recognising affect from arm
gestures. Model validation in the main study confirmed the predictive capabilities of the models.
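The approach outlined above could be sketched as follows. This is an illustrative example only: the feature set (summary statistics of accelerometer magnitude) and the model class (least-squares linear regression onto a single affect dimension) are assumptions for the sketch, not the paper's actual method, and all data values are synthetic.

```python
import numpy as np

def gesture_features(accel):
    """Summarise a (n_samples, 3) accelerometer recording of one arm gesture.

    Features (hypothetical choices): mean and standard deviation of the
    acceleration magnitude, plus mean absolute jerk as a smoothness proxy.
    """
    mag = np.linalg.norm(accel, axis=1)  # movement intensity per sample
    return np.array([mag.mean(), mag.std(), np.abs(np.diff(mag)).mean()])

def fit_affect_model(feature_rows, targets):
    """Least-squares linear map from gesture features to one affect dimension
    (e.g. valence); a separate model would be fitted for arousal."""
    X = np.column_stack([feature_rows, np.ones(len(feature_rows))])  # bias term
    coeffs, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return coeffs

def predict_affect(coeffs, features):
    """Predict an affect value for a new gesture's feature vector."""
    return float(np.append(features, 1.0) @ coeffs)

# Toy training data: four synthetic gestures of increasing vigour, each
# annotated with a made-up valence rating in [-1, 1].
rng = np.random.default_rng(0)
recordings = [rng.normal(scale=s, size=(50, 3)) for s in (0.2, 0.5, 1.0, 1.5)]
valence = np.array([-0.5, 0.0, 0.4, 0.8])

X = np.array([gesture_features(r) for r in recordings])
model = fit_affect_model(X, valence)
pred = predict_affect(model, gesture_features(recordings[-1]))
```

In a full system, the predicted valence/arousal pair would then serve as a query point against a music collection whose tracks are annotated in the same affect space.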