
Research
Brain-Computer Music Interfacing for Embodied Music Interaction applications
Prepared in connection with a doctoral dissertation in Systematic Musicology at the Jāzeps Vītols Latvian Academy of Music.
Overview
This research aims to develop tools that harness the human electroencephalogram (EEG) signal for real-time music interaction. Over the last few years I have worked with neuroscientists and visual artists to develop a Brain-Computer Music Interface (BCMI) system that enables a single musician to control musical and visual media with their EEG signals in a live performance context.
BCMI systems consist of EEG hardware and computers running software that receives, filters and decodes the user's brainwaves, then maps them onto musical or visual parameters in real time. The videos listed under Media content below document the process of investigating, testing and demonstrating various solutions. The BCMI system developed within this research project was based on decoding the expressive intentions of a performer in two contrasting states, high arousal and low arousal, by characterising spectral power during emotionally expressive music performance relative to emotionally neutral music performance.
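To make the signal-processing step more concrete, the minimal Python sketch below shows one way such a mapping could work: average spectral power in a chosen band is computed for each incoming EEG window, compared against a baseline taken from an emotionally neutral performance, and squashed into a 0-1 control value that could drive a musical or visual parameter. The sampling rate, frequency band, scaling and calibration procedure here are illustrative assumptions, not the parameters actually used in this project, and the random arrays stand in for data streamed from real EEG hardware.

import numpy as np
from scipy.signal import welch

FS = 250            # sampling rate in Hz (assumed)
BAND = (8.0, 12.0)  # alpha band; the bands used in the study may differ

def band_power(window, fs=FS, band=BAND):
    """Average spectral power of one EEG window (1-D array) within `band`."""
    freqs, psd = welch(window, fs=fs, nperseg=min(len(window), fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def arousal_control(window, neutral_power):
    """Band power relative to a neutral-performance baseline, mapped to 0..1."""
    ratio = band_power(window) / max(neutral_power, 1e-12)
    return float(np.clip(ratio - 0.5, 0.0, 1.0))  # ad-hoc scaling for illustration

# Example: baseline from a calibration recording, then one "live" window.
rng = np.random.default_rng(0)
neutral = band_power(rng.standard_normal(2 * FS))   # stand-in for neutral calibration data
live_window = rng.standard_normal(2 * FS)           # stand-in for streamed EEG
print(arousal_control(live_window, neutral))        # value for a synth or visual parameter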
The next step will be to extend these tools to multiple users, so that inter-brain dynamics during co-creative tasks can be used to manipulate immersive multimedia; in other words, shared brain activity can play a role in the creation or experience of art. Taking this further, developing these tools could contribute to the wider field of BCI and Human-Machine Interaction, working towards direct mind-to-mind communication networks.
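As one illustration of what inter-brain dynamics could mean in practice, the hedged sketch below computes a phase-locking value (PLV) between two band-limited signals, a common measure of neural synchrony. This is an assumption about a possible future implementation for multi-user work, not a method already used in the project; real use would band-pass filter each performer's EEG in a band of interest before computing the PLV.

import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase-locking value between two equally long, band-limited signals (0..1)."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * phase_diff))))

# Example with two stand-in 10 Hz signals plus noise (not real EEG).
t = np.arange(0, 2, 1 / 250)
a = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.default_rng(1).standard_normal(t.size)
b = np.sin(2 * np.pi * 10 * t + 0.5) + 0.3 * np.random.default_rng(2).standard_normal(t.size)
print(plv(a, b))   # near 1.0 = strongly phase-locked, near 0 = unrelated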
Media content
Videos prepared in relation to this work illustrate particular steps in the BCMI design process and can be accessed in order via the two playlists below.
The BCMI system maps EEG patterns related to expressive intent during music performance on the arousal dimension to multiple audio/visual outputs.
Created in collaboration with research teams led by Valdis Bernhofs from Jāzeps Vītols Latvian Academy of Music, Inga Griškova-Bulanova from Vilnius University, Yuan-Pin Lin from National Sun Yat-sen University, and data visualisation expert Mārtiņš Dāboliņš.
Playlist A: Videos 1 - 11
BCMI design explorations


1. Controlling electric guitar effects with the EEG using facial muscles

2. Visualisation and musification of local synchronisation events v.1

3. Visualisation and musification of local synchronisation events v.2
