
"Organ Music" for RSU Anatomy Museum: first sounding.



Thanks to Vadims Pitlans, Arvydas Kazlauskas, and Hailong Zhang for lending me recordings of their brain activity, and especially to my dear friend Martins Dabolins of Connection Codes, for developing the technology that enabled me to create these pieces for the RSU Anatomy Museum's Museum Night project "Organ Music", which first sounds this evening, May 13.


A description of the work follows:


Music starts and ends in the human brain: it begins in the imagination and ends in cognition. Through the eyes of a musical artist, the EEG signal is a fascinating material to work with. It is complex, densely rhythmic, and able to reveal hidden aspects of the human experience. This is also what art does - it reveals something beautiful or important about the human experience.


This work presents 3 pieces of music created from the EEG signal using different approaches. They may be played and looped as a playlist in any order, or as standalone pieces for any purpose fitting the Organ Music theme.


The author has been developing Brain-Computer Music Interface (BCMI) systems for his doctoral research project for the past few years. These systems transform electroencephalography (EEG) signals originating from the brain and map them to musical or visual outputs in real time. In this work, EEG signals collected from 3 musicians in the autumn of 2022 were used to control various aspects of the 3 resulting pieces of music. During the data collection, each musician was tasked to improvise freely for any duration of time, but to alternate between two contrasting expressive intentions - excitement or relaxation - every 30 to 60 seconds. These contrasting conditions create different patterns in the EEG, which can be detected in real time and mapped to define musical parameters such as tempo, rhythm and pitch, or to control timbral and spatial parameters such as distortion and reverb.
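
To make the mapping concrete, here is a minimal sketch in Python of the kind of transformation a BCMI system performs: extracting band power from an EEG channel and scaling it into a musical parameter. The sampling rate, frequency bands, tempo range, and the beta/alpha arousal proxy are my own illustrative assumptions, not the actual system used in the research.

```python
# A minimal sketch of an EEG-to-music mapping, NOT the author's system.
# numpy and scipy.signal.welch are real library calls; the band choices
# and tempo range are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed EEG sampling rate in Hz

def band_power(eeg_channel, fs=FS, band=(8.0, 12.0)):
    """Average spectral power of one channel within a frequency band."""
    freqs, psd = welch(eeg_channel, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def map_to_tempo(alpha_power, beta_power, lo=60, hi=140):
    """Map a beta/alpha ratio (a rough arousal proxy) to a BPM value."""
    ratio = beta_power / (alpha_power + 1e-12)
    level = ratio / (1.0 + ratio)  # squash into 0..1
    return lo + level * (hi - lo)

# Example with synthetic noise standing in for one EEG channel:
rng = np.random.default_rng(0)
channel = rng.standard_normal(FS * 30)  # 30 seconds of stand-in data
alpha = band_power(channel, band=(8.0, 12.0))
beta = band_power(channel, band=(13.0, 30.0))
print(f"tempo: {map_to_tempo(alpha, beta):.1f} BPM")
```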


Track 1: Sparks of Imagination (12:43)

The music was made using the brainwaves of the classical pianist Hailong Zhang, performing free improvisation. His recorded EEG signals (12 channels) were streamed through a BCMI system and mapped to MIDI commands controlling 4 different virtual electric piano instruments within music software. Brainwaves associated with expressed levels of Distress, Excitement, Depression, and Relaxation (based on the original published data analysis, which used 32 channels) were used to control the master volume faders of each of the 4 pianos. Each piano was assigned an arpeggiator, each with a different rhythmic ratio. The number of octaves over which the arpeggiation takes place was defined by the combined, modulating levels of Distress and Excitement. The amount of reverb added to the master output was similarly mapped to the combined, modulating levels of Depression and Relaxation. All 4 virtual pianos were armed to record incoming notes from a MIDI keyboard, as well as automation curves representing the brain-controlled parameters described above as they modulated over time.
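
A hedged sketch of this kind of affect-to-fader mapping follows, assuming the mido library and one MIDI channel per piano. The four state names come from the description above; the channel assignments, the use of CC 7 for volume, and the octave formula are illustrative guesses.

```python
# A sketch of mapping four affective-state levels to the volume faders of
# four virtual pianos (one per MIDI channel), NOT the author's exact patch.
import mido

# Assumed channel assignments, one state per piano:
STATE_CHANNELS = {"distress": 0, "excitement": 1, "depression": 2, "relaxation": 3}

def fader_messages(state_levels):
    """Turn state levels in 0..1 into CC 7 (channel volume) messages."""
    msgs = []
    for state, level in state_levels.items():
        value = max(0, min(127, int(level * 127)))
        msgs.append(mido.Message("control_change",
                                 channel=STATE_CHANNELS[state],
                                 control=7, value=value))
    return msgs

def octave_range(distress, excitement, max_octaves=4):
    """Combined Distress + Excitement sets how many octaves the arpeggio spans."""
    return 1 + int((distress + excitement) / 2 * (max_octaves - 1))

# One update step with made-up levels; a real run would stream these from
# the BCMI classifier and send the messages through mido.open_output().
levels = {"distress": 0.2, "excitement": 0.8, "depression": 0.1, "relaxation": 0.5}
for msg in fader_messages(levels):
    print(msg)
print("arpeggio octaves:", octave_range(levels["distress"], levels["excitement"]))
```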


As a result, this piece is made up of a simple chord progression repeated 3 times, performed on the MIDI keyboard, interacting with the brainwaves of the pianist to embellish and shape the sound with a mix of dense arpeggiated polyrhythms, varying pitch ranges and spatial qualities.


Track 2: Thinkin’ ’bout Jazz (2:33)

The music was made using the brainwaves of the classical saxophonist Arvydas Kazlauskas, performing free improvisation. His recorded EEG signals were streamed through a BCMI system and mapped to trigger MIDI notes whenever a synchronisation event took place between any pair of electrodes (8 channels in total) placed on his head. Synchronisation events were defined as moments when specific spectral power frequencies occurred simultaneously at the same amplitude in different regions of the brain. Speculatively, a human brain engaged in the creative act of musical improvisation was expected to produce a high number of synchronisation events compared to a resting state or a score-playing task.
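
As a rough illustration of how such a detector might look: compare the power spectra of every electrode pair within an analysis window and flag frequencies where the amplitudes nearly match. The window length, tolerance, and pairing scheme below are speculative placeholders, not the detector used in the piece.

```python
# A speculative sketch of synchronisation-event detection across 8 channels.
import itertools
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate

def sync_events(window, fs=FS, tol=0.05):
    """window: (n_channels, n_samples). Returns (ch_a, ch_b, freq) triples."""
    freqs, psd = welch(window, fs=fs, nperseg=fs, axis=-1)
    events = []
    for a, b in itertools.combinations(range(window.shape[0]), 2):
        # relative power difference per frequency bin
        diff = np.abs(psd[a] - psd[b]) / (np.maximum(psd[a], psd[b]) + 1e-12)
        for idx in np.where(diff < tol)[0]:
            events.append((a, b, freqs[idx]))
    return events

rng = np.random.default_rng(1)
eeg = rng.standard_normal((8, FS * 2))  # 8 channels, 2 s of stand-in data
print(f"{len(sync_events(eeg))} candidate events in this window")
```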


The MIDI notes were initially mapped to a C major scale and received in music software by 3 virtual electronic pianos, each with different musical rules defining the rhythmic quantisation, arpeggiation, transposition and chord-trigger harmonies applied to each incoming synchronisation event. In this recording the rules drew on jazz rhythms and harmonies, but the phrasing and sequence of notes were determined by the timing and location of synchronisation events occurring in the saxophonist’s brain. The result is a short, playful piece that resembles jazz in some respects but is rhythmically and melodically baffling, and a little bit humorous.
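
A minimal sketch of such a rule layer, assuming a C major pitch pool and per-instrument quantisation grids; the degree-selection scheme and grid values here are invented for illustration only.

```python
# A sketch of per-instrument musical rules applied to synchronisation events.
C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # MIDI note numbers, C4..B4

def scale_note(degree, transpose=0):
    """Diatonic degree (0 = C4) to a MIDI note, wrapping up by octaves."""
    octave, step = divmod(degree, len(C_MAJOR))
    return C_MAJOR[step] + 12 * octave + transpose

def event_to_notes(pair_index, event_time, grid=0.25, transpose=0, chord=False):
    """Map an electrode-pair index and timestamp to (notes, quantised onset)."""
    degree = pair_index % len(C_MAJOR)
    onset = round(event_time / grid) * grid  # snap onset to the rhythmic grid
    if chord:
        notes = [scale_note(degree + i, transpose) for i in (0, 2, 4)]  # triad
    else:
        notes = [scale_note(degree, transpose)]
    return notes, onset

# Three instruments applying different rule sets to the same event:
for rules in ({"grid": 0.25}, {"grid": 0.5, "transpose": 12}, {"grid": 1.0, "chord": True}):
    print(event_to_notes(pair_index=9, event_time=1.37, **rules))
```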


Track 3: Neon Wavelengths (10:52)

The music was made using the brainwaves of the electronic musician Vadims Pitlans, freely improvising on an array of analogue drum machines and sequencers. Speculatively, the creative process and body movement involved in performing electronic music differ from those of academic music, yet the same expressive goals can be achieved through different fundamental approaches, making the electronic musician a unique case study. As in the previous piece, his recorded EEG signals were streamed through a BCMI system and mapped to trigger MIDI notes whenever synchronisation events occurred between any of the 8 channels. In this piece, synchronisation events were mapped to trigger MIDI pitches making up a C major 7 chord. These MIDI notes were received by 5 virtual instruments within music software: Drum kit, Marimba, Vibraphone, Synth Pad, and Physical Modelling Synth. Each was assigned different rhythmic quantisation, arpeggiation and transposition rules to apply to each incoming synchronisation event. These rules were applied with the intention of creating an ambient piece of music, with an unexpected flow of broken rhythms voiced by the drum kit.


The author manipulated the structure of the music by adding or subtracting the signals from electrodes one by one. Since synchronisation events between different electrodes triggered different notes already in a triadic relationship (Cmaj7), chords could be intentionally formed by adding or subtracting specific electrodes during the recording. Note velocity was defined by the amplitude of each synchronisation event, so the raw EEG signals could be monitored to anticipate louder articulations during the process. The result is a balance between improvisation taking place within the electronic musician's brain and within the author's own.
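
A sketch of this structural control under simple assumptions: an event only sounds when both of its electrodes are currently enabled, and velocity scales with event amplitude. The Cmaj7 pitch pool matches the description above; the activation workflow and amplitude scaling are illustrative.

```python
# A sketch of chord formation by enabling/disabling electrodes, with
# velocity driven by event amplitude; illustrative, not the actual setup.
CMAJ7 = [60, 64, 67, 71]  # C, E, G, B

active_electrodes = {0, 2}  # the author enables/disables these over time

def play_event(electrode_a, electrode_b, amplitude, amp_max=1.0):
    """Return (note, velocity) if both electrodes are active, else None."""
    if electrode_a not in active_electrodes or electrode_b not in active_electrodes:
        return None
    note = CMAJ7[(electrode_a + electrode_b) % len(CMAJ7)]
    velocity = max(1, min(127, int(amplitude / amp_max * 127)))
    return note, velocity

print(play_event(0, 2, amplitude=0.6))  # sounds: both electrodes active
print(play_event(0, 5, amplitude=0.9))  # silent: electrode 5 is muted

# Enabling a third electrode lets new pitch classes enter, forming chords:
active_electrodes.add(4)
print(play_event(0, 4, amplitude=0.4))
```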




For more information on the research behind these works, please visit my research page.


