We present a case study in the development of a real-time “hyperscanning” auditory interface that transforms moment-to-moment brainwave similarity between interacting dyads into music. Our instrument extends face-to-face communication with a musical stream reflecting an invisible socio-neurophysiological signal. It contributes to the historical context of brain-computer interfaces (BCIs) applied to art and music, but is unique in that it is contingent on the correlation between the dyad’s brainwaves and conveys this information through entirely auditory feedback. We designed the instrument to be i) easy to understand, ii) relatable, and iii) pleasant for members of the general public in an exhibition context. We describe how this context and user group informed our choice of EEG hardware, inter-brain similarity metric, and auditory mapping strategy. We discuss our experience across four public exhibitions, as well as future improvements to the instrument’s design and user experience.
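The abstract does not specify which inter-brain similarity metric the instrument uses. Purely as an illustration, one common hyperscanning measure is a windowed Pearson correlation between the two participants’ EEG signals; the function name, sampling rate, and window length below are assumptions, not the authors’ implementation:

```python
import numpy as np

def interbrain_similarity(eeg_a, eeg_b, fs=256, window_s=2.0):
    """Sliding-window Pearson correlation between two single-channel signals.

    Returns one correlation value (in [-1, 1]) per non-overlapping window.
    fs and window_s are illustrative choices, not from the paper.
    """
    win = int(fs * window_s)
    n_windows = min(len(eeg_a), len(eeg_b)) // win
    sims = []
    for k in range(n_windows):
        a = eeg_a[k * win:(k + 1) * win]
        b = eeg_b[k * win:(k + 1) * win]
        sims.append(np.corrcoef(a, b)[0, 1])
    return np.array(sims)

# Toy example: a 10 Hz "alpha-like" oscillation vs. a noisy copy of it.
t = np.linspace(0, 4, 4 * 256, endpoint=False)
alpha = np.sin(2 * np.pi * 10 * t)
noisy = alpha + 0.5 * np.random.default_rng(0).normal(size=t.size)
sims = interbrain_similarity(alpha, noisy)
```

In an auditory mapping such as the one the abstract describes, each windowed similarity value could then drive a musical parameter (e.g., loudness or pitch of an added stream), though the paper’s actual mapping strategy is behind the paywall.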