As part of Dr Andrew Johnston’s Extreme Programming workshop at the UTS School of Software, I have been working as a ‘client’ with Tuan Manh Vu, Lukas Blödorn, Christoph Eibisberger and Monette Tan on a cross-platform application for real-time power spectrum density (PSD) analysis of heart rate variability data.
What I have been searching for is a way to track resonant behaviour in heart rate patterning – a resonance that is especially pronounced during slow (around or below six breaths per minute, i.e. roughly 0.1 Hz), relaxed breathing, such as breath-based meditation and, in some cases, feelings of love, connection, appreciation or spirituality. Some researchers and biofeedback enthusiasts have called this the ‘meditators’ peak’, since it becomes particularly pronounced during many forms of breath-based (tanden) meditation technique, e.g. yoga, zazen and chanting. This potential to create what David Rosenboom has called “attention-dependent” structures continues to fascinate me – and I’m looking forward to seeing what I can build using these new tools.
Above: example of a power spectrum density analysis display of heart rate variability (HRV), showing a baseline recording (bottom), the early stages of resonance (middle), and a full resonant ‘meditators’ peak’ (top).
I’ve been pursuing this method for many years now – but for one reason or another I’ve never quite got there. It’s one thing to obtain a frequency-domain (i.e. spectrum) analysis of these heart rate patterns – it’s quite another to extract data from that analysis that you can use in an interactive artwork or biofeedback display! There are plenty of applications that can perform these analyses – but currently none of them offer a way to transmit the data in real time to other applications. In this project we are using OpenSoundControl (OSC) to send and receive data between the analysis app and Max-MSP – which I’m using to transmit the sensor signal, and to receive the completed spectrum and resonance analysis.
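To give a sense of the plumbing, here is a minimal sketch of that exchange – my own illustration rather than the project’s actual code. It assumes the oscpack C++ library and UDP transport, and the OSC address, port number and message layout are all hypothetical:

```cpp
// A minimal sketch (not the project's code) of an analysis app sending
// an OSC message to Max-MSP using the oscpack library. The address
// "/hrv/resonance", port 7400 and message layout are assumptions made
// for illustration only.
#include "osc/OscOutboundPacketStream.h"
#include "ip/UdpSocket.h"

int main()
{
    // Send to Max-MSP listening on the local machine.
    UdpTransmitSocket socket(IpEndpointName("127.0.0.1", 7400));

    char buffer[1024];
    osc::OutboundPacketStream p(buffer, sizeof(buffer));

    // One resonance reading: band index, resonance as a fraction of
    // total energy, and the peak frequency in Hz.
    p << osc::BeginMessage("/hrv/resonance")
      << (osc::int32)1 << 0.62f << 0.1f
      << osc::EndMessage;

    socket.Send(p.Data(), p.Size());
    return 0;
}
```

On the Max-MSP side, a [udpreceive 7400] object feeding [route /hrv/resonance] would unpack the same message.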
From the outset I also wanted the code to be as portable as possible: able to be deployed across a wide variety of operating systems (Windows, OSX and iOS especially). The project required the students to familiarise themselves with a number of unfamiliar areas: OpenSoundControl inter-application networking and data transmission, bio-statistical analysis, and C++.
The outcome is a command-line application (run from the Terminal) that I can use to measure the amount of resonance in three separate bands of the heart rate variability spectrum. This will enable me to create interactions that respond to very specific breath rates and affective (emotional/mental/attentive) states: providing not only a measure of resonance (the percentage of total spectral energy concentrated around a given frequency), but also the specific frequency of that resonance (which could be used to synchronise music, movement, effects etc. in time with each person’s specific resonant frequency).
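To make that measure concrete, here is a rough sketch of the kind of calculation described above – again my own illustration, not the students’ implementation. The band edges and the width of the window around the peak are assumptions (the low-frequency band shown, 0.04–0.15 Hz, is a standard HRV band that contains the ~0.1 Hz resonance):

```cpp
// A rough sketch (not the project's code) of the resonance measure the
// post describes: the share of total spectral energy concentrated in a
// narrow window around the strongest peak within a band. Band edges
// and window width are illustrative assumptions.
#include <cstddef>
#include <cstdio>
#include <utility>
#include <vector>

struct Bin { double freqHz; double power; };  // one PSD bin

// Returns {resonance as a fraction of total energy, peak frequency in
// Hz} for the strongest peak inside [loHz, hiHz].
std::pair<double, double> bandResonance(const std::vector<Bin>& psd,
                                        double loHz, double hiHz,
                                        double windowHz = 0.015)
{
    double total = 0.0;
    std::size_t peak = psd.size();
    for (std::size_t i = 0; i < psd.size(); ++i) {
        total += psd[i].power;
        if (psd[i].freqHz >= loHz && psd[i].freqHz <= hiHz &&
            (peak == psd.size() || psd[i].power > psd[peak].power))
            peak = i;
    }
    if (peak == psd.size() || total <= 0.0)
        return {0.0, 0.0};  // empty band or empty spectrum

    // Sum the energy within +/- windowHz of the peak.
    double f0 = psd[peak].freqHz;
    double around = 0.0;
    for (const Bin& b : psd)
        if (b.freqHz >= f0 - windowHz && b.freqHz <= f0 + windowHz)
            around += b.power;

    return {around / total, f0};
}

int main()
{
    // Toy spectrum with a strong component near 0.1 Hz (six breaths
    // per minute), mimicking the resonant 'meditators' peak'.
    std::vector<Bin> psd;
    for (int i = 1; i <= 100; ++i) {
        double f = i * 0.005;  // 0.005 Hz resolution up to 0.5 Hz
        double p = (i >= 19 && i <= 21) ? 10.0 : 0.2;
        psd.push_back({f, p});
    }

    // Low-frequency band (0.04-0.15 Hz), one plausible choice for one
    // of the three bands the application monitors.
    auto [res, freq] = bandResonance(psd, 0.04, 0.15);
    std::printf("resonance: %.0f%% at %.3f Hz\n", res * 100.0, freq);
}
```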
The code is still in ‘alpha’ state, but I’ll upload a version of it soon, and my plan is to maintain it as an open-source project so that other interaction researchers can use it and help maintain it.