ABC News Story

Barbara Miller, ABC, Wednesday, September 19, 2012

Computer games and videos are already used extensively in hospitals to try to distract children who are undergoing treatment. Now doctors in Sydney are taking things one step further in an attempt to help young patients to manage their pain. They’re developing a computer application that rewards children with attractive images when they calm their nerves and lower their heartbeat. Barbara Miller has been finding out more…

Posted in medical research, prototype

Exhibition at the Children’s Hospital at Westmead, Dec. 12 – 16

Our next prototype exhibition at the Children’s Hospital at Westmead will be in December, from Monday the 12th to Friday the 16th, and will be held in the Galleria space (main reception area between Audiology and the Chemist’s), between 10am and 4pm every day.

We’ll be inviting children and young people between 7 and 17 to try out two of our heart-rate controlled iPad and iPod apps (as shown in our November UTS exhibition), and then asking them to describe their experience and impressions of the interaction. Our aim is to learn more about the ways in which they respond to this type of interaction, the language they use, and the extent to which these interactions can support relaxation and focus (i.e. how long before it’s ‘boring’).

This exhibition will also be an opportunity to showcase our research so far to other staff at the hospital – and on Thursday the 15th, we will be hosting a reception for Australian Network for Art & Technology (ANAT) Program Manager Vicki Sowry, who has been managing the Synapse residency program of which BrightHearts has been a part.

What: Exhibition of prototype biofeedback iPhone/iPad apps
When: Monday December 12th – Friday December 16th, 2011, 10am – 4pm
Where: The Galleria, Children’s Hospital at Westmead,
212 Hawkesbury Road, Westmead NSW 2145

Reception Event with Vicki Sowry,
Australian Network for Art & Technology:

Thursday December 15th, 5pm-7pm
The Function Room (next to staff canteen on Ground Floor)
The Children’s Hospital at Westmead

Posted in exhibition, observation, prototype

Latest iPad Stills

Some stills from the latest version of the visuals on show at UTS DAB Lab.


Posted in exhibition, prototype

Heart Rate Variability Analysis Software

As part of Dr Andrew Johnston’s Extreme Programming workshop at the UTS School of Software, I have been working as a ‘client’ with Tuan Manh Vu, Lukas Blödorn, Christoph Eibisberger and Monette Tan on a cross-platform application for real-time power spectral density (PSD) analysis of heart rate variability data.

What I have been searching for is a way to track resonant behaviour in heart rate patterning – behaviour that is especially pronounced during slow (around or below six breaths per minute), relaxed breathing (i.e. breath-based meditation, and in some cases, feelings of love/connection/appreciation/spirituality etc.). Some researchers and biofeedback enthusiasts have called this the ‘meditators’ peak’, since it becomes particularly pronounced during many forms of breath-based (tan-den) meditation techniques, e.g. yoga, zazen, chanting etc. The potential to create what David Rosenboom has called “attention-dependent” structures continues to fascinate me – and I’m looking forward to seeing what I can build using these new tools.

Above: example of a power spectral density analysis display of heart rate variability (HRV), showing baseline recording (bottom), early stages of resonance (middle), and full resonant ‘meditators’ peak’ (top).
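By way of illustration only (the workshop application itself is written in C++), the phenomenon is easy to simulate: a heart-rate series oscillating at 0.1 Hz (six breaths per minute) concentrates nearly all of its spectral power at that one frequency. A stdlib-only Python sketch:

```python
import cmath
import math

def dft_power(signal, freq_hz, fs):
    """Spectral power of `signal` (sampled at `fs` Hz) at one frequency,
    via a naive single-bin DFT."""
    n = len(signal)
    acc = sum(x * cmath.exp(-2j * math.pi * freq_hz * k / fs)
              for k, x in enumerate(signal))
    return abs(acc) ** 2 / n

fs = 1.0        # heart-rate series resampled at 1 Hz
duration = 240  # seconds
# Simulated series: 70 bpm baseline plus a resonant swing at 0.1 Hz
hr = [70 + 5 * math.sin(2 * math.pi * 0.1 * t) for t in range(duration)]
mean_hr = sum(hr) / len(hr)
detrended = [x - mean_hr for x in hr]

p_peak = dft_power(detrended, 0.10, fs)   # at the breathing frequency
p_lo = dft_power(detrended, 0.05, fs)     # away from it
p_hi = dft_power(detrended, 0.25, fs)
# p_peak dwarfs p_lo and p_hi: the 'meditators' peak' described above
```

Real HRV analysis works on irregularly spaced beat-to-beat intervals, so the resampling and detrending steps matter far more than they do in this toy example.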

I’ve been pursuing this method for many years now – but for one reason or another I’ve never quite got there. It’s one thing to obtain a frequency-domain (i.e. spectrum) analysis of these heart rate patterns – but it’s another to extract data from this display that you can use for an interactive artwork/biofeedback display! There are plenty of applications that can perform these analyses – but currently none of them offer a way to transmit this data in real time to other applications. In this project we are using OpenSoundControl to send and receive data between the analysis app and Max-MSP – which I’m using to transmit the sensor signal, and to receive the completed spectrum and resonance analysis.
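For anyone wanting to replicate the plumbing: an OSC message is just a small byte packet – a null-padded address string, a type-tag string, then big-endian arguments. A minimal encoder (the `/hrv/resonance` address is invented for this sketch; it isn’t the address our application actually uses):

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per OSC 1.0."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)   # big-endian 32-bit float
    return msg

# e.g. resonance strength (percent) and its centre frequency (Hz)
packet = osc_message("/hrv/resonance", 62.5, 0.1)
```

Sent as a UDP datagram, this is all a receiving patch needs to reconstruct the two floats – Max-MSP reads such packets with its udpreceive object.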

From the outset I also wanted the code to be as portable as possible: able to be deployed across a wide variety of operating systems (Windows, OS X and iOS especially). The project required the students to familiarise themselves with a range of new problems: OpenSoundControl inter-application networking and data transmission, bio-statistical analyses and C++.

The outcome is an application, run from the Terminal, that I can use to measure the amount of resonance in three separate bands of the heart rate variability spectrum. This will enable me to create interactions that respond to very specific breath rates and affective (emotional/mental/attentive) states: providing not only a measure of resonance (measured as a percentage of total energy concentrated around a given frequency), but also the specific frequency of that resonance (which could be used to synchronise music, movement, effects etc. in time with each person’s specific resonant frequency).
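The ‘percentage of total energy concentrated around a given frequency’ measure can be sketched as follows. The three bands here are the conventional HRV bands (VLF/LF/HF) – an assumption on my part, since the post doesn’t name its bands – and the spectrum is synthetic:

```python
def band_resonance(freqs, power, band, width=0.01):
    """Find the peak frequency within `band` (f_lo, f_hi) and return it
    together with the share of *total* spectral power lying within
    +/- `width` Hz of that peak, as a percentage."""
    total = sum(power)
    in_band = [(f, p) for f, p in zip(freqs, power) if band[0] <= f <= band[1]]
    peak_f = max(in_band, key=lambda fp: fp[1])[0]
    local = sum(p for f, p in zip(freqs, power) if abs(f - peak_f) <= width)
    return peak_f, 100.0 * local / total

# Synthetic spectrum: bins every 0.005 Hz with energy piled up at 0.1 Hz
freqs = [i * 0.005 for i in range(81)]                  # 0 .. 0.4 Hz
power = [100.0 if abs(f - 0.1) < 0.004 else 1.0 for f in freqs]

vlf, lf, hf = (0.003, 0.04), (0.04, 0.15), (0.15, 0.4)  # assumed bands
peak_f, pct = band_resonance(freqs, power, lf)
# peak_f lands on 0.1 Hz; pct reports how concentrated the energy is there
```

The returned peak frequency is what could drive the synchronisation of music or visuals to each person’s resonant rate.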

The code is still in ‘alpha’ state, but I’ll upload a version of it soon, and my plan is to maintain it as an open-source project so that other interaction researchers can use and maintain it.

Posted in software

Exhibition at UTS DAB Lab, Nov 2nd – 26th

In this exhibition, visitors are invited to interact with a collection of prototype iPhone and iPad ‘apps’ that translate changes in heart rate and skin temperature into colourful geometric artworks. These interactions have been designed by George as part of the Bright Hearts research project with Dr Angie Morrow, currently under way at The Children’s Hospital at Westmead.

The BrightHearts project is researching the potential of these meditative interactions for the management of pain and anxiety experienced by children undergoing painful, recurrent clinical procedures. Currently in its preliminary design-research phase, the next stage of the project will involve a pilot study and clinical trial in 2012-2013.


Above: visitors interacting with iPad and iPod prototypes as installed at DAB Lab Gallery. Photography by Paul Pavlou.

Above: examples of the iPod/iPad visuals developed for the UTS DAB Lab exhibition. Changes in heart rate are mapped to colour using a simple, intuitive colour gradient – increases in heart rate make the colours warmer (i.e. yellow, orange, red) – decreases make them cooler (aqua, cyan, deep blue). The shapes expand when heart rate increases, and contract when heart rate decreases – to correspond with the sensations of inhalation and exhalation.
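As a sketch of that mapping (the actual gradients and ranges in the apps are George’s designs; the ±10 bpm span and 30% size swing below are placeholder numbers for illustration):

```python
def hr_to_colour(delta_bpm, span=10.0):
    """Map a heart-rate change (bpm relative to baseline) onto a simple
    cool-to-warm RGB gradient: blue/cyan below baseline, white at rest,
    yellow/orange above it."""
    t = max(-1.0, min(1.0, delta_bpm / span))   # clamp to [-1, 1]
    if t >= 0:   # warming: white toward yellow/orange
        return (1.0, 1.0 - 0.5 * t, 1.0 - t)
    else:        # cooling: white toward aqua/deep blue
        return (1.0 + t, 1.0 + 0.5 * t, 1.0)

def hr_to_scale(delta_bpm, span=10.0):
    """Shapes expand as heart rate rises, contract as it falls."""
    t = max(-1.0, min(1.0, delta_bpm / span))
    return 1.0 + 0.3 * t
```

Clamping the input keeps an agitated (or very calm) reading from driving the visuals off the gradient.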

Presented by the Australian Network for Art and Technology (ANAT) in association with the Australian Government through the Australia Council for the Arts, its arts funding and advisory body, and supported by the James N. Kirby Foundation, The Children’s Hospital at Westmead – Kids Rehab Department, and University of Technology Sydney: Interaction Design & Human Practices (IDHuP) Lab, and Interactivation Studio.

Above: Dr Angie Morrow (Kids Rehab, Children’s Hospital at Westmead) and daughter Roisin, with George Khut at the exhibition launch, UTS, Sydney, 2011.

Posted in Uncategorized

Wireless Pulse Sensing with Nonin PureSat, Arduino and Xbee

In previous projects I’ve been using a wireless ECG heart rate sensor produced by Vernier: a very reliable sensor that you hold in both hands, and that I have also adapted into a table-top interface by embedding the four contact electrodes into the table surface (Distillery/Alembic & Retort, 2009).

With this project at Children’s Hospital Westmead, I wanted to explore ways to measure heart rate without requiring people to use both hands (which could be difficult, cumbersome, or impossible for some participants).

Pulse oximeters are a commonly used technology in most modern hospitals, used primarily to monitor blood oxygenation, but also heart rate. The basic principles are simple: shine an infra-red light into the finger/earlobe, and then measure how much of this light is absorbed by the (red) oxygen-saturated blood as it rushes into the area beneath the sensor. Pulse oximetry is concerned with the amount of oxygen in the blood, while pulse plethysmography is concerned with the actual pulsing signal – so when it’s heart rate that is the main signal being measured, these sensors are referred to as IR pulse plethysmographs, commonly abbreviated to ‘PPG’.

Whilst simple in theory, there are many factors that make it difficult to obtain a reliable and accurate signal with basic IR sensing technology: changes in ambient lighting conditions, movement artefacts, poor perfusion (circulation), and a signal range that fluctuates with changes in blood pressure – complicating the process of interpreting ‘beats’ from a rhythmic but often highly variable signal.
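To make the beat-detection problem concrete, here’s a toy detector in Python – nothing like production oximeter firmware, but it shows two standard tricks: an adaptive threshold (midpoint of a running min/max, which absorbs slow baseline drift) and a refractory period (so one pulse isn’t counted twice):

```python
import math

def detect_beats(ppg, fs, refractory_s=0.3):
    """Flag a beat when the signal crosses above an adaptive threshold
    (midpoint of a running min/max), then hold off for a refractory
    period so a single pulse isn't double-counted."""
    beats = []
    window = int(2 * fs)              # ~2 s window for the running range
    holdoff = int(refractory_s * fs)
    last_beat = -holdoff
    for i in range(1, len(ppg)):
        seg = ppg[max(0, i - window):i + 1]
        lo, hi = min(seg), max(seg)
        thresh = lo + 0.5 * (hi - lo)
        if ppg[i - 1] < thresh <= ppg[i] and i - last_beat >= holdoff:
            beats.append(i)
            last_beat = i
    return beats

# Synthetic PPG at 50 Hz: a 75 bpm pulse (1.25 Hz) plus slow baseline drift
fs = 50
ppg = [math.sin(2 * math.pi * 1.25 * i / fs)
       + 0.5 * math.sin(2 * math.pi * 0.05 * i / fs)
       for i in range(fs * 20)]       # 20 seconds
beats = detect_beats(ppg, fs)
bpm = 60 * (len(beats) - 1) / ((beats[-1] - beats[0]) / fs)
```

On a clean synthetic signal this recovers the pulse rate; real fingertip signals add all the noise sources listed above.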

Last year I started doing tests with a PPG sensor made/distributed by J&J Engineering: integrating it into an Arduino board and receiving its signal via an Xbee wireless serial port connection to Max-MSP. You can read more about this project, and check out the Arduino code, on my website. The breath sensor was brilliant, but the PPG heart rate sensor was still prone to missing beats and to fluctuations with finger and blood pressure (i.e. how tight the finger clip was, and how much the signal varied with changes in blood pressure).

This year, after shopping around for professional pulse oximetry/PPG equipment, I settled on using Nonin’s PureSAT and OEMIII products.

Above: Nonin OEMIII chip, used to obtain pulse data.

The OEMIII is a miniature circuit that is used to drive and transmit sensor data from the PureSAT oximeters. I use the OEMIII to perform the basic beat detection work, which can otherwise become quite challenging in real-world interaction contexts. I commissioned Angelo Fraietta (Smart Controller) to develop a system that would enable us to interface the serial data from the OEMIII chip with Max-MSP, using Arduino and OpenSoundControl (OSC). I should also mention that these Nonin devices are TGA (Australian Therapeutic Goods Administration) approved medical devices – which simplifies some of the administrative and safety requirements we are obliged to comply with at the hospital.
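I won’t reproduce Nonin’s documented serial format here, but the general shape of the problem Angelo solved – re-synchronising on framed serial data before values can be forwarded on as OSC – looks something like this. The 3-byte frame layout below is entirely invented for illustration; it is NOT the OEMIII protocol:

```python
def parse_frames(stream: bytes):
    """Parse a *hypothetical* framed serial stream: each frame is three
    bytes, [0xFF sync][status][heart rate in bpm]. Re-syncs by sliding
    forward one byte at a time until a sync byte is found."""
    rates = []
    i = 0
    while i + 2 < len(stream):
        if stream[i] != 0xFF:        # not aligned: slide forward one byte
            i += 1
            continue
        status, bpm = stream[i + 1], stream[i + 2]
        if status == 0x00:           # 0x00 = 'signal good' in this toy format
            rates.append(bpm)
        i += 3
    return rates

# Two good frames, one flagged bad, with a stray byte to re-sync past
data = bytes([0x42, 0xFF, 0x00, 72, 0xFF, 0x01, 0, 0xFF, 0x00, 74])
# parse_frames(data) -> [72, 74]
```

Dropping frames whose status byte flags a poor signal is what keeps spurious heart-rate values out of the interaction downstream.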

Above: the Nonin PureSAT pulse oximeters – Ear clip (left) and Finger clip (8000SS). Photos by Julia Charles.

The next step was to design a housing for the electronics. Frank Maguire was commissioned to integrate Angelo’s initial prototype into an interface that could be used with an Apple iPhone/iPod. Ultimately, we’ll be embedding all the sensor hardware into a single plug/dongle-type device that will plug directly into the iPhone/iPad, but for this preliminary prototyping stage our main criterion was that it should provide an all-in-one interaction experience: nestled in your palm, underneath the iPhone. I was also interested in a semi-modular system that could be inserted into other objects (i.e. toys, night-lights, soft sculptures etc.).

Posted in electronics, prototype, software, Uncategorized

Position Paper for OZCHI

The BrightHearts Project: A New Approach to the Management of Procedure-Related Paediatric Anxiety

Dr George Poonkhin Khut, Dr Angie Morrow, Staff Specialist, and Dr Melissa Yogui Watanabe.

This is a position paper we’ve submitted for two workshop events at OZCHI in November-December 2011, in Canberra, ACT. OzCHI is Australia’s leading forum for work in all areas of Human-Computer Interaction. “OzCHI attracts an international community of practitioners, researchers, academics and students from a wide range of disciplines including user experience designers, information architects, software engineers, human factors experts, information systems analysts, social scientists and managers.” – http://www.ozchi.org/

Writing this position paper has provided a great opportunity to place the project in the context of other research that has been done in the area of paediatric acute pain management and procedure-related anxiety. The next step will be to articulate the key design problems, issues and proposed methods that we’ll be working with, and how these relate to the issues identified in this paper…

ABSTRACT

We survey existing approaches to the management of procedure-related pain and anxiety, including recent research utilising Virtual/Augmented Reality Distraction techniques, and then outline an approach that uses a biofeedback controlled interactive artwork as a focus for children to explore how they can regulate aspects of their psychophysiology (autonomic nervous system responses) through a combination of breath and attentive focus. Our research aims to assess the potential of small, portable biofeedback-based interactive artworks to mediate the perception and performance of the body in paediatric care: as experienced by children undergoing painful recurrent procedures.

KEYWORDS: Biofeedback, Biofeedback Assisted Relaxation Training, Paediatric, Acute, Pain, Distraction, Cognitive-Behavioural, Heart Rate, Temperature, Multimedia, Primary Care

Download the full paper here

Posted in medical research

Researching Clinician Experiences

Melissa Yogui and I just had a meeting with Professor Toni Robertson at the UTS Interaction Design and Human Practices (IDHuP) Lab, to discuss some discussion groups we have scheduled with staff at Children’s Hospital Westmead. Toni has worked and published extensively on human-centred and participatory design methods, including work in the area of medicine and health care.

For these meetings our aim is to gain a deeper understanding of the working conditions in the clinics, basic logistics (how long a procedure takes, how much waiting is involved for clients, how many people are in the room at a given time etc.) and the varieties of coping/non-coping styles they encounter in their work with children undergoing painful, recurrent procedures.

Here are some of Toni’s recommendations:

  1. Do a series of pilot meetings – to rehearse and test out our facilitation skills, the appropriateness of our agenda, and our capacity to incorporate important follow-up questions etc.;
  2. Send the participants an email a few days in advance – with mental ‘homework’ to prepare for the meeting – to think of a short story that they can tell us about a typical or atypical client interaction or situation in their clinic;
  3. Recruit a note taker to take notes during the meetings – as we will be too busy making sure it runs smoothly, and responding to answers with follow-up questions etc. [not feasible at this stage as we would have had to include this info in our ethics application!];
  4. Get the participants to generate the success criteria that they would use to evaluate the success of the project (stakeholder-identified success criteria/KPIs): “How would it be if this worked?”, “How would you know that this device was working?”;
  5. Use prototypes (cardboard/paper, electronic etc.) to establish a shared (concrete) language across disciplines (design, art, medical, psychological etc.);
  6. Send the participants a copy of the Personas and Scenarios we develop from the stories they tell us – and invite them, if they are interested, to provide feedback re the accuracy or viability of these personas and scenarios as tools to help us in the design process;
  7. Think about which research questions can involve anyone (i.e. we all have bodies, we can all have a response to an artwork etc.); and which questions can only be answered by the children, their families or the staff working with them – which questions are specific to the unique situations these people inhabit (that we may not). The point here is to make the most of what each specific group has to offer.

Posted in observation, participatory design

Week 2 – Child-Life Therapy

This week I met with Child-Life Therapist Sandra Pengilly, who told me about the work that she and her staff do with kids as part of the Child-Life Therapy Team at CHW.

Child-Life Therapists used to be called ‘Play Therapists’. They use human-centred approaches to help children understand and negotiate their experience at hospital. Working with children, parents and clinical teams, they design and implement processes to empower children, helping them to manage the fear, pain and uncertainty that accompany serious illness and hospitalisation.

Child-Life Therapists work with children across the hospital; at Kids Rehab, they are generally engaged only for children who show signs of extreme anxiety in relation to examinations and treatments. “Medical Play” is one process they use – role play with soft toys that helps children to model their concept of the treatment experience and develop a sense of agency in relation to their experiences at hospital.

Another important tool they use is the production of a special book for each child, e.g. “My Botox Story” (a few pages from this are included above), which they give to the clinical team prior to each treatment. The booklet describes each stage of the treatment, and enables the child to make some choices about the way in which the procedure takes place. The clinical team then use this booklet to inform the treatment process – giving children a sense of choice and control over a few simple things, such as how many people are in the room at the start of the procedure (before the laughing gas), and whether they sit up or lie on their back (adults look bigger when you are lying on your back – and this can feel intimidating).

This use of rehearsal and reflection on experience (in Medical Play) interests me a great deal, and is something I’ve been exploring with The Heart Library Project, which I developed in 2008 in collaboration with Caitlin Newton-Broad, where we would invite people to represent some aspects of the experience they had just had with the work (as a reflection of their own physiology). What interests me about these processes of modelling and representing experience through imaginative play, illustration, and conversation is the way they integrate different experiences and values, embedding and re-framing knowledge/experience. Neurons that fire together wire together!

Posted in child-life therapy

Week 1 – Clinical Observations, Botox

This week I observed Botox and intrathecal Baclofen pump change clinics at Kids Rehab. My aim in observing these clinics was firstly to gain some insights into the operating conditions experienced by people in these clinics, the narrative of the patient’s visit and staff interactions, and the ergonomic factors that I’d need to address (five adults gathered around a child/young person on the bench). During my residency I also plan to observe Dialysis and Burn Dressing clinics at the hospital.

The clients that I observed at these clinics represented a broad range of abilities: from severe physical and intellectual disability, to relatively mild spasticity, and people with severe physical disabilities without intellectual/developmental disabilities – who were able to communicate with their parent/carer via facial gestures etc. It was a very humbling experience to listen to and observe these children/young people and their parents/carers – many of whom have been using the clinic for many years.

There was also a diverse mix of ethnicities across the families visiting the clinic. Most of the patients I observed have been coming to these clinics for some time now (Botox injections are topped up every six months). As part of this observation process I documented the duration of each stage of the process, and made drawings describing the configuration of the furniture (treatment benches, chairs etc.), the room and the people in it, to help me get a sense of what could be possible as far as an interactive experience goes: how much time and space would the children/young people (and their carers) have to engage with the work that I am developing.

The Botox clinics take place in three stages:

  1. Consultation with the medical team, client and their carer, during which doctors and physios/occupational therapists observe, assess and discuss movement abilities and progress since the last treatment – and identify and prepare the injection sites with a local, topical anaesthetic (EMLA cream) – anywhere between 10 and 20 minutes;
  2. Waiting for the EMLA cream to take effect – usually spent in the waiting room/reception area at Kids Rehab, watching videos, playing games, waiting with parents etc. – about an hour;
  3. The injection procedure itself, in which Botox is injected into specific locations inside the muscles, using an ultrasound device to monitor the location and depth of the injection – between 5 and 20 minutes.

On the whole, the impression I left with was that there is not a whole lot of room in these clinics (a lot of people gathered around the child/young person being treated), and that whatever I design would need to accommodate clients with severe dystonia (involuntary movements of hands, arms, legs etc.) – which presents some interesting challenges as far as obtaining reliable heart-rate signals goes. The waiting room looked like one of the more promising situations into which I could introduce an interactive artwork – where clients would have time to explore, and an opportunity to prepare for their procedure.

Posted in botox, observation