Locked-In Syndrome Communication
How might we design an adaptive + affordable communication system for people with severe motor disabilities?
Assistive communication systems promise a voice to people facing significant physical constraints.
But existing commercial solutions are often expensive, and unable to recognize & adapt to people’s changing conditions and needs.
To address this, we worked with two families in San Diego County to prototype and evaluate the design of an adaptive & affordable multi-modal communication system — one that recognizes both eye gaze and residual body movements as input. Together, we created a functional prototype made with off-the-shelf commercial equipment & novel software, all for less than $1000.
Who: Nadir Weibel, Colleen Emmenegger, Bonnie Chinh, Angie Nguyen, Jacob T Browne, Mary Perkins
When: June 2016 – March 2017
Where: Human-Centered & Ubiquitous Computing Lab, UC San Diego
In collaboration with two families in San Diego
Methodologies Used: Ethnography of assistive + augmentative communication device usage in home and outdoor environments, multi-modal video coding, competitive analysis, interviews, surveys, usability studies for interface design
Tools: Commercial-grade eye trackers (e.g. Tobii 4C, Tobii EyeX), Tobii DynaVox I-12+, ChronoViz (comparable to TechSmith Morae), iMovie, GoPro cameras
Approach: People with severe motor disabilities are not the only primary users of assistive communication devices — often their loved ones and caretakers play an integral role in the setup, cleanup, and continued use of the system. A system that is customizable and adaptable to the fluctuating needs and conditions of the person with severe motor disabilities must also consider the abilities and needs of their family and support network.
In the summer of 2016, we met with two families in San Diego County to conduct an ethnography of the everyday challenges that arise from severe motor disabilities. We sought to understand their communication needs around the household and in the public sphere. This included not only understanding and designing for their daily routines, but also conducting a competitive analysis of the devices the families were currently using. As part of the ethnography, we also surveyed each family's current assistive communication devices.
Then, starting in the fall of 2016, we shifted towards participatory design and usability testing of potential multi-modal interfaces (e.g. using eye gaze in combination with residual thumb and hand movement). We worked with the families to brainstorm and prototype everyday solutions. To solicit feedback from people with locked-in syndrome, we asked a series of binary yes/no questions, to which the person could look up to say “yes” and look down to say “no”. Family members were nearby to help interpret their responses and resolve any ambiguities.
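The look-up/look-down protocol can be sketched in code. This is an illustrative sketch only, not the project's actual software; the threshold values and the normalized gaze coordinate are assumptions, and the "ambiguous" middle band reflects the role family members played in resolving unclear responses.

```python
def classify_response(gaze_y, up_threshold=-0.3, down_threshold=0.3):
    """Map a normalized vertical gaze coordinate to a yes/no answer.

    gaze_y: vertical gaze position, normalized so that negative values
    mean looking up and positive values mean looking down (an assumed
    convention; real eye trackers report screen coordinates).
    Returns "yes", "no", or None when the gaze is too close to neutral
    to call -- in practice, an ambiguous response a helper interprets.
    """
    if gaze_y <= up_threshold:
        return "yes"   # looking clearly upward
    if gaze_y >= down_threshold:
        return "no"    # looking clearly downward
    return None        # ambiguous; defer to a family member
```

Keeping a dead zone between the thresholds avoids forcing a yes/no decision out of small involuntary eye movements.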
Opportunity: We learned that many existing alternative communication solutions for people with severe motor disabilities do not take advantage of a person’s multi-modal capabilities (e.g. residual movement in the right thumb), and are instead focused on one modality alone (often eye gaze). This leads to complications like the Midas Touch problem (when the modalities for scanning and selection are one and the same, leading to accidental “button presses”), and to missed opportunities for letting the individual convey their intentions and thoughts.
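The contrast can be made concrete with a small sketch. This is not the project's implementation; the class names, frame-based dwell counter, and the idea of a thumb-operated switch as the confirming modality are illustrative assumptions. It shows why a gaze-only (dwell) interface suffers from Midas Touch while a multi-modal one does not: in the first, looking is selecting; in the second, gaze only points and a separate residual movement confirms.

```python
class DwellSelector:
    """Gaze-only selection: anything fixated long enough gets selected,
    so merely resting one's eyes on a button can trigger it (Midas Touch)."""

    def __init__(self, dwell_frames=3):
        self.dwell_frames = dwell_frames  # frames of fixation needed to select
        self._target = None
        self._count = 0

    def update(self, gaze_target):
        """Feed one frame of gaze data; return the selected target or None."""
        if gaze_target == self._target:
            self._count += 1
        else:
            self._target, self._count = gaze_target, 1
        if self._target is not None and self._count >= self.dwell_frames:
            self._count = 0
            return self._target  # selected just by looking
        return None


class MultiModalSelector:
    """Gaze points, a thumb switch confirms: looking alone never selects."""

    def update(self, gaze_target, switch_pressed):
        """Return the gazed-at target only when the switch is pressed."""
        return gaze_target if switch_pressed else None
```

With the dwell-only selector, pausing on a "DELETE" button while thinking fires it after a few frames; the multi-modal selector sees the same gaze stream but produces no selection until the thumb switch confirms.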
Role: I was responsible for setting up research protocols, surveys, video recording, and qualitative analysis during the eye tracking usability studies. I also took the lead as the main ethnographer for one of the families we were working with.
Result: We presented the families with a functioning prototype of an adaptive communication system built from off-the-shelf equipment (e.g. a Microsoft Surface and Tobii EyeX/4C eye trackers) and novel software, all for less than $1000.
We are also preparing a paper for journal submission.