As seen on TV: Best Paper Award at CHI 2020 for ACE-LP Publication

Nonspeaking individuals with motor disabilities often rely on typing on a computer with speech output to converse with other people. However, even without a physical disability affecting the typing process, contributing to a conversation via a communication aid is slow and error-prone, with typical typing rates for physically impaired users ranging between 5 and 20 words per minute. This is in contrast to the rate at which people speak, typically between 100 and 140 words per minute. This difference in communication rate is referred to as the communication gap. The gap is currently very large, typically between 80 and 135 words per minute, and affects the quality of everyday interactions for these users.

New research from the Universities of Dundee and Cambridge introduces a context-aware method that has the potential to reduce this communication gap by eliminating between 50% and 96% of the keystrokes the person has to type to communicate. The method uses artificial intelligence to allow a person to quickly retrieve sentences they have typed in the past. Prior research has shown that people relying on speech synthesis to communicate tend to reuse previously typed sentences, a phenomenon observed particularly in personal storytelling, where sentences are reused when retelling stories to new audiences. However, such retrieval or reuse in a communication aid has so far relied on time-consuming and tedious methods. In this new system, as the person is typing, the system automatically retrieves the most relevant previous sentences based on the text typed so far and the context of the conversation the person is involved in. Context includes information such as the location, the time of day, and the automatically identified face and identity of the speaking partner. An early prototype version of the system was demonstrated live on the TV show BBC Click.
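The idea of combining the typed prefix with conversational context can be illustrated with a minimal sketch. This is not the authors' implementation: the data model, the exact context features, and the simple match-counting score are all assumptions made for illustration; the real system uses information-retrieval techniques over the user's sentence history.

```python
from dataclasses import dataclass

@dataclass
class StoredSentence:
    # A previously typed sentence, stored with the context it was used in.
    text: str
    location: str      # e.g. "office", "home" (hypothetical feature set)
    time_of_day: str   # e.g. "morning", "evening"
    partner: str       # identity of the conversation partner

def context_score(s: StoredSentence, location: str, time_of_day: str, partner: str) -> int:
    # Count how many stored context features match the current situation.
    return sum([s.location == location,
                s.time_of_day == time_of_day,
                s.partner == partner])

def retrieve(history: list[StoredSentence], typed_prefix: str,
             location: str, time_of_day: str, partner: str, k: int = 3) -> list[str]:
    # Keep only sentences consistent with what has been typed so far,
    # then rank them by how well their stored context matches the current one.
    candidates = [s for s in history
                  if s.text.lower().startswith(typed_prefix.lower())]
    candidates.sort(key=lambda s: context_score(s, location, time_of_day, partner),
                    reverse=True)
    return [s.text for s in candidates[:k]]

history = [
    StoredSentence("How was your weekend?", "office", "morning", "alice"),
    StoredSentence("How was the film last night?", "home", "evening", "bob"),
]
# With the same prefix typed, the sentence whose stored context matches the
# current situation (office, morning, alice) is ranked first.
suggestions = retrieve(history, "How was", "office", "morning", "alice")
```

The design choice this sketch highlights is that context acts as a tie-breaker among sentences matching the typed text, so the person needs to type only a few keystrokes before the most situationally relevant sentence surfaces.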

Annalu Waller on stage with presenter for BBC Click Live
A prototype version of the system was demonstrated live on the TV show BBC Click.

The paper is the first to integrate context-aware information retrieval with speech-generating devices for people with motor disabilities, demonstrating how context-sensitive artificial intelligence can improve users' everyday lives. It was published at CHI 2020, the world-leading conference for Human-Computer Interaction research, where it won a Best Paper Award, an honour given only to the top 1% of accepted papers.