Expanding input modalities for Intel’s Assistive Tech Platform

The Assistive Context-Aware Toolkit (ACAT) is an open source platform developed by Intel Labs to enable people with motor neuron diseases and other disabilities to have full access to the capabilities and applications of their computers through very constrained interfaces suitable for their condition. More specifically, ACAT enables users to easily communicate with others through keyboard simulation, word prediction, and speech synthesis. Users can perform a range of tasks such as editing, managing documents, navigating the Web and accessing emails.

Demo video demonstrating ACAT's features:

Design Challenges:
1. Improving the sensing infrastructure for Prof. Stephen Hawking:
Prof. Stephen Hawking is the first active user of ACAT. He controls the ACAT interface using an analog proximity sensor mounted on his glasses, which translates his cheek movements into a trigger for the system.



Fig 1: Prof. Hawking using ACAT.

Increasing Prof. Hawking’s independence:
His range of motion and speed of motion change quite a bit over the course of a day. With his current analog sensor, this requires staff to manually adjust the sensor's sensitivity up or down. A major step towards increasing his independence would be to replace the current sensor with a new digital infrared proximity sensor. This sensor would use an adaptive algorithm to automatically adjust its trigger threshold when he is fatigued or when the sensor's position changes.
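As a rough illustration of the idea, the sketch below shows one way such adaptive thresholding could work: the trigger threshold is re-estimated from a sliding window of recent readings instead of being tuned by hand. The read_proximity function and all constants are hypothetical placeholders, not part of the actual ACAT sensing stack.

```python
from collections import deque


def adaptive_trigger(read_proximity, window=200, k=3.0):
    """Yield True whenever a sample deviates clearly from the running baseline.

    The baseline and spread are re-estimated over a sliding window of recent
    samples, so the effective threshold drifts along with fatigue-related
    changes in range of motion or with small shifts in sensor position.
    """
    samples = deque(maxlen=window)
    while True:
        value = read_proximity()  # one raw distance sample (hypothetical reader)
        samples.append(value)
        baseline = sum(samples) / len(samples)
        spread = (sum((s - baseline) ** 2 for s in samples) / len(samples)) ** 0.5
        # Fire only when the cheek moves well outside the recent baseline.
        yield abs(value - baseline) > k * max(spread, 1e-6)
```

In a scheme like this, the window length and the factor k stand in for the manual sensitivity adjustments that staff currently make throughout the day.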

Reducing false triggers:
A measure of utility for a good interaction system is how well it minimizes false positives, which reduces the time required to complete a task. False positives for Prof. Hawking arise when he smiles, swallows saliva, or eats food. Our sensing algorithm will detect such instances automatically, undo the last selected key, and freeze the user interface for a period of time until his movement returns to normal.
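One plausible way to approximate this behaviour, shown purely as an assumption-laden sketch, is to treat a rapid burst of closely spaced triggers as a likely false positive; undo_last_key and freeze_ui are hypothetical hooks into the user interface, not real ACAT APIs.

```python
def handle_trigger(trigger_times, now, undo_last_key, freeze_ui,
                   burst_window=0.5, burst_count=3, freeze_seconds=5.0):
    """Treat a rapid burst of triggers as a likely false positive.

    Intentional cheek triggers tend to arrive one at a time, while smiling,
    swallowing, or eating tends to produce several triggers within a short
    window. On a suspected burst, undo the last selected key and freeze the
    interface briefly until the movement settles.
    """
    trigger_times.append(now)
    recent = [t for t in trigger_times if now - t <= burst_window]
    if len(recent) >= burst_count:
        undo_last_key()
        freeze_ui(freeze_seconds)
        trigger_times.clear()
        return False  # suppressed as a false positive
    return True       # accepted as an intentional trigger
```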

2. Scalability:
Until now, the majority of Intel's research effort has been focused on developing a graphical user interface that can cater to the day-to-day communication needs of a person with a disability. Little work has been done to study the type, form factor, or on-body position of the sensor, or the detection of trigger intent for controlling such a sophisticated user interface.




Fig 2: Woman with Progressive Supranuclear Palsy (PSP) demonstrating the little movement left in her fingers.
Design Process:
As a research intern at Intel's Anticipatory Computing Lab, my focus was the second design problem. For this, I worked with a person with Progressive Supranuclear Palsy.

1. EMG:
Even though the method worked for detecting finger movement in healthy people, it failed to robustly detect both fast and slow finger movements. It was also neither comfortable to use nor resistant to tremors, so it was rejected.
Fig 3: Experimenting with various electrode positions to robustly detect finger/hand movement.

2. Soft Ring:


Because the participant habitually held her fingers in a continuous pinch-like posture when we went to observe her, we experimented with a soft ring.

3. 3D-printed ring with a dynamic threshold that adjusts to slow and fast finger movements as well as continuous changes in hand position: We also tried positioning IMU and proximity sensors at various places on the body: leg, foot, hand, and finger. An important factor in deciding the sensor's position is the repeatability of the gesture. A sketch of the dynamic-threshold idea follows below.
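The following sketch illustrates one possible form of such dynamic thresholding, assuming a single scalar sample per reading (for example a proximity distance or a filtered IMU magnitude); the class, parameter names, and constants are illustrative assumptions rather than the actual ring firmware.

```python
class DynamicThresholdTrigger:
    """Trigger detector with a baseline that tracks slow posture drift."""

    def __init__(self, alpha=0.02, margin=0.15):
        self.alpha = alpha      # how quickly the baseline follows posture drift
        self.margin = margin    # deviation from baseline that counts as a gesture
        self.baseline = None

    def update(self, sample):
        """Return True if this sample should fire a trigger."""
        if self.baseline is None:
            self.baseline = sample
            return False
        deviation = abs(sample - self.baseline)
        fired = deviation > self.margin
        if not fired:
            # Adapt only on quiet samples so that a slow, deliberate gesture is
            # not absorbed into the baseline before it crosses the threshold.
            self.baseline += self.alpha * (sample - self.baseline)
        return fired
```

Because the baseline adapts only between gestures, both a fast flick and a slow, sustained pinch cross the same margin, while gradual changes in how the hand rests do not fire the trigger.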


Future: Funded by Intel, the project is being continued at the Georgia Institute of Technology under Prof. Gregory D. Abowd. I am working on the first design problem of improving the sensing infrastructure for Prof. Hawking, as well as on scaling ACAT by doing user research and constantly iterating on the sensing infrastructure.
