Building off a project from last year, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a wearable device that allows visually impaired users to detect a range of obstacles as they walk through their environments.
The researchers call their system ALVU, for “Array of Lidars and Vibrotactile Units.” The wearable is contactless and hands-free, and is intended to be intuitive and discreet for users to wear.
“ALVU enables safe local navigation in both confined and open spaces by allowing the user to distinguish free space from obstacles,” says lead author Robert Katzschmann.
The device is made up of a sensor belt and a strap that vibrates to provide feedback to the user. Specifically, the belt's array of sensors, worn around the waist, emits pulses of infrared light to measure the distances between the user and surrounding obstacles. The strap communicates the measured distances through a series of vibratory motors worn around the abdomen.
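The core idea, closer obstacles produce stronger vibration, can be sketched in a few lines. This is only an illustrative model: the linear falloff, the 0.5–4.0 m thresholds, and the one-motor-per-sensor pairing are assumptions for the sketch, not parameters from the paper.

```python
def distance_to_intensity(distance_m, min_dist=0.5, max_dist=4.0):
    """Map a measured distance (meters) to a vibration intensity in [0, 1].

    Obstacles beyond max_dist produce no vibration; obstacles at or
    inside min_dist produce full-strength vibration. Thresholds here
    are illustrative assumptions, not the system's actual values.
    """
    if distance_m >= max_dist:
        return 0.0
    if distance_m <= min_dist:
        return 1.0
    # Linear falloff between min_dist and max_dist
    return (max_dist - distance_m) / (max_dist - min_dist)


def update_motors(distances):
    """Assume one motor per sensor: each motor's intensity tracks the
    distance measured by the corresponding lidar."""
    return [distance_to_intensity(d) for d in distances]


# Example: obstacle close on the left, partial obstruction ahead,
# open space to the right.
intensities = update_motors([0.6, 2.25, 5.0])
```

In this toy mapping, the left motor vibrates strongly, the center motor at half strength, and the right motor stays off, which matches the qualitative behavior the authors describe: distinguishing free space from obstacles through touch.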
The team tested the system with 12 blind users, who completed more than 150 trials in which they were able to successfully walk through hallways, avoid obstacles, and detect staircases.
Katzschmann co-authored an article about the system that appears in the most recent issue of the journal IEEE Transactions on Neural Systems and Rehabilitation Engineering (TNSRE). He is joined on the paper by CSAIL Director Daniela Rus and graduate student Brandon Araki.