App detects light for the visually impaired

CSAIL grad student helps develop iOS app that detects light levels using sounds and vibrations.

We humans rely on light for a remarkably wide range of activities, from keeping circadian rhythms on track, to regulating mental health, to the simple task of figuring out whether the Wi-Fi router is on.

But for those who are blind or visually impaired, locating sources of light can be a difficult process.

Fortunately, a CSAIL PhD student has helped develop a free iPhone app that detects levels of light and conveys that information through sounds and vibrations. “Boop Light Detector” lets visually impaired users determine whether a light is on or off, and it covers a wide range of situations, from sensing daylight coming through open windows to checking whether a computer’s power light is on.

Co-developer Ariel Anders spends most of her time at CSAIL on machine learning techniques for robotic manipulation; as she puts it, “an effective robotic household helper needs to be capable of manipulating objects.” Boop, by contrast, grew out of a client’s request. “Jonathan, our client, wanted his product to be an app so other people could eventually benefit from our project,” Anders says. “We went along with his vision, and now we are releasing Boop to anyone with an iPhone for free.”

Anders, Preston, and teammate Areej Al-Wabi first demonstrated the app at the 2016 MIT Assistive Technology Hackathon, where it received an honorable mention. During the 10-hour competition, students built assistive technologies for Boston residents in need. The group worked alongside project sponsor Jonathan Gale, a completely blind Massachusetts resident, with support from MIT and Lincoln Laboratory volunteers.

Here’s how it works: once the app is launched and the camera is enabled, a stream of three vibration pulses signals that it’s working. The camera captures a live video stream of the phone’s surroundings, and the app analyzes each frame’s pixels, along with the camera’s exposure and sensitivity settings, to estimate brightness. If the reading is high enough to suggest a light source, the phone emits a sound or a vibration.
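A minimal Swift sketch of that detection loop might look like the following. The class name, the threshold value, and the use of per-frame EXIF brightness metadata are all illustrative assumptions, not the app’s actual code:

import AVFoundation
import ImageIO
import UIKit

// Hypothetical sketch of the detection loop described above.
final class LightDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let brightnessThreshold = 0.0  // illustrative cutoff (EXIF brightness is log-scaled)

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "boop.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    func stop() {
        session.stopRunning()
    }

    // Called once per video frame. The frame's EXIF metadata carries a
    // brightness value derived from the camera's exposure and sensitivity.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let attachments = CMCopyDictionaryOfAttachments(
            allocator: nil,
            target: sampleBuffer,
            attachmentMode: kCMAttachmentMode_ShouldPropagate) as? [String: Any]
        let exif = attachments?[kCGImagePropertyExifDictionary as String] as? [String: Any]
        guard let brightness = exif?[kCGImagePropertyExifBrightnessValue as String] as? Double
        else { return }

        if brightness > brightnessThreshold {
            // Bright enough to suggest a light source: pulse the haptics
            // (or play a sound, depending on the user's toggle).
            DispatchQueue.main.async {
                UIImpactFeedbackGenerator(style: .medium).impactOccurred()
            }
        }
    }
}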

The app’s single screen places the live video capture at the top, the light reading in the middle, and a vibrate toggle at the bottom. This design lets the app track light levels and deliver fast, audible feedback at the same time, and feedback can also be issued intermittently rather than continuously.
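The 2016-era app was presumably built with UIKit, but a brief SwiftUI sketch conveys the same single-screen layout; CameraPreview is a hypothetical placeholder for the live capture view:

import SwiftUI

// Hypothetical stand-in for the live camera feed at the top of the screen.
struct CameraPreview: View {
    var body: some View {
        Rectangle().fill(.black)  // placeholder for the capture preview layer
    }
}

// Illustrative sketch of the single-screen layout described above.
struct BoopScreen: View {
    @State private var vibrateEnabled = true
    var lightLevel = 0.0  // fed by the camera analysis pipeline

    var body: some View {
        VStack {
            CameraPreview()                           // video capture on top
            Text("Light level: \(lightLevel, specifier: "%.2f")")
                .font(.largeTitle)                    // reading in the middle
            Toggle("Vibrate", isOn: $vibrateEnabled)  // toggle on the bottom
                .padding()
        }
    }
}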

Boop works with iOS’s built-in VoiceOver accessibility features and uses the standard two-finger double-tap gesture to exit the app. The vibration pattern can also vary to indicate the location and intensity of a light source.
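On iOS, VoiceOver routes the two-finger double tap (the “Magic Tap”) to accessibilityPerformMagicTap(), and UIImpactFeedbackGenerator (iOS 13 and later) can scale haptic strength. A hedged sketch of both behaviors, reusing the hypothetical LightDetector from the earlier example; since iOS offers no public API to terminate an app, stopping the detector stands in for the exit action:

import UIKit

// Illustrative controller; the name is hypothetical, and LightDetector
// refers to the earlier sketch.
final class BoopViewController: UIViewController {
    private let detector = LightDetector()
    private let haptics = UIImpactFeedbackGenerator(style: .medium)

    // VoiceOver delivers the two-finger double tap ("Magic Tap") here.
    override func accessibilityPerformMagicTap() -> Bool {
        detector.stop()
        return true  // gesture handled
    }

    // Scale vibration strength with the measured brightness (iOS 13+),
    // so a stronger light produces a stronger pulse.
    func vibrate(forBrightness brightness: Double) {
        let clamped = min(max(brightness, 0), 1)
        haptics.impactOccurred(intensity: CGFloat(clamped))
    }
}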

You can find it here: https://itunes.apple.com/us/app/boop-light-detector/id1134857212?ls=1&mt=8