Students Develop Assistive Technologies

A blind-accessible coffee maker developed by PPAT students Marcus Lowe and Victoria Sun.
Lindsay is a 23-year-old coffee lover and researcher at MEEI who lives in Somerville. She has a Keurig Vue Brewer that lets her make a wide variety of hot and cold beverages at the touch of a button; however, because she is blind and cannot see the touch screen that controls the machine, Lindsay has trouble using many of its features.
 
This past fall, as part of course 6.S196: Principles and Practice of Assistive Technology (PPAT), two undergraduates, Marcus Lowe and Victoria Sun, teamed up with Lindsay to develop a piece of assistive technology that would let her use the many features of her coffee maker. The goal of PPAT, taught by EECS Professor and CSAIL Principal Investigator Seth Teller and co-taught by EECS Professor and CSAIL Principal Investigator Rob Miller, is for small teams of students to work with clients in the Cambridge area to develop assistive technology (a device, piece of equipment, mobile application, or other solution) that helps the client live more independently. The course was initially funded by the Alumni Class Funds.
 
“The class evolved naturally from my lab's research efforts to develop various types of assistive technology for people with disabilities, such as self-driving wheelchairs for people with MS, ALS, brain or spinal cord injuries, and wearable machine vision systems for blind and visually impaired people,” said Teller. “Given Professor Miller’s focus on human-computer interaction, and the fact that both of our labs attract UROPs (Undergraduate Research Opportunities Program at MIT) interested in the human side of technology, it was a natural step to create an undergraduate subject focusing on this area.”
 
At the close of the fall term, PPAT students presented the culmination of their efforts to fellow classmates and others, demonstrating a variety of new assistive technologies: accessible touch- and speech-based nurse calls for a client with MS; augmented caregiver access and E911 capability for a client with ALS; accessible tablet control of an adjustable bed for a client with MS; and a vibrating bracelet to notify a blind and hearing-impaired client of incoming calls on her mobile phone. During the final presentations on December 5, Lowe and Sun presented their work developing technology that helps Lindsay use her coffee machine.
 
To enable Lindsay to take full advantage of her coffee machine, Lowe and Sun developed a vision-based touch screen interpretation system for Lindsay’s iPhone. As a frequent user of assistive technologies like VoiceOver, Lindsay was comfortable using the iPhone, so the team developed a system that works in tandem with the technology she already uses on her phone.
 
The system Lowe and Sun developed features a stationary stand that holds Lindsay’s iPhone above the touch screen that controls her coffee maker. To make a cup of coffee, her phone takes a picture of the touch screen and asks Lindsay what she would like to drink. The application then gives Lindsay spoken guidance on how to use the touch screen to make exactly the drink she wants. Using a grid made of thin laminate strips to guide her fingers, Lindsay can navigate the screen and press the correct buttons. The application then reviews and confirms her selection.
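To make the interaction pattern concrete, here is a minimal Python sketch of such a guidance loop. This is purely illustrative and not the students’ actual implementation: the drink menu, grid coordinates, and helper functions are invented, and speech output and the camera check are mocked with console I/O.

```python
# Hypothetical sketch of the guidance loop described above.
# The drink menu, grid layout, and helpers are invented; speech
# output and the camera confirmation are mocked with console I/O.

# Invented mapping from drink to the (row, column) button presses
# that the laminate-strip grid makes findable by touch.
BUTTON_SEQUENCES = {
    "iced tea": [(1, 2), (2, 1)],
    "hot coffee": [(1, 1), (2, 3)],
}

def speak(text):
    # Stand-in for the phone's speech output.
    print(f"[speech] {text}")

def screen_advanced(step):
    # Stand-in for photographing the touch screen and verifying
    # that it reached the expected state after a button press.
    return input(f"Did the screen advance past step {step}? (y/n) ") == "y"

def guide(drink):
    speak(f"Let's make {drink}.")
    for step, (row, col) in enumerate(BUTTON_SEQUENCES[drink], start=1):
        # Repeat the instruction until the press is confirmed, so the
        # user can track the machine's state at every step.
        while True:
            speak(f"Step {step}: press the button at row {row}, column {col}.")
            if screen_advanced(step):
                break
    speak(f"Confirmed: the machine will make {drink}.")

if __name__ == "__main__":
    guide("iced tea")
```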
 
“They did a great job,” said Lindsay of Lowe and Sun’s work. “One of the upsides to their solution is the ability to confirm the current state of the coffee maker screen. Now I know and can confirm what kind of drink I am going to make.”
 
Lowe and Sun put particular effort into enabling Lindsay to monitor the system’s operation, designing the program to repeat relevant commands as needed. Eventually, if she wishes, Lindsay can memorize the button sequences and forgo the assistive technology altogether.
 
For their project, Priya Saha and Veronica Newlin worked with Haben, a third-year student at Harvard Law School who is completely blind and partially hearing impaired. Like Lindsay, Haben uses a good deal of assistive technology, including the iPhone’s VoiceOver feature. Because she cannot hear or see her iPhone ringing, Saha and Newlin explained, Haben’s strategy has been to check her phone every hour for calls and texts, which means she misses about 90% of incoming calls. Haben asked the team to develop a discreet vibrating bracelet that could notify her of incoming calls and text messages.
 
Saha and Newlin developed a custom-built bracelet housing a small vibrating motor to notify Haben of incoming calls on her iPhone. The team used Bluetooth to connect the iPhone to the bracelet’s motor and built a custom motor driver to serve as the bracelet’s notification system.
 
Saha and Newlin designed the motor to vibrate for four seconds when Haben receives an incoming call, and housed the device in a metal bracelet so that it looks like a piece of jewelry. While the team had trouble notifying Haben of incoming text messages because of restrictions on access to Apple’s messaging center, they estimate that their system successfully signals between 90 and 100% of incoming calls.
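As a rough illustration of the notification flow described above (a four-second vibration per incoming call), here is a minimal Python sketch. The Bluetooth event source and motor-driver interface are invented stand-ins; the article does not describe the students’ actual firmware or driver.

```python
# Hypothetical sketch of the bracelet's notification logic.
# The Bluetooth event source and motor driver are invented mocks.
import time

VIBRATION_SECONDS = 4  # per the article: a four-second pulse per call

class MotorDriver:
    """Mock of the custom motor driver; real firmware would toggle a pin."""
    def on(self):
        print("[motor] vibrating")

    def off(self):
        print("[motor] stopped")

def incoming_call_events():
    # Stand-in for call notifications arriving over Bluetooth from
    # the iPhone; here we simply simulate two calls.
    yield "incoming call 1"
    yield "incoming call 2"

def run(motor):
    for event in incoming_call_events():
        print(f"[bluetooth] {event}")
        motor.on()
        time.sleep(VIBRATION_SECONDS)
        motor.off()

if __name__ == "__main__":
    run(MotorDriver())
```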
 
Teller and Miller were impressed with Saha and Newlin’s work, in particular their resourcefulness in solving the technical aspects of Haben’s problem and how well they got to know their user, which resulted in a specialized product suited to Haben’s needs.
 
For more information on PPAT, please visit: http://courses.csail.mit.edu/PPAT.

Abby Abazorius, CSAIL