
CSAIL researchers have helped reconstruct audio signals by analyzing minute vibrations of objects depicted in video.
Hearing by seeing: “visual mic” uses potato-chip bag to recover sound from video footage

Researchers at MIT CSAIL, Microsoft, and Adobe have developed an algorithm that can reconstruct an audio signal by analyzing minute vibrations of objects depicted in video. In one set of experiments, they were able to recover intelligible speech from the vibrations of a potato-chip bag photographed from 15 feet away through soundproof glass.
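The published system measures tiny local phase variations in the video to infer how an object vibrates in response to sound. The core idea, turning minuscule per-frame displacements into a waveform sampled at the camera's frame rate, can be sketched in a toy form. The snippet below is a simplified illustration, not the authors' pipeline: it estimates each frame's sub-pixel shift against a reference frame with ordinary phase correlation and treats the resulting shift time series as the recovered signal. The function name and all parameters are hypothetical.

```python
import numpy as np

def recover_audio(frames):
    """Toy visual-microphone sketch (NOT the authors' phase-based method).

    Each grayscale frame in `frames` (shape T x H x W) is collapsed to a 1-D
    column profile; its sub-pixel vertical shift relative to the first frame
    is estimated by phase correlation, and the shift time series, sampled at
    the video frame rate, is returned as the recovered "audio" waveform.
    """
    profiles = frames.mean(axis=2)            # (T, H): average over columns
    n = profiles.shape[1]
    ref = np.fft.fft(profiles[0])             # reference frame spectrum
    shifts = np.empty(len(profiles))
    for t, row in enumerate(profiles):
        cross = ref * np.conj(np.fft.fft(row))
        cross /= np.abs(cross) + 1e-12        # normalized cross-power spectrum
        corr = np.fft.ifft(cross).real        # peak location encodes the shift
        p = int(np.argmax(corr))
        a, b, c = corr[p - 1], corr[p], corr[(p + 1) % n]
        delta = 0.5 * (a - c) / (a - 2 * b + c)  # parabolic sub-pixel refinement
        s = p + delta
        shifts[t] = s - n if s > n / 2 else s    # wrap to a signed displacement
    return shifts - shifts.mean()                # zero-mean waveform
```

A real recording would still need band-pass filtering and denoising, and global phase correlation is far cruder than the paper's approach, which tracks local phase in a complex steerable pyramid decomposition of the video.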

In other experiments, they extracted useful audio signals from videos of aluminum foil, the surface of a glass of water, and even the leaves of a potted plant. The researchers will present their findings in a paper at this year’s Siggraph, the premier computer graphics conference.
