August 22, 2016
We learn a lot about objects by manipulating them: poking, pushing, prodding, and then seeing how they react.
We obviously can’t do that with videos — just try touching that cat video on your phone and see what happens. But is it crazy to think that we could take that video and simulate how the cat moves, without ever interacting with the real one?
Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have recently done just that, developing an imaging technique called Interactive Dynamic Video (IDV) that lets you reach in and “touch” objects in videos. Using traditional cameras and algorithms, IDV looks at the tiny, almost invisible vibrations of an object to create video simulations that users can virtually interact with.
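At a high level, this kind of pipeline recovers an object's vibration modes from the tiny motions in a video, then synthesizes its response to a virtual "poke" as a superposition of damped oscillations. The sketch below illustrates that idea on synthetic data: the displacement fields, mode frequencies, damping value, and the `simulate_poke` helper are all hypothetical stand-ins, not the CSAIL team's actual implementation, which works on real camera footage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "video": per-frame 1-D displacement fields for a vibrating object.
# (IDV extracts comparable motion signals from sub-pixel vibrations in real footage.)
n_frames, n_points = 200, 50
t = np.arange(n_frames)
x = np.linspace(0, np.pi, n_points)
mode1 = np.sin(x)        # first mode shape (assumed, for illustration)
mode2 = np.sin(2 * x)    # second mode shape (assumed)
frames = (np.outer(np.cos(0.3 * t), mode1)
          + 0.4 * np.outer(np.cos(0.7 * t), mode2)
          + 0.01 * rng.standard_normal((n_frames, n_points)))

# Step 1: recover the dominant mode shapes from the observed motion via SVD.
U, S, Vt = np.linalg.svd(frames - frames.mean(axis=0), full_matrices=False)
shapes = Vt[:2]          # top-2 spatial modes

# Step 2: respond to a virtual poke by projecting the force onto each mode
# and superposing damped harmonic oscillations (modal synthesis).
def simulate_poke(point, frequencies=(0.3, 0.7), damping=0.02, steps=120):
    force = np.zeros(n_points)
    force[point] = 1.0
    amps = shapes @ force          # how strongly the poke excites each mode
    tt = np.arange(steps)
    response = sum(a * np.exp(-damping * tt)[:, None]
                     * np.cos(f * tt)[:, None] * s
                   for a, f, s in zip(amps, frequencies, shapes))
    return response                # (steps, n_points) displacement over time

resp = simulate_poke(point=25)
print(resp.shape)  # (120, 50)
```

The key design choice this mirrors is that the simulation never needs a physical model of the object: the mode shapes observed in the video stand in for its material properties, so the poke response decays back to rest just as a real object would.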