News


Articles

Danielle Olson: Building empathy through computer science and art

Communicating through computers has become an extension of our daily reality. But as speaking via screens has become commonplace, our exchanges are losing inflection, body language, and empathy. Danielle Olson ’14, a first-year PhD student at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), believes we can make digital information-sharing more natural and interpersonal by creating immersive media that help us better understand each other’s feelings and backgrounds.

Cinematography on the fly

In recent years, a host of Hollywood blockbusters — including “The Fast and the Furious 7,” “Jurassic World,” and “The Wolf of Wall Street” — have included aerial tracking shots provided by drone helicopters outfitted with cameras. Those shots required separate operators for the drones and the cameras, and careful planning to avoid collisions. But a team of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and ETH Zurich hopes to make drone cinematography more accessible, simpler, and more reliable.

Voice control everywhere

The butt of jokes as recently as 10 years ago, automatic speech recognition is now on the verge of becoming people’s chief means of interacting with their principal computing devices. In anticipation of the age of voice-controlled electronics, MIT researchers have built a low-power chip specialized for automatic speech recognition. Whereas a cellphone running speech-recognition software might require about 1 watt of power, the new chip requires between 0.2 and 10 milliwatts, depending on the number of words it has to recognize.
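
To put those figures in perspective, here is a rough back-of-the-envelope sketch of the implied power savings. It uses only the numbers quoted above (the 1-watt cellphone figure and the 0.2–10 milliwatt range); the exact workloads behind those figures are an assumption, so treat the ratios as order-of-magnitude estimates.

    # Back-of-the-envelope power comparison using the figures quoted above.
    # These are illustrative values from the article summary, not measurements.
    cellphone_watts = 1.0      # ~1 W for software speech recognition on a phone
    chip_watts_low = 0.2e-3    # 0.2 mW, small-vocabulary recognition
    chip_watts_high = 10e-3    # 10 mW, larger-vocabulary recognition

    # The chip draws somewhere between 1/5000 and 1/100 of the phone's power.
    print(f"{cellphone_watts / chip_watts_high:.0f}x to "
          f"{cellphone_watts / chip_watts_low:.0f}x less power")
    # Output: 100x to 5000x less power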

Videos

Giving robots a sense of touch

Eight years ago, Ted Adelson’s research group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) unveiled a new sensor technology, called GelSight, that uses physical contact with an object to provide a remarkably detailed 3-D map of its surface. Now, by mounting GelSight sensors on the grippers of robotic arms, two MIT teams have given robots greater sensitivity and dexterity. The researchers presented their work in two papers at the International Conference on Robotics and Automation last week.