News

Spotlighted News

Algorithms & Theory, AI & ML, Robotics, Energy, Transportation

One autonomous taxi, please


More efficient lidar sensing for self-driving cars

If you see a self-driving car out in the wild, you might notice a giant spinning cylinder on top of its roof. That’s a lidar sensor, and it works by sending out pulses of infrared light and measuring the time it takes for them to bounce off objects. This creates a map of 3D points that serves as a snapshot of the car’s surroundings.
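The time-of-flight principle described above can be sketched in a few lines of Python. This is an illustrative example of the basic ranging math, not code from any real lidar system; the 200-nanosecond echo time is a made-up value:

```python
# Time-of-flight ranging: a lidar pulse travels out to an object and back,
# so the one-way distance is (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Convert a measured pulse round-trip time into a one-way distance in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A return pulse arriving 200 nanoseconds after emission corresponds
# to an object roughly 30 meters away.
print(round(distance_from_echo(200e-9), 1))  # ~30.0 m
```

Repeating this measurement for millions of pulses per second, each fired at a known angle as the cylinder spins, is what yields the 3D point map of the surroundings.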

Articles

Cinematography on the fly

In recent years, a host of Hollywood blockbusters — including “The Fast and the Furious 7,” “Jurassic World,” and “The Wolf of Wall Street” — have included aerial tracking shots provided by drone helicopters outfitted with cameras. Those shots required separate operators for the drones and the cameras, and careful planning to avoid collisions. But a team of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and ETH Zurich hope to make drone cinematography more accessible, simple, and reliable.

Videos

Giving robots a sense of touch

Eight years ago, Ted Adelson’s research group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) unveiled a new sensor technology, called GelSight, that uses physical contact with an object to provide a remarkably detailed 3-D map of its surface. Now, by mounting GelSight sensors on the grippers of robotic arms, two MIT teams have given robots greater sensitivity and dexterity. The researchers presented their work in two papers at the International Conference on Robotics and Automation last week.

First-ever 3-D printed robots made of both solids and liquids

One reason we don’t yet have robot personal assistants buzzing around doing our chores is that making them is hard. Assembling robots by hand is time-consuming, while automation — robots building other robots — is not yet fine-tuned enough to make robots that can do complex tasks.

But if humans and robots can’t do the trick, what about 3-D printers?

In a new paper, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) present the first-ever technique for 3-D printing robots that involves printing solid and liquid materials at the same time. The new method allows the team to automatically 3-D print dynamic robots in a single step, with no assembly required, using a commercially available 3-D printer.