News

Spotlighted News

Algorithms & Theory , AI & ML , Robotics , Energy , Transportation

One autonomous taxi, please


More efficient lidar sensing for self-driving cars

If you see a self-driving car out in the wild, you might notice a giant spinning cylinder on top of its roof. That’s a lidar sensor, and it works by sending out pulses of infrared light and measuring the time it takes for them to bounce off objects. This creates a map of 3D points that serve as a snapshot of the car’s surroundings.
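The time-of-flight principle the teaser describes can be sketched in a few lines: a pulse travels to the object and back, so the range is half the round-trip time multiplied by the speed of light. This is a minimal illustration, not any particular lidar vendor's code; the timing value is made up for the example.

```python
# Time-of-flight range sketch: range = (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light in m/s


def range_from_round_trip(t_seconds: float) -> float:
    """Distance to a reflecting object, given the pulse's round-trip time."""
    return C * t_seconds / 2.0


# A pulse returning after ~200 nanoseconds implies an object roughly 30 m away.
print(range_from_round_trip(200e-9))  # ~29.98 m
```

Repeating this measurement across many angles as the cylinder spins is what builds up the 3D point map of the car's surroundings.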


Giving robots a sense of touch

Eight years ago, Ted Adelson's research group at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) unveiled a new sensor technology, called GelSight, that uses physical contact with an object to provide a remarkably detailed 3D map of its surface. Now, by mounting GelSight sensors on the grippers of robotic arms, two MIT teams have given robots greater sensitivity and dexterity. The researchers presented their work in two papers at the International Conference on Robotics and Automation last week.