News

Articles

Forum examines promises and limits of AI in clinical medicine

The confluence of medicine and artificial intelligence stands to create truly high-performance, specialized care for patients, with enhanced precision diagnosis and personalized disease management. But supercharging these systems requires massive amounts of personal health data, along with a delicate balance of privacy, transparency, and trust.

Videos

More efficient lidar sensing for self-driving cars

If you see a self-driving car out in the wild, you might notice a giant spinning cylinder on its roof. That’s a lidar sensor, and it works by sending out pulses of infrared light and measuring the time it takes for them to bounce off objects. This creates a map of 3D points that serves as a snapshot of the car’s surroundings.
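
As an illustration of the time-of-flight principle described above, here is a minimal Python sketch (not from the article) that converts one hypothetical lidar return, a round-trip time plus beam angles, into a 3D point in the sensor's frame. The function name and parameters are illustrative assumptions, not any real lidar API.

import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_return_to_point(round_trip_time_s, azimuth_rad, elevation_rad):
    # The pulse travels out and back, so range is half the round-trip distance.
    distance_m = SPEED_OF_LIGHT * round_trip_time_s / 2.0
    # Convert the spherical measurement (range plus beam angles) to Cartesian x, y, z.
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

# Example: a return arriving after roughly 66.7 nanoseconds places the object about 10 meters away.
point = lidar_return_to_point(66.7e-9, azimuth_rad=math.radians(30.0), elevation_rad=0.0)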

Giving robots a sense of touch

Eight years ago, Ted Adelson’s research group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) unveiled a new sensor technology, called GelSight, that uses physical contact with an object to provide a remarkably detailed 3-D map of its surface. Now, by mounting GelSight sensors on the grippers of robotic arms, two MIT teams have given robots greater sensitivity and dexterity. The researchers presented their work in two papers at the International Conference on Robotics and Automation last week.

Ingestible origami robot can patch wounds inside your stomach

In experiments involving a simulation of the human esophagus and stomach, researchers at CSAIL, the University of Sheffield, and the Tokyo Institute of Technology have demonstrated a tiny origami robot that can unfold itself from a swallowed capsule and, steered by external magnetic fields, crawl across the stomach wall to remove a swallowed button battery or patch a wound.

The new work, which the researchers are presenting this week at the International Conference on Robotics and Automation, builds on a long sequence of papers on origami robots from the research group of CSAIL Director Daniela Rus, the Andrew and Erna Viterbi Professor in MIT’s Department of Electrical Engineering and Computer Science.