High-resolution Tactile Sensing for Reactive Robotic Manipulation

Speaker

MIT EECS

Host

Alberto Rodriguez
MIT MechE
Abstract: This thesis explores the development of tactile sensing to enable closed-loop, reactive behavior in robotic manipulation. More specifically, we focus on developing high-resolution vision-based tactile sensing hardware, perceptual algorithms, and controller designs for robotic manipulation. Tactile sensing plays a key role in human manipulation, yet existing artificial tactile sensors for robotic hands have multiple limitations in form factor, robustness, and measurement density. Because of these limitations in sensing hardware and software, tactile sensors are rarely integrated into current robotic manipulation systems. We design new vision-based tactile sensors that capture the contact surface in high-resolution images and reconstruct its 3D geometry with a photometric stereo algorithm. We first design a new GelSight sensor that improves the accuracy of the depth-map reconstruction and simplifies the fabrication process. To further optimize the form factor and enhance robustness, we design a second vision-based tactile sensor, called GelSlim, which retains the high-resolution sensing output but has a slimmer form factor, a sharper tip, and higher robustness. Building on the new sensors, we propose algorithms to distill useful contact information from the raw signal. The key challenge is connecting the contact geometry observed directly in the raw image to contact signals that have meaning in the context of contact mechanics, e.g., contact forces and contact slip. We use an image-processing algorithm to track the gel deformation and compare it with a rigid-body motion to detect incipient slip, and we deploy an inverse Finite Element Method (iFEM) to reconstruct the contact force distribution from the measured 3D gel deformation. Finally, we explore how tactile signals can be fed into the control loop in real manipulation tasks.
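The incipient-slip test described above — tracking gel markers and comparing their motion against the best-fit rigid-body motion — can be sketched as a least-squares rigid fit (Kabsch-style) followed by a residual check. This is an illustrative reconstruction under stated assumptions, not the thesis's actual implementation; the marker arrays, the 2D fit, and the scoring function are assumptions.

```python
import numpy as np

def fit_rigid_transform(p, q):
    """Least-squares rigid fit (Kabsch): find R, t minimizing ||R p_i + t - q_i||.

    p, q: (N, 2) arrays of 2D marker positions before/after deformation.
    (Illustrative: the real pipeline tracks markers in the tactile image.)
    """
    cp, cq = p.mean(axis=0), q.mean(axis=0)
    H = (p - cp).T @ (q - cq)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def incipient_slip_score(markers_ref, markers_cur):
    """Mean residual of marker motion w.r.t. the best rigid-body motion.

    A contact patch moving rigidly (no slip) gives a near-zero residual;
    partial/incipient slip shows up as non-rigid deformation and a large one.
    """
    R, t = fit_rigid_transform(markers_ref, markers_cur)
    residual = markers_cur - (markers_ref @ R.T + t)
    return np.linalg.norm(residual, axis=1).mean()
```

A purely rigid displacement of the markers yields a score near machine precision, while shearing only part of the marker field — the signature of incipient slip — produces a clearly nonzero score that can be thresholded.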
We choose two representative contact-rich manipulation tasks that can benefit from tactile control: cable following and object insertion. We implement cable following by sensing and controlling, in real time, both the state of the grasp on the cable and the cable's configuration, allowing the fingers to slide smoothly along the cable. We then apply the same strategy of controlling both the contact state and the object state to the object-insertion task. We train a general tactile-based reinforcement learning (RL) insertion policy in an end-to-end fashion that aligns the object pose with the insertion hole and maintains sticking contact at the grasp by detecting incipient slip during contact exploration. The insertion system works for unknown objects, and we also show that tactile signals are more informative than force-torque signals for the insertion task.
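The dual regulation of grasp state and cable configuration could be caricatured as two concurrent proportional loops: one adjusting grip force from a tactile slip signal, one steering the gripper from the estimated cable angle in the tactile image. Everything here — the function name, gains, signal definitions, and clamping limits — is a hypothetical sketch for intuition, not the controller from the thesis.

```python
def cable_follow_step(grip_force, slip_score, cable_angle,
                      k_force=5.0, k_yaw=0.8, f_min=1.0, f_max=10.0):
    """One illustrative control tick for sliding fingers along a cable.

    grip_force:  current commanded normal force (N)
    slip_score:  non-rigid deformation residual from the tactile image
    cable_angle: estimated cable orientation in the finger frame (rad)
    Returns (new_grip_force, yaw_rate_command).
    """
    # Grasp-state regulation: squeeze harder when slip is detected;
    # the clamp keeps the force low enough that the cable can still slide.
    new_force = min(max(grip_force + k_force * slip_score, f_min), f_max)
    # Cable-configuration regulation: yaw the gripper to keep the cable
    # aligned/centered in the tactile image.
    yaw_rate = -k_yaw * cable_angle
    return new_force, yaw_rate
```

The point of the sketch is the structure: both loops consume only tactile estimates, so the system can react at the contact without external vision of the cable.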