A soft hand capable of robustly grasping and identifying objects using internal strain and force measurements as well as computer vision.

We present a soft hand that robustly grasps and identifies objects based on internal state measurements, along with a system that performs grasps autonomously. The hand's high compliance gives it intrinsic robustness to grasping uncertainties, while internal sensing makes it possible to detect the configuration of the hand and the grasped object.

The finger module includes resistive force sensors on the fingertips for contact detection and resistive bend sensors for measuring the curvature profile of the finger. The curvature sensors can be used to estimate the contact geometry, which makes it possible to distinguish among a set of grasped objects: with one data point from each finger, an object can be identified simply by grasping it. We present a clustering algorithm that finds the correspondence for each grasped object, for both enveloping grasps and pinch grasps, and we further develop an object perception algorithm capable of localizing the pose of an object before and after grasping.
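To illustrate the idea of identifying an object from one curvature reading per finger, here is a minimal nearest-centroid sketch in Python. The object names, the four-finger feature vectors, and the curvature values are all made up for illustration; the project's actual clustering algorithm is not reproduced here.

```python
import numpy as np

# Hypothetical curvature readings: one value per finger (4 fingers),
# several training grasps per object. Values are illustrative only.
grasps = {
    "cylinder": np.array([[0.80, 0.80, 0.80, 0.80], [0.82, 0.79, 0.81, 0.80]]),
    "box":      np.array([[0.50, 0.90, 0.50, 0.90], [0.52, 0.88, 0.49, 0.91]]),
    "ball":     np.array([[0.60, 0.60, 0.60, 0.60], [0.61, 0.59, 0.62, 0.60]]),
}

# One centroid per object: the mean feature vector of its training grasps.
centroids = {name: g.mean(axis=0) for name, g in grasps.items()}

def identify(sample):
    """Return the object whose centroid is nearest to a new grasp sample."""
    return min(centroids, key=lambda n: np.linalg.norm(sample - centroids[n]))

print(identify(np.array([0.60, 0.61, 0.60, 0.59])))  # -> ball
```

A new grasp thus reduces to a single four-dimensional feature vector, and identification is a nearest-neighbor lookup against the per-object centroids.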

The soft hands can perform manipulation tasks such as grasping and connecting two parts. The compliant hand grasps objects reliably despite large uncertainty in their poses, but the pose of an object within the hand after grasping is itself uncertain. Visual sensing through an RGB-D camera reduces this residual uncertainty by means of in-hand object localization.
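As a simplified stand-in for in-hand localization (not the project's actual pipeline), the sketch below estimates a planar pose from a segmented object point cloud: translation from the centroid and orientation from the principal axis via PCA. The synthetic point cloud and its ground-truth pose are assumptions for illustration.

```python
import numpy as np

def estimate_pose_2d(points):
    """Estimate a planar pose (x, y, theta) of a segmented 2-D point cloud:
    translation from the centroid, orientation from the principal axis."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Principal axis = eigenvector of the covariance with largest eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(centered.T @ centered)
    axis = eigvecs[:, np.argmax(eigvals)]
    theta = np.arctan2(axis[1], axis[0])
    return centroid[0], centroid[1], theta

# Synthetic elongated "object" rotated 30 degrees and centered at (1.0, 2.0).
rng = np.random.default_rng(0)
raw = rng.uniform([-0.10, -0.02], [0.10, 0.02], size=(500, 2))
ang = np.deg2rad(30)
R = np.array([[np.cos(ang), -np.sin(ang)],
              [np.sin(ang),  np.cos(ang)]])
cloud = raw @ R.T + np.array([1.0, 2.0])

x, y, theta = estimate_pose_2d(cloud)
```

Note that the principal-axis orientation is only defined up to 180 degrees; a real localization system would resolve this ambiguity with a full object model, e.g. by registering the RGB-D cloud against it.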

The combination of soft hands and visual object perception enables a Baxter robot, augmented with soft hands, to perform object assembly tasks that require high precision. We validate the effectiveness of our approach by comparing it to rigid grippers.


Project members include:

Bianca Homberg, Robert Katzschmann, Mehmet Dogar, Joseph DelPreto, Daniela Rus
