Imitation Learning of Whole-Body Grasps

Humans often learn to manipulate objects by observing other people. In much the same way, robots can use imitation learning to acquire useful skills. This work describes a system that uses imitation learning to teach a robot to grasp objects using both hand grasps and whole-body grasps, which recruit the arms and torso as well as the hands. Demonstration grasp trajectories are created by teleoperating a simulated robot to pick up simulated objects. When presented with a new object, the system compares it against the objects in a stored database to select a demonstrated grasp that was used on a similar object. Both objects are modeled as combinations of shape primitives (boxes, cylinders, and spheres), and by treating the new object as a transformed version of the demonstration object, contact points are mapped from one object to the other. The best kinematically feasible grasp candidate is then chosen with the aid of a grasp quality metric. To test whether the chosen grasp succeeds, a full, collision-free grasp trajectory is planned and an attempt is made to execute it in simulation. The implemented system successfully picks up 92 of 100 randomly generated test objects in simulation.

Singapore-MIT Alliance (SMA)
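
The contact-point mapping step can be illustrated with a short sketch. The snippet below is a minimal illustration only, assuming both objects are modeled as single axis-aligned boxes and that the transformation between them is a per-axis scaling; the function name and scaling rule are assumptions for illustration, not the system's actual implementation.

    import numpy as np

    def map_contact_point(p_demo, demo_dims, new_dims):
        # Treat the new box as a per-axis scaled copy of the demonstration
        # box: normalize the contact point by the demo box's dimensions,
        # then rescale by the new box's dimensions. (Illustrative
        # assumption; the thesis handles combinations of primitives.)
        p_demo = np.asarray(p_demo, dtype=float)
        scale = np.asarray(new_dims, dtype=float) / np.asarray(demo_dims, dtype=float)
        return p_demo * scale

    # A fingertip contact on the demonstration box maps to the
    # corresponding spot on a taller, narrower new box.
    demo_contact = [0.05, 0.00, 0.10]                      # meters, box frame
    print(map_contact_point(demo_contact,
                            demo_dims=(0.10, 0.10, 0.20),  # demo box sides
                            new_dims=(0.08, 0.08, 0.30)))  # new box sides
    # -> [0.04 0.   0.15]

In the full system, candidate grasps produced by such a mapping would still be filtered for kinematic feasibility and ranked by a grasp quality metric, as described above.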

Identifier oai:union.ndltd.org:MIT/oai:dspace.mit.edu:1721.1/30251
Date 01 1900
Creators Hsiao, Kaijen; Lozano-Pérez, Tomás
Source Sets M.I.T. Theses and Dissertations
Language English
Detected Language English
Type Article
Format 423043 bytes, application/pdf
Relation Computer Science (CS)
