TY - GEN
T1 - CaTGrasp: Learning Category-Level Task-Relevant Grasping in Clutter from Simulation
T2 - 39th IEEE International Conference on Robotics and Automation, ICRA 2022
AU - Wen, Bowen
AU - Lian, Wenzhao
AU - Bekris, Kostas
AU - Schaal, Stefan
N1 - Funding Information:
Rutgers University in NJ, USA. {bw344, kostas.bekris}@cs.rutgers.edu. Bowen Wen and Kostas Bekris were partially supported by the US NSF Grant IIS-1734492. The opinions expressed here are those of the authors and do not reflect the views of the sponsor.
Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
AB - Task-relevant grasping is critical for industrial assembly, where downstream manipulation tasks constrain the set of valid grasps. Learning how to perform this task, however, is challenging, since task-relevant grasp labels are hard to define and annotate. There is also no consensus yet on proper representations for modeling task-relevant grasps, nor off-the-shelf tools for performing them. This work proposes a framework to learn task-relevant grasping for industrial objects without the need for time-consuming real-world data collection or manual annotation. To achieve this, the entire framework is trained solely in simulation, including supervised training with synthetic label generation and self-supervised hand-object interaction. In the context of this framework, this paper proposes a novel, object-centric canonical representation at the category level, which allows establishing dense correspondence across object instances and transferring task-relevant grasps to novel instances. Extensive experiments on task-relevant grasping of densely cluttered industrial objects are conducted in both simulation and real-world setups, demonstrating the effectiveness of the proposed framework. Code and data are available at https://sites.google.com/view/catgrasp.
UR - http://www.scopus.com/inward/record.url?scp=85136333090&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85136333090&partnerID=8YFLogxK
DO - 10.1109/ICRA46639.2022.9811568
M3 - Conference contribution
AN - SCOPUS:85136333090
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 6401
EP - 6408
BT - 2022 IEEE International Conference on Robotics and Automation, ICRA 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 23 May 2022 through 27 May 2022
ER -