
HEAP is a research project funded by CHIST-ERA that investigates robot manipulation algorithms for robotic heap sorting. The project will provide scientific advancements in benchmarking, object recognition, manipulation, and human-robot interaction. We focus on sorting a complex, unstructured heap of unknown objects (resembling nuclear waste consisting of broken, deformed bodies) as an instance of an extremely complex manipulation task. The consortium aims to build an end-to-end benchmarking framework that includes a rigorous scientific methodology and experimental tools for application in realistic scenarios. Benchmark scenarios will be developed with off-the-shelf manipulators and grippers, so that an affordable setup can be created and easily reproduced both physically and in simulation. We will develop benchmark scenarios of varying complexity: grasping and pushing irregular objects, grasping selected objects from the heap, identifying all object instances, and sorting the objects by placing them into corresponding bins. We will provide scanned CAD models of the objects that can be used for 3D printing to recreate our benchmark scenarios. Benchmarks with existing grasp planners and manipulation algorithms will be implemented as baseline controllers that are easily exchangeable using ROS, as sketched below.
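As an illustration only (not the project's actual implementation), the following minimal sketch shows how a baseline grasp planner might be wrapped as a ROS node with a fixed topic interface, so that an alternative planner can be swapped in simply by launching a different node behind the same topics. The topic names (/heap/points, /heap/grasp_pose) and the placeholder grasp computation are assumptions for illustration.

```python
#!/usr/bin/env python
# Illustrative sketch only: a baseline grasp-planner node whose topic interface
# stays fixed, so a different planner can be swapped in by replacing this node.
# Topic names and the placeholder grasp logic are assumptions, not HEAP's code.
import rospy
from sensor_msgs.msg import PointCloud2
from geometry_msgs.msg import PoseStamped


class BaselineGraspPlanner(object):
    def __init__(self):
        # Any replacement planner would subscribe and publish on the same topics.
        self.sub = rospy.Subscriber("/heap/points", PointCloud2, self.on_cloud)
        self.pub = rospy.Publisher("/heap/grasp_pose", PoseStamped, queue_size=1)

    def on_cloud(self, cloud):
        # Placeholder: a real planner would segment the heap and rank candidate grasps.
        grasp = PoseStamped()
        grasp.header = cloud.header        # keep the sensor frame and timestamp
        grasp.pose.position.z = 0.10       # dummy pre-grasp height (metres)
        grasp.pose.orientation.w = 1.0     # identity orientation
        self.pub.publish(grasp)


if __name__ == "__main__":
    rospy.init_node("baseline_grasp_planner")
    BaselineGraspPlanner()
    rospy.spin()
```

Under this kind of topic contract, exchanging controllers amounts to launching a different node that consumes and produces the same message types.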

The ability of robots to handle dense clutter or a heap of unknown objects fully autonomously has been very limited due to challenges in scene understanding, grasping, and decision making. We will therefore rely on semi-autonomous approaches in which a human operator can interact with the system (e.g., via tele-operation, but not exclusively) and give high-level commands to complement autonomous skill execution. The system's degree of autonomy will be adapted to the complexity of the situation. We will also benchmark our semi-autonomous task execution with different human operators and quantify the gap to the current state of the art in autonomous manipulation. Building on our semi-autonomous control framework, we will develop a manipulation skill learning system that learns from the human operator's demonstrations and corrections and can therefore acquire complex manipulation skills in a data-efficient manner. To improve object recognition and segmentation in cluttered heaps, we will develop new perception algorithms and investigate interactive perception, improving the robot's understanding of the scene in terms of object instances, categories, and properties.

Details

  • Project Start Date: March 31st, 2019
  • Project End Date: March 30th, 2023
  • Coordinator: Ayse Kucukyilmaz, University of Nottingham
  • CHIST-ERA Call: ORMR