Project OpenDR and the future of human-robot interaction

Project OpenDR, in which PAL Robotics is a partner, has released a toolkit for deep learning in robotics – one of the first of its kind – providing over 20 methods for human pose estimation, face detection, face recognition, facial expression recognition, object detection, and more. The aim is to make a wide range of robotic capabilities easier to implement using the toolkit.

Advances in deep learning have brought about significant developments in technology, such as self-driving cars and algorithms that can understand and answer questions. However, applying deep learning in robotics raises challenges in learning, reasoning, and embodiment – problems and research questions that are often not addressed by the computer vision and machine learning communities.

The deep learning toolkit has been in development as part of the EU project OpenDR which has the following objectives:

  • To provide a modular, open, and non-proprietary toolkit for core robotic functionalities enabled by lightweight deep learning
  • To leverage AI and Cognition in robotics: from perception to action
  • To propose a co-integration of simulation and learning methodology for deep learning in robotics and demonstrate the potential in application areas
  • To establish strong links to robotics Digital Innovation Hubs

Deep Learning toolkit capabilities including Speech Command Recognition and Heart Anomaly Detection

The team at PAL Robotics will use multiple capabilities developed in the OpenDR toolkit on the TIAGo robot. These include:

  • Pose Estimation
  • 2D Object Detection
  • Face Detection
  • Panoptic Segmentation
  • Face Recognition
  • Semantic Segmentation
  • RGBD Hand Gesture Recognition
  • Heart Anomaly Detection
  • Video Human Activity Recognition
  • Landmark-based Facial Expression Recognition
  • Skeleton-based Human Action Recognition
  • Speech Command Recognition
  • Voxel Object Detection 3D
  • AB3DMOT Object Tracking 3D
  • FairMOT Object Tracking 2D
  • Deep Sort Object Tracking 2D

For users, the toolkit makes each of these capabilities easier to implement in a wide range of projects, even without in-depth knowledge of deep learning.

Use cases in healthcare, agriculture, and production in project OpenDR

The OpenDR project includes a number of robotics use cases, which will make use of the Deep Learning toolkit in practice. 

PAL Robotics’ TIAGo robot is taking part in the healthcare use case, which aims to test various capabilities and assess their potential for future healthcare. These include object detection and identification of everyday objects (for example bottles of water, clothes, drawers, chairs, and tables), person detection (for example recognising the identity of people appearing in a scene), and speech recognition of basic action commands (for example assigning tasks to the robot using simple speech commands).
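As a purely illustrative sketch of the last point – routing a recognised speech command to a robot task – the following plain-Python dispatcher shows the idea. None of these names come from the OpenDR API or TIAGo’s software; they are hypothetical placeholders.

```python
# Hypothetical sketch: map the text output of a speech-command
# recogniser to a robot behaviour. All names are illustrative only.

def bring_water():
    return "navigating to kitchen and grasping bottle"

def open_drawer():
    return "approaching drawer and pulling handle"

COMMANDS = {
    "bring water": bring_water,
    "open drawer": open_drawer,
}

def dispatch(recognised_text: str) -> str:
    """Route a recognised command string to the matching task."""
    action = COMMANDS.get(recognised_text.strip().lower())
    if action is None:
        return f"unknown command: {recognised_text!r}"
    return action()
```

In a real deployment, the dictionary entries would trigger navigation or manipulation behaviours rather than return strings, but the command-to-task mapping follows this shape.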

In addition to healthcare, the project’s agricultural use cases will make use of the toolkit in situations including detection of crop plants in a field, animal detection, and scene understanding (including different types of surfaces in agriculture).

Finally, OpenDR includes use cases in production – making use of capabilities such as vision-based human detection (detection of human presence and/or their activity and/or their gestures), learning robot motions or tasks from demonstration, and object manipulation for assembly tasks (for example pick-and-place and tracking). Read more about OpenDR, the project use cases and project partners in our previous blog post.

The OpenDR Deep Learning toolkit and the future 

The goal is for the open-source toolkit to enable a series of capabilities, such as ‘open drawers’ or ‘bring a bottle of water’, to be stored and activated with ease, improving human-robot interaction in research and increasing the application of robotics in our day-to-day lives.

In the future, PAL Robotics may offer the option of integrating some of these capabilities into its TIAGo and/or ARI robots. Both TIAGo and ARI offer the optional NVIDIA Jetson TX2 add-on, which provides the computational power needed by many deep-learning-based perception algorithms. The Jetson TX2 is a power-efficient embedded AI computing device with 8 GB of memory.

How to use the OpenDR toolkit 

OpenDR provides an easy-to-use Python interface, a C API for selected tools, a wealth of usage examples and supporting tools, as well as ready-to-use ROS nodes. 

A set of data generation utilities, a hyperparameter tuning tool, and a framework to easily apply RL both in simulation and real robotics applications are also included.

All methods and their parameters are thoroughly documented, demonstration examples are available to showcase their functionality, and continuous integration tests ensure both the consistency of the code and that no conflicts arise between the different tools. 

At the same time, OpenDR is built to support Webots, the open-source robot simulator, while also extensively following industry standards such as the ONNX model format.

You can download the toolkit through GitHub, pip, and Docker Hub. 
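For reference, the typical download commands look like the following. The GitHub organisation (`opendr-eu`) and the pip package name (`opendr-toolkit`) match the project’s public releases; the Docker image name shown is an assumption, so check the project’s README for the exact image and tags.

```shell
# Clone the source from GitHub (opendr-eu/opendr is the project repository)
git clone --depth 1 https://github.com/opendr-eu/opendr.git

# Or install the Python packages from PyPI
pip install opendr-toolkit

# Or pull a prebuilt Docker image (image name may vary; see the README)
docker pull opendr/opendr-toolkit
```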

TIAGo, the mobile manipulator used in OpenDR

PAL Robotics’ versatile TIAGo platform (which stands for Take It And Go) is ready to adapt to your research needs, and not the other way around. TIAGo includes perception abilities such as facial recognition, emotion recognition, and people tracking, along with autonomous navigation, localization, mapping, and obstacle avoidance, all accessed through ROS. For manipulation, TIAGo’s arm has a large workspace thanks to its lifting torso, allowing it to reach the ground as well as high shelves. The end effector can be either a gripper or a humanoid hand, and the two can be exchanged quickly.

Discover how TIAGo was used for virtual hugs.

To find out more about PAL Robotics and the different TIAGo customizations available, including TIAGo Titanium, TIAGo Steel, TIAGo Iron, and TIAGo ++, visit our website. You can also create your own TIAGo using your chosen configuration – find out more here. Don’t hesitate to get in touch with our experts to discuss how TIAGo can meet your research needs.