AI applications: at the core of our robots and collaborative projects

Artificial Intelligence is key to boosting interactions between robots and humans, and to the wider deployment of service robotics in society. With the help of AI, robots gain autonomy and improve their ability to help in all kinds of activities, through skills such as recognising and learning how to interact with humans, detecting and grasping objects, and managing their navigation paths. At PAL Robotics, AI is used by our robots in a whole range of projects and is an active part of our research for boosting HRI. Read on to find out more about some of our applications and our latest research in the field.

Enabling service robots to offer smoother, more natural interactions with their users is essential to increase their acceptability. However, natural interaction is challenging to achieve and requires a combination of complex social skills.

AI is at the core of our robots and collaborative projects and helps robots with the following capabilities:

  • facial recognition
  • emotion recognition
  • people tracking
  • fall detection
  • speech recognition
  • environment understanding
  • object recognition
  • object manipulation under uncertainty

At PAL Robotics, we equip our robots with dedicated NVIDIA GPUs for edge computing and continuously develop AI applications for them, many as part of EU research projects that enable new use cases. Here are some examples:

Modelling interaction with humans in the PERSEO project

The PERSEO project (European Training Network on PErsonalized Robotics as SErvice Oriented applications) aims to train a new generation of researchers and professionals to face the research challenges of the personal robot market. Personal robotics presents several research challenges, mainly related to the need for a high degree of personalization of robot behaviour to each user’s needs and preferences.

PERSEO’s research program is organised into three research themes that investigate the personalization of robot capabilities at different levels of human-robot interaction, namely “Physical”, “Cognitive”, and “Social”. This requires research skills ranging from computer science and AI to automation, ethics, and psychology.

Our colleague Lorenzo Ferrini is a PhD student funded by the PERSEO project. He works specifically on how robots can learn from humans, using AI techniques such as semantic modelling and reinforcement learning. He teaches robots (our mobile manipulator TIAGo and the ARI robot) to focus their attention on the important parts of a demonstration, combining this with language so that the robots learn how best to help.
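To give a flavour of the reinforcement-learning idea mentioned above, the sketch below shows a generic tabular Q-learning agent in Python. It is purely illustrative and is not the PERSEO project’s actual method; all names and parameters are hypothetical.

    # Generic tabular Q-learning sketch (illustrative only, not PERSEO's method).
    import random
    from collections import defaultdict

    q = defaultdict(float)          # q[(state, action)] -> estimated return
    alpha, gamma, epsilon = 0.1, 0.9, 0.2

    def choose_action(state, actions):
        """Epsilon-greedy: mostly exploit what was learned, sometimes explore."""
        if random.random() < epsilon:
            return random.choice(actions)
        return max(actions, key=lambda a: q[(state, a)])

    def update(state, action, reward, next_state, actions):
        """Standard Q-learning update towards reward + discounted best next value."""
        best_next = max(q[(next_state, a)] for a in actions)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])

In a learning-from-demonstration setting, the reward signal would typically come from how well the robot’s behaviour matches what the human demonstrated.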

AI applications in ambient assisted living for the SHAPES project

SHAPES stands for “Smart and Healthy Ageing through People Engaging in supportive Systems”. The project’s goal is to create an environment that supports the deployment of digital solutions for healthy and independent living for older individuals.

As part of this project, PAL Robotics and partners have adapted the social robot ARI to help with general wellbeing and to engage in social activities. ARI is designed to recognise faces and then provide personalised interaction to the users taking part in the pilot: reminding them to take their medication, reminding them of family events, and offering a series of games designed by specialists to improve memory and cognitive function, in addition to alerting caregivers or medical personnel of any issues.

During this project, ARI has performed tasks involving AI, Machine Learning, and Deep Learning, such as facial recognition, emotion recognition, people tracking, fall detection, speech recognition, environment understanding, object recognition, object tracking, and object manipulation.

AI applications in healthcare in project SPRING

The SPRING project aims to enable robust robot perception in complex, unstructured, and populated spaces, and to enable sensor-based and knowledge-based robot actions for multi-modal, multi-person interaction and communication. The technology is validated against the needs of gerontological healthcare, in a hospital environment.

For the SPRING project, project partners are working with PAL Robotics’ social robot ARI, as well as integrating their own software developments.

ARI’s face recognition, natural language processing, and expressive gaze make the robot a suitable tool for human assistance. ARI’s tasks in the hospital environment include welcoming patients and visitors in the waiting room, helping with check-in/out forms, assisting with medical appointments, directing patients to their appointments, and providing entertainment.

Recent research papers in HRI and social robotics 

Demonstration of state-of-the-art Social Intelligence on ARI and TIAGo platforms

ROS4HRI is a set of ROS-based modules that enables the creation of complex, interoperable pipelines for human-robot interaction, integrating heterogeneous AI techniques into a modular framework, and offers for the first time a standardised approach to building perception pipelines for human awareness. We will soon publish a research paper, written by PAL Robotics’ Séverin Lemaignan, Lorenzo Ferrini, Raquel Ros, and Luca Marchionni, about this novel open-source framework for Social Intelligence on our ARI and TIAGo robots.
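As a rough illustration of how an application could consume a ROS4HRI perception pipeline, the minimal sketch below subscribes to the list of currently tracked faces. It assumes a ROS 1 setup with the hri_msgs package installed; the topic name /humans/faces/tracked and the ids field follow the ROS4HRI specification as we understand it and may differ between releases.

    # Minimal sketch: listen to the faces currently tracked by a ROS4HRI pipeline.
    # Assumes ROS 1 and the hri_msgs package; names may vary across versions.
    import rospy
    from hri_msgs.msg import IdsList

    def on_tracked_faces(msg):
        # msg.ids holds the identifiers of the faces currently tracked;
        # per-face data (region of interest, landmarks, ...) is published
        # under /humans/faces/<id>/... sub-topics.
        rospy.loginfo("Currently tracked faces: %s", ", ".join(msg.ids) or "none")

    rospy.init_node("face_awareness_demo")
    rospy.Subscriber("/humans/faces/tracked", IdsList, on_tracked_faces)
    rospy.spin()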

How do Consumers’ Gender and Rational Thinking Affect the Acceptance of Entertainment Social Robots?

PAL Robotics’ Antonio Andriella worked on this paper about how the rapid ageing of the population, longer life expectancy, and older people’s desire to live independently have boosted the demand for companion and entertainment social robots. Based on technology acceptance models, a parsimonious model is proposed to estimate the intention to use this new social robot technology, and an analysis is performed to determine how consumers’ gender and rational thinking condition the antecedents of that intention.

Short-Term Human-Robot Interaction Adaptability in Real-World Environments

This paper, co-written by PAL Robotics’ Antonio Andriella, notes the increasing interest in recent years in deploying robotic systems in public environments, and the added complexity of dealing with untrained users. In this work, a planning-based Cognitive System is extended with adaptive capabilities and embedded in a TIAGo robot. The result is a system able to help a person complete a predefined game by offering varying degrees of assistance.

Towards using Behaviour Trees for Long-term Social Robot Behaviour

Séverin Lemaignan and Sara Cooper, from PAL Robotics’ HRI team, worked on this paper, which introduces a Behaviour Tree based design of long-term social robot behaviour in the context of the SHAPES project, using ROS-compatible libraries. It covers two types of behaviours: an idle behaviour in which the human approaches the robot and begins the interaction, and a second behaviour in which the robot actively navigates and searches for a specific user to deliver a reminder (a minimal sketch of this kind of tree follows below).
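The sketch below shows how the two behaviours described above could be arranged in a behaviour tree using py_trees, a ROS-compatible behaviour-tree library (assuming the py_trees 2.x API). The node names, spoken text, and detection logic are illustrative placeholders, not the actual SHAPES implementation.

    # Illustrative behaviour-tree sketch with py_trees 2.x (not the SHAPES code).
    import py_trees

    class WaitForPerson(py_trees.behaviour.Behaviour):
        """Idle branch: succeed once a person approaches the robot."""
        def update(self):
            person_nearby = False  # placeholder for a real perception check
            return (py_trees.common.Status.SUCCESS if person_nearby
                    else py_trees.common.Status.RUNNING)

    class NavigateToUser(py_trees.behaviour.Behaviour):
        """Active branch: navigate and search for a specific user."""
        def update(self):
            user_found = False  # placeholder for a real navigation/search result
            return (py_trees.common.Status.SUCCESS if user_found
                    else py_trees.common.Status.RUNNING)

    class Speak(py_trees.behaviour.Behaviour):
        """Say something (greeting or reminder) and finish immediately."""
        def __init__(self, name, text):
            super().__init__(name)
            self.text = text
        def update(self):
            print(f"[{self.name}] {self.text}")
            return py_trees.common.Status.SUCCESS

    # Idle interaction: wait for a person to approach, then greet them.
    idle = py_trees.composites.Sequence(name="IdleInteraction", memory=True)
    idle.add_children([WaitForPerson(name="WaitForPerson"),
                       Speak("Greet", "Hello! How can I help you?")])

    # Reminder delivery: actively look for a specific user, then remind them.
    reminder = py_trees.composites.Sequence(name="DeliverReminder", memory=True)
    reminder.add_children([NavigateToUser(name="FindUser"),
                           Speak("Remind", "It is time for your medication.")])

    # The reminder branch takes priority; a real tree would add a guard that
    # only activates it when a reminder is actually due.
    root = py_trees.composites.Selector(name="LongTermBehaviour", memory=False)
    root.add_children([reminder, idle])

    tree = py_trees.trees.BehaviourTree(root)
    tree.tick()  # on a real robot this would be ticked periodically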

Kinematically-consistent Real-time 3D Human Body Estimation for Physical and Social HRI

This paper, written by Lorenzo Ferrini and Séverin Lemaignan, presents a software tool, fully integrated with ROS, that enables robots to perceive people’s full body in 3D. The system works either with a simple RGB camera or with an RGB-D camera for better absolute 3D position estimation. In the context of Socially Assistive Robots (SARs), and more generally in Human-Robot Interaction, it is fundamental for a robot to be aware of the 3D pose of the human it is interacting with. Obtaining good and efficient results in this direction is a stepping stone toward complex supportive and collaborative tasks.
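To illustrate why a depth camera improves absolute 3D position estimation, the small sketch below back-projects a 2D body keypoint into 3D using the standard pinhole camera model. This is generic textbook geometry, not the paper’s actual pipeline, and the intrinsic values in the example are illustrative.

    # Back-project a 2D keypoint (u, v) with a depth reading into a 3D point
    # in the camera frame, using the pinhole model (generic geometry, not the
    # paper's pipeline).
    def keypoint_to_3d(u, v, depth_m, fx, fy, cx, cy):
        """Return (x, y, z) in metres from pixel coordinates, depth and intrinsics."""
        z = depth_m
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return x, y, z

    # Example with illustrative intrinsics for a 640x480 camera:
    print(keypoint_to_3d(u=400, v=300, depth_m=1.5, fx=525.0, fy=525.0, cx=319.5, cy=239.5))
    # -> roughly (0.23, 0.17, 1.5): the keypoint sits about 1.5 m in front of the camera.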

Read the interview with Séverin Lemaignan on social robots for stronger bonds here.

ARI and TIAGo social robots for HRI 

ARI combines Service Robotics and Artificial Intelligence in a single platform specifically designed for HRI. ARI speaks 30 languages, shows information and applications on the touchscreen on the robot’s chest for user interaction and multimedia content, and has a voice and facial recognition system.

TIAGo, the mobile manipulator, fits and adapts to any research need, not the other way around. The robot’s HRI abilities, including a voice and facial recognition system, support for multiple languages, and a touchscreen, make TIAGo well suited for applications in the healthcare sector. The robot has been chosen as a research platform by several European projects.

To read more about PAL Robotics’ service robots and their capabilities in AI, visit our website, where you can also read our previous blogs on AI in service robotics.

Remember to check our social media channels regularly to hear more about our AI applications and latest research. To find out more about the robotic solution that may be right for you, don’t hesitate to get in touch with us.