PAL Robotics: Integration of Deep Learning Tools on the TIAGo Robot for the Healthcare Use Case

5 July 2022

At PAL Robotics, the team has been integrating deep learning tools on the TIAGo robot (whose name stands for Take It And Go), some of which are ready to be deployed in the healthcare use case. TIAGo is taking part in this use case, which aims to test various capabilities and assess their potential for the future. The main objective of the healthcare use case is to build, on the TIAGo robot manipulator, a versatile robotic assistant that helps with the wide range of tasks a patient performs on a daily basis.

Examples of these daily tasks are:

  • detection and identification of daily life objects (for example bottles of water, clothes, drawers, chairs, and tables)
  • person detection (for example recognising the identity of people appearing in a scene)
  • speech recognition of basic action commands (for example assigning tasks to the robot using simple speech commands)

In the healthcare use case, TIAGo serves as an end-user assistant by receiving visitors and taking and delivering objects to them. To do all this, TIAGo needs to have incorporated human presence recognition, recognition of activities and vocal instructions, and detection of emotional states to enable a person-centric human interaction.

In this use case, the TIAGo robot will monitor the healthcare environment in order to detect people, and in particular to look out for anyone who has fallen, calling for help as necessary. If the robot encounters a standing person, it will approach, attempt to recognise the person’s name, activity, and basic emotional state, and then act depending on the situation.

In addition, another main aim is to endow TIAGo with a series of object detection and manipulation capabilities, such as ‘open drawers’ or ‘bring a bottle of water’, that the robot can remember and activate on demand. This improves human-robot interaction in research and increases the potential for robotic platforms such as TIAGo to help more in our everyday lives.

The team at PAL Robotics is using multiple capabilities from the recently released OpenDR toolkit on the TIAGo robot to get it ready for the healthcare use case. These include:

  • Pose Estimation
  • 2D Object Detection
  • Face Detection
  • Panoptic Segmentation
  • Face Recognition
  • Semantic Segmentation
  • RGBD Hand Gesture Recognition
  • Heart Anomaly Detection
  • Video Human Activity Recognition
  • Landmark-based Facial Expression Recognition
  • Skeleton-based Human Action Recognition
  • Speech Command Recognition
  • Voxel Object Detection 3D
  • AB3DMOT Object Tracking 3D
  • FairMOT Object Tracking 2D
  • Deep Sort Object Tracking 2D

We have tested these deep learning tools on the TIAGo robot and will use them in the upcoming healthcare use case. The testing and the feedback being gathered are essential to make these tools more useful for robotics.

Person/Face Recognition

One of the deep learning tools implemented on TIAGo is person/face recognition, which is designed to recognise the identity of a person. The algorithm should also be able to decide whether a person is known or not. You can see this tool being tested on the TIAGo robot in this video:

Video: https://www.youtube.com/watch?v=98BDEv-ZX5I&t=7s 

Figure 1: The facial recognition tool being used on TIAGo robot at PAL Robotics
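As an illustration of that known/unknown decision, a minimal sketch might compare a query face embedding against a database of reference embeddings and treat best matches below a similarity threshold as unknown. This is not the OpenDR toolkit’s actual API; the function names and the 0.6 threshold are illustrative assumptions:

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def identify(embedding, database, threshold=0.6):
    """Return the best-matching identity, or None if the person is unknown.

    `database` maps names to reference embeddings; `threshold` is an
    illustrative cutoff below which the face is treated as unknown.
    """
    best_name, best_score = None, -1.0
    for name, reference in database.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```

In practice the embeddings would come from a trained face-recognition network, and the threshold would be tuned on validation data to balance false accepts against false rejects.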

In this task, the robot is asked to find a person who is in its database and at a particular location. The behaviour it follows is:

  • Roam the map
  • Detect and recognise the people it encounters
  • When given an instruction through a command combination like <fetch, name, (location)>, understand its purpose
  • Go to the specified location; if none is given, move between specific waypoints and search there
  • Once the person is found, ask them to follow and return to the previous person. If they are not found, call out for them and try again at different waypoints. If the second attempt also fails, return to the previous person and report that the search did not succeed.
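The search behaviour above can be sketched as plain Python control flow. The command format, function names, and the `search` callback are illustrative assumptions, not the robot’s actual interfaces:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class FetchCommand:
    """Parsed form of a <fetch, name, (location)> instruction."""
    name: str
    location: Optional[str] = None


def parse_command(tokens):
    """Parse a token list like ["fetch", "alice", "kitchen"] into a command.

    Returns None for anything that is not a fetch instruction.
    """
    if len(tokens) < 2 or tokens[0] != "fetch":
        return None
    location = tokens[2] if len(tokens) > 2 else None
    return FetchCommand(name=tokens[1], location=location)


def fetch_person(command, waypoints, search, max_attempts=2):
    """Drive the search behaviour described above.

    `search(place, name)` is a hypothetical callback that returns True if
    the named person was found at `place`. The robot tries the specified
    location first (if any), falls back to sweeping the waypoints, and
    gives up after `max_attempts` passes.
    """
    places = [command.location] if command.location else list(waypoints)
    for _ in range(max_attempts):
        for place in places:
            if search(place, command.name):
                return f"found {command.name} at {place}"
        places = list(waypoints)  # later attempts sweep the waypoints
    return f"could not find {command.name}"
```

On the real robot, the `search` callback would wrap navigation to the waypoint plus the person/face recognition step, and the return value would trigger the follow-and-report behaviour.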

PAL Robotics’ TIAGo platform includes perception abilities such as facial recognition, emotion recognition, and object and people detection. TIAGo has autonomous navigation and localization, mapping and obstacle avoidance, all accessed through ROS.

NVIDIA Jetson is one of a number of platforms that enable AI in service robotics. At PAL Robotics, we work with the NVIDIA® Jetson™ TX2, which provides speed and power efficiency in an embedded AI computing device, and through it we have worked to provide the TIAGo robot with perceptual competencies. In OpenDR we are using the newer NVIDIA Jetson Xavier NX, an external board connected to the TIAGo robot that runs the AI programs and provides more processing power.

In summary, we are very happy to provide the consortium with feedback on capabilities such as person/face recognition to help improve the performance of these deep learning tools, and to integrate them into the upcoming healthcare use case, improving the ability of robots to help us more and more in our daily lives.

Authored by: Lorna Mckinlay, Julia Atsu Romero, Thomas Peyrucain, Gizem Bozdemir

PAL Robotics, Spain