TIAGo Mobile Manipulator Robot: a platform for navigation
Whether your research platform needs include manipulation, navigation or perception, our versatile mobile manipulator TIAGo (which stands for Take It And Go) is ready to adapt to your research needs, not the other way around. In this series of blog posts on TIAGo for research, we start with how to use TIAGo for research focused on navigation.
The robot is a mobile manipulator well suited to applications in the healthcare sector and light industry, thanks in particular to its modular design, as demonstrated by the several European projects that have chosen it as a research platform. TIAGo is suitable for research in areas including Factories of the Future, Smart Cities and IoT, as well as Ambient Assisted Living.
- TIAGo has autonomous navigation including mapping, localization and obstacle avoidance all accessed through ROS.
- In terms of manipulation, TIAGo’s arm has a large manipulation workspace thanks to the lifting torso, enabling it to reach the ground as well as high shelves. The end effector can be a gripper or a humanoid hand, and both can be quickly exchanged to perform various manipulation tasks.
- TIAGo includes perception abilities such as facial recognition, emotion recognition and people tracking.
Looking more specifically at navigation, here are some of the navigation features of TIAGo and how to use them.
Check out our bi-manual omnidirectional mobile manipulator TIAGo Omni++.
To start mapping in ROS, switch the robot to mapping mode, start RViz with the navigation configuration, and move the base with the joystick to take a tour of the area you want to map. Mapping normally needs to be done only once, as TIAGo’s navigation system adapts to small changes that occur over time; however, a new map of the same environment should be created when there are major changes to the layout.
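The mapping workflow can be sketched as a terminal session. The service and package names below are based on PAL's public TIAGo tutorials, but they may differ between software versions, so treat the exact names and the map directory as assumptions to check against your robot's documentation:

```shell
# 1. Switch the navigation state machine to mapping mode
rosservice call /pal_navigation_sm "input: 'MAP'"

# 2. Start RViz with a navigation configuration to watch the map grow
#    (the .rviz config path is an example and may vary per release)
rosrun rviz rviz -d `rospack find tiago_2dnav`/config/rviz/navigation.rviz

# 3. Drive the robot around with the joystick until the area is covered,
#    then save the map (the directory name 'my_office' is an example)
rosservice call /pal_map_manager/save_map "directory: 'my_office'"

# 4. Switch back to localization mode to start using the new map
rosservice call /pal_navigation_sm "input: 'LOC'"
```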
Localization is provided by the ROS package amcl, which implements a particle filter to track the pose of the robot against the current map. When the amcl filter is uncertain about the actual location of the robot, the cloud of particles spreads or even splits into several clouds. Conversely, when the current laser scan closely matches the local part of the map, uncertainty decreases and the particles converge, providing a more accurate localization.
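The spread-and-converge behavior described above can be illustrated with a toy one-dimensional particle filter. This is a didactic sketch of the idea behind amcl, not its actual code: motion noise makes the particle cloud spread, and resampling against a measurement that matches well makes it concentrate again.

```python
import math
import random

def predict(particles, motion, noise=0.2):
    """Move every particle and add motion noise (uncertainty grows)."""
    return [p + motion + random.gauss(0.0, noise) for p in particles]

def update(particles, measurement, sensor_noise=0.5):
    """Resample particles in proportion to how well they explain the
    measurement (uncertainty shrinks when the match is good)."""
    weights = [math.exp(-((p - measurement) ** 2) / (2 * sensor_noise ** 2)) + 1e-12
               for p in particles]
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
# Start with global uncertainty: particles spread over the whole "map".
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
for step in range(20):
    particles = predict(particles, motion=0.1)
    particles = update(particles, measurement=5.0 + 0.1 * step)

# After repeated matching measurements, the cloud has collapsed
# around the true pose.
spread = max(particles) - min(particles)
```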
Send a navigation goal by choosing the target map coordinates, for example using the 2D Nav Goal button in RViz. Extensive documentation about how navigation works and its parameters is available online.
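A goal can also be sent programmatically. In ROS, the usual route is an actionlib client for move_base, and the one non-obvious step is that a 2D goal is given as (x, y, yaw) while ROS poses store orientation as a quaternion. A minimal, pure-Python helper for that conversion (equivalent to `tf.transformations.quaternion_from_euler(0, 0, yaw)` for a rotation about the z axis only):

```python
import math

def yaw_to_quaternion(yaw):
    """Return (x, y, z, w) for a rotation of `yaw` radians about the z axis."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# Example: a goal facing the +y direction of the map (yaw = pi/2).
qx, qy, qz, qw = yaw_to_quaternion(math.pi / 2)
# In a real ROS node, these four values would fill target_pose.pose.orientation
# of a move_base_msgs/MoveBaseGoal before sending it with an actionlib
# SimpleActionClient connected to the move_base action server.
```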
The advanced navigation TIAGo package includes the following enhanced functionalities implemented by PAL Robotics:
- Navigation to Points Of Interest (POIs) or to a sequence of POIs
- Detection of Regions Of Interest (ROIs, also called Zones of Interest) for topological localization
- Avoidance of Virtual Obstacles (forbidden regions)
- Obstacle avoidance using laser scan and point cloud from RGBD fusion
- Map Editor RViz plugin to graphically perform the following operations:
  - Download/upload maps from/to the robot
  - Define POIs and sequences of POIs, ROIs and Virtual Obstacles
  - Command the base with a virtual joystick
The laser scanner is used by default to detect obstacles so that navigation can avoid them; however, this is less comprehensive than using Advanced Navigation. Advanced Navigation extends these capabilities: when the robot is sent to a map point, the torso raises and the head lowers so that the RGB-D camera can detect obstacles in front of the robot that the laser scanner may miss.
Discover how to easily simulate TIAGo in ROS.
Map points of interest
- Points of Interest (POIs) represent specific poses (position and orientation) on the map, identified by a unique name.
- Using the go_to_poi action the robot can be sent to any defined POI by name.
- POIs can be grouped in POI lists or waypoint lists, an ordered series of POIs that the robot will follow as a path.
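One way to picture POIs and waypoint lists is as named 2D poses and ordered sequences of their names. The zone names and coordinates below are made up for illustration; on the robot, POIs are defined graphically with the Map Editor plugin and followed via the go_to_poi action.

```python
import math

# Each POI is a named pose (x, y, yaw) on the map; a waypoint list is an
# ordered series of POI names the robot visits in turn.
POIS = {
    "entrance":   (0.0, 0.0, 0.0),
    "kitchen":    (4.0, 3.0, 1.57),
    "livingroom": (8.0, 3.0, 3.14),
}

def resolve_waypoints(names):
    """Turn a list of POI names into the sequence of poses to visit."""
    return [POIS[name] for name in names]

def straight_line_length(poses):
    """Lower bound on travel distance: straight segments between POIs."""
    return sum(math.hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(poses, poses[1:]))

tour = resolve_waypoints(["entrance", "kitchen", "livingroom"])
print(straight_line_length(tour))  # 9.0
```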
Zones of interest and virtual obstacles
- Zones of Interest (ZOIs) are polygons that define a topological area. These are useful to understand where the robot is on the map, for example “the kitchen” or “the living room”, without needing the exact map coordinates.
- Similarly Virtual Obstacles (VOs) are polygons that define a “forbidden” area for the robot.
- VOs are transformed into obstacles in the robot’s costmaps, acting as walls and preventing the robot from getting into or out of that area.
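Topological localization with ZOIs boils down to a point-in-polygon test: given the robot's (x, y) estimate from amcl, find which named polygon contains it. A ray-casting sketch, with made-up zone names and coordinates:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count how many polygon edges a ray going
    right from (x, y) crosses; an odd count means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Example zones: two rectangular rooms on the map (coordinates in meters).
ZONES = {
    "kitchen":     [(0, 0), (4, 0), (4, 4), (0, 4)],
    "living room": [(4, 0), (9, 0), (9, 4), (4, 4)],
}

def locate(x, y):
    """Return the name of the zone containing the robot, if any."""
    for name, polygon in ZONES.items():
        if point_in_polygon(x, y, polygon):
            return name
    return None

print(locate(2.0, 2.0))  # kitchen
print(locate(6.0, 1.0))  # living room
```

A Virtual Obstacle uses the same polygon representation, but instead of labeling the area, its cells are marked as lethal in the costmaps so the planner treats the polygon's border as a wall.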
- Using the RGB-D camera, TIAGo is able to avoid obstacles that are undetectable by the laser (such as tables or chairs).
- To do this, the point cloud provided by the camera is continuously fused with the laser scan.
- While navigating with the RGB-D camera, the head_manager node is stopped to prevent undesired motions of the head. In addition, TIAGo adopts a specific posture (semi-raised torso and head looking down) to ensure that obstacles fall within the camera’s field of view. Once the goal is reached, the head_manager node is enabled again.
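The reason the camera catches what the laser misses can be shown with a small sketch: the laser scans a single horizontal plane, while the camera's point cloud covers a height range, so projecting 3D points within the robot's height band onto the ground plane yields 2D obstacle cells for the costmap. The heights and resolution below are illustrative assumptions, not TIAGo's actual parameters.

```python
LASER_HEIGHT = 0.10   # m, approximate scan plane of the base laser (assumed)
ROBOT_HEIGHT = 1.20   # m, obstacles below this could collide with the robot
RESOLUTION = 0.05     # m per costmap cell

def cloud_to_obstacle_cells(points):
    """Project (x, y, z) points in the robot's height band onto grid cells."""
    cells = set()
    for x, y, z in points:
        if 0.02 < z < ROBOT_HEIGHT:   # ignore the floor and overhead points
            cells.add((int(x / RESOLUTION), int(y / RESOLUTION)))
    return cells

# A tabletop at 0.75 m is invisible to a laser scanning at 0.10 m,
# but its points still land in the costmap when the camera sees it:
table_points = [(1.0, 0.0, 0.75), (1.0, 0.1, 0.75), (1.1, 0.0, 0.75)]
cells = cloud_to_obstacle_cells(table_points)
laser_sees = any(abs(z - LASER_HEIGHT) < 0.02 for _, _, z in table_points)
```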
TIAGo is used in European research projects such as OpenDR (creating an open deep learning toolkit), SHAPES (deploying digital solutions to support healthy and independent living for older individuals), SeCOIIA (secure digital transition of manufacturing processes), RobMoSys (building an open and multi-domain European robotics software ecosystem), SIMBIOTS (introduction of robotics in new industrial processes and applications) and NeuTouch (improving artificial tactile systems and building robots that can help humans in daily tasks). Recently, TIAGo was also used in EnrichMe (supporting older people in Ambient Assisted Living), SACRO (developing a semi-autonomous service robot for the care market) and Co4Robots (for collaborative robots in industrial settings).
TIAGo has ROS tutorials available to get started with the robot’s open-source simulation, which you can download online. Stay tuned for more posts in this TIAGo for research series. To find out more about PAL Robotics and the different TIAGo customizations available, visit our blog. Don’t hesitate to get in touch with our experts to discuss how our mobile manipulator robot can meet your research needs.