In recent research, the team at PAL Robotics and our French partners, TOWARD and the Dynamograde lab, used a new framework to enable the biped robot TALOS to navigate and climb stairs of up to 15 cm, without prior knowledge of the environment, for the first time. Read on for our interview with Pierre Fernbach to find out more about this research, which aims to help humanoids develop the skills needed for future work in industry.

TALOS navigating with limited knowledge of the environment and limited instructions

Pierre Fernbach, Software Engineer at PAL Robotics’ French partner TOWARD, explained this project and its importance for the progress and development of humanoid robots, telling us, “This project was the first time we have used this approach on life-sized robots, such as PAL Robotics’ humanoid biped, TALOS. The project started about six months ago, but it is based on years of prior development and research. We wanted to work on generalised locomotion – for example, to request a robot to navigate to a certain point and have it reach that point without providing it with further information, even if it means the robot needs to climb stairs, open doors, etc., to reach that point. For humanoid robots to help in industrial or domestic environments, for example, this is a very beneficial goal.”

Pierre added, “Research papers and their approaches, including ‘SL1M: Sparse L1-norm Minimization for contact planning on uneven terrain’, helped to inspire this research.”

An additional challenge is that for legged robots such as TALOS, following a pre-planned trajectory can present complexities due to the possible accumulation of errors in the control and estimation processes. To cope with these, the robot needs to be able to re-evaluate its plans. Here, the team worked with a new framework that helps the robot to do just that by creating a local map of the environment. For this, the ground in front of the robot is captured using the LiDAR sensor on the robot’s base.

The robot’s base state is then reconstructed using the IMU and kinematic odometry, and an elevation map is built from the LiDAR data. A contact planner then computes a path through the measured environment. From the resulting contact sequence, a trajectory for the next few seconds of the walk is computed using a model predictive controller (MPC).
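To make the mapping step more concrete, the sketch below shows one simple way a 2.5D elevation map can be built from LiDAR points, assuming the points are already expressed in the robot’s odometry frame. This is an illustrative sketch only; the grid size, resolution and function name are assumptions, not the actual TALOS perception code.

```python
import numpy as np

def build_elevation_map(points, resolution=0.02, size=2.0):
    """Build a simple 2.5D elevation grid from LiDAR points.

    points     : (N, 3) array of x, y, z coordinates in the robot's odometry
                 frame (x forward, y left, z up) -- an assumption for this sketch.
    resolution : cell edge length in metres.
    size       : side length of the square map in front of the robot (metres).
    """
    n_cells = int(size / resolution)
    elevation = np.full((n_cells, n_cells), np.nan)

    # Keep only points that fall inside the mapped area in front of the robot.
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    mask = (x >= 0.0) & (x < size) & (np.abs(y) < size / 2)
    ix = (x[mask] / resolution).astype(int)
    iy = ((y[mask] + size / 2) / resolution).astype(int)

    # Each cell keeps the highest point observed in it, a common convention
    # for elevation maps used in legged locomotion.
    for i, j, height in zip(ix, iy, z[mask]):
        if np.isnan(elevation[i, j]) or height > elevation[i, j]:
            elevation[i, j] = height
    return elevation
```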

Finally, the joint torques are computed using inverse dynamics and sent to the robot’s actuators. With this framework, the biped robot TALOS was able, for the first time, to climb 15 cm stairs without prior knowledge of the environment.
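Putting these stages together, the control loop can be summarised schematically as below. Every function here is a placeholder standing in for one of the blocks described above (it reuses the build_elevation_map sketch from the previous example); the names, return values and horizon length are assumptions for illustration, not the interfaces actually used on TALOS.

```python
import numpy as np

# Placeholder stand-ins for the pipeline blocks described above.
def estimate_base_state(imu_sample, joint_state):
    # Fuse IMU data with leg kinematic odometry (placeholder).
    return {"position": np.zeros(3), "orientation": np.array([0.0, 0.0, 0.0, 1.0])}

def plan_contacts(elevation_map, base_state, goal):
    # Return a short footstep sequence towards the goal (placeholder).
    return [("left", 0.2, 0.1, 0.0), ("right", 0.4, -0.1, 0.15)]

def mpc_solve(contact_sequence, base_state, horizon_s):
    # Compute reference trajectories for the next few seconds (placeholder).
    return {"com": np.zeros((100, 3)), "contacts": contact_sequence}

def inverse_dynamics(reference, base_state, joint_state):
    # Map the reference trajectory to joint torques (placeholder).
    return np.zeros(len(joint_state))

def control_cycle(point_cloud, imu_sample, joint_state, goal):
    """One pass through the locomotion pipeline described in the text."""
    base_state = estimate_base_state(imu_sample, joint_state)           # IMU + kinematic odometry
    elevation_map = build_elevation_map(point_cloud)                    # sketch from the previous example
    contact_sequence = plan_contacts(elevation_map, base_state, goal)   # contact planner
    reference = mpc_solve(contact_sequence, base_state, horizon_s=2.0)  # model predictive controller
    return inverse_dynamics(reference, base_state, joint_state)         # torques sent to the actuators
```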

Pierre told us, “For this project we have been collaborating with our research partner, LAAS-CNRS, as the contact planner was originally developed by them.” Pierre continued, “In order to control TALOS in torque, we have been using inverse dynamics and a stabiliser for several years already, with good results. However, to use these for dynamic motions, both on flat floors and on complex terrain, we implemented an MPC method that generates the reference trajectories that we send to the inverse dynamics.”


Pierre explained, “As shown in the video, this method worked well and allowed us to generate different kinds of dynamic motions in challenging environments, but for the complex terrains we needed to provide the positions of the footsteps. So the next step was to compute these footsteps automatically. For this we needed perception: the robot already had LiDARs, but we needed software able to correctly extract possible contact surfaces from the sensor data. We implemented our solution based on state-of-the-art open-source libraries for perception and mapping. We also used a state-of-the-art contact planner. After this, we had the footsteps and could close the loop with the method mentioned previously.”
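As an illustration of what “extracting possible contact surfaces” can involve, the sketch below marks cells of an elevation grid (such as the one built in the earlier example) whose local neighbourhood is flat enough to step on; contiguous flat regions would then be grouped into candidate surfaces for the contact planner. The thresholds and function name are assumptions for this sketch, not the libraries actually used in the project.

```python
import numpy as np

def find_flat_cells(elevation, patch=5, max_std=0.01):
    """Mark grid cells whose local neighbourhood is flat enough to step on.

    elevation : 2.5D elevation grid (NaN where no data), e.g. from the
                build_elevation_map sketch above.
    patch     : neighbourhood size, in cells, used to evaluate flatness.
    max_std   : maximum allowed height standard deviation within a patch (m).
    """
    n, m = elevation.shape
    half = patch // 2
    flat = np.zeros((n, m), dtype=bool)
    for i in range(half, n - half):
        for j in range(half, m - half):
            window = elevation[i - half:i + half + 1, j - half:j + half + 1]
            if np.isnan(window).any():
                continue  # incomplete data: do not propose a contact here
            if window.std() < max_std:
                flat[i, j] = True
    return flat
```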

Illustration of “generalised locomotion” with TALOS in the experimental room of the LAAS-CNRS

How this research contributes to greater autonomy of humanoid robots

Pierre told us, “We believe projects such as this one will contribute to greater autonomy of humanoid robots, with less need to define exactly how the robot will be used.” 

“In relation to this work, we also recently made a video with TALOS in which the robot walks up stairs; we then move the stairs, and the robot has to adapt to their new position. Our goal is to give general tasks to humanoid robots such as TALOS and not tell the robot exactly how to do them.”

Pierre added, “We have a software architecture with blocks connected together via ROS for solving challenges such as these, where humanoids are given general tasks and not told how to do them. However, if a customer has a specific need, we can adapt this software architecture for them. The customer can also replace any of the blocks in order to experiment with their own methodology.”

The figure shows the final “autonomous” version of the architecture used at the end of the project. The “Stabiliser”, “Inverse Dynamics” and “Estimator” blocks already existed and were made by PAL Robotics; the “MPC” block was made by TOWARD; and the “Perception” block was made in collaboration between PAL Robotics and TOWARD. For the “Contact planner” block we used an open-source package.
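To give a flavour of what replacing one of these blocks looks like in practice, here is a minimal sketch of a ROS node that could stand in for the “Contact planner” block, subscribing to the perception output and publishing footsteps for the MPC. The topic names and message types are assumptions chosen for illustration; they are not the actual interfaces used on TALOS.

```python
#!/usr/bin/env python
# Minimal sketch of swapping one block of the architecture through ROS.
# Topic names and message choices are illustrative assumptions only.
import rospy
from sensor_msgs.msg import PointCloud2
from nav_msgs.msg import Path

class CustomContactPlanner(object):
    def __init__(self):
        # Publish the planned footstep sequence for the MPC block downstream.
        self.footstep_pub = rospy.Publisher('/contact_planner/footsteps',
                                            Path, queue_size=1)
        # Listen to the perception block's output.
        rospy.Subscriber('/perception/terrain_cloud', PointCloud2,
                         self.terrain_callback, queue_size=1)

    def terrain_callback(self, cloud_msg):
        # Run a custom contact-planning method here and publish the result.
        footsteps = Path()
        footsteps.header = cloud_msg.header
        self.footstep_pub.publish(footsteps)

if __name__ == '__main__':
    rospy.init_node('custom_contact_planner')
    CustomContactPlanner()
    rospy.spin()
```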

TALOS locomotion integrated with ROS to be run fully onboard the robot 

Pierre told us, “The project is now complete and the software is operational; however, we will of course continue working on it to make a second version. The walking work will also be applied to our new biped robot, Kangaroo, in the future.”

Pierre concluded, “With this project, we have worked on the implementation of an architecture for TALOS locomotion in unknown environments that is fully integrated with ROS and can be run entirely onboard the robot.”

Future work in Whole-Body Model Predictive Control for TALOS and Kangaroo

Regarding future projects, Pierre explained, “We also plan to work on a project in Whole-Body Model Predictive Control – for both TALOS and Kangaroo robots. This will be a way to generate more optimal and dynamic motions using the whole bodies of the robots.” 

“In terms of future work on TALOS only, we would like to focus on making TALOS’ perception block more robust, improving the robot base estimation with visual odometry, fusing the robot’s head and waist sensors, and working on global path planning.”

We would like to thank Pierre Fernbach for taking the time to talk with us! To learn about the capabilities of our advanced biped research robots TALOS and Kangaroo, take a look at our website. Finally, if you would like to ask us more about TALOS or Kangaroo as research platforms for your organisation, do not hesitate to get in touch with us. For more news on our research and robots, follow our blog on robotics and technology!