IEEE Humanoids 2017, the future of humanoid robots in Birmingham


One of the pictures PAL Robotics submitted to the Humanoids Photo Contest. Click and vote!

This November is an exciting month for us and for our robots! It began last Wednesday, when our CEO Francesco Ferro was invited to the European Parliament as an expert to take part in the “Transferring Robots to the world of SMEs” conference.

Now we’ve got a date with the humanoid robotics community next week in Birmingham, at Humanoids 2017! Top stakeholders and researchers in the field will take part in three intensive days of workshops and debates that dig into the capabilities, potential and challenges of today’s humanoid robots. Check out the full programme here!

Bipedal humanoids have always been at the heart of PAL Robotics, which is why we are Gold Sponsors of Humanoids 2017. We endorse this ambitious robotics field because it contributes enormously to the robotics state of the art and opens multiple possibilities for a future of service robots.

Where to find us at Humanoids 2017

For the first time, TALOS is joining the team travelling to the UK, along with REEM-C and TIAGo. All of them have new demos ready to be shown at the conference exhibition! The high-performance TALOS is the next step PAL Robotics has taken in bipedal robots, building on the proven reliability of the equally human-sized REEM-C.

Where do we envision humanoid robots? What will they be capable of? The workshop “Humanoid Robot for Real Applications Use”, organized by F. Kanehiro and A. Kheddar on Wednesday 15th, will raise a debate on the next steps needed to introduce collaborative humanoids into homes, industries and public spaces. As experts in bipeds and robust walking control, our CEO will share our know-how with the workshop participants.

One of the papers to be presented on Friday 17th (15h) is “TALOS: A New Humanoid Research Platform Targeted for Industrial Applications”, by Olivier Stasse et al. The paper describes TALOS’ kinematic design, which enables the robot to adapt to a human environment and perform industrial tasks. It also highlights high-quality components such as the torque sensors in every joint and the fast EtherCAT communication bus inside.

We are truly motivated to attend Humanoids 2017 and we hope to see all of you there!


Co4Robots: diverse robots, united as one team

Over the last few years, the robotics market has filled up with many new robot models. This trend leads us to imagine a future with multiple robots helping us around and performing a rich variety of tasks, which sounds amazing. But it also raises a relevant challenge: how will all these diverse robots coordinate?

Saving time and optimizing resources is best achieved if all the platforms are integrated into a multi-robot system, even if there are different developers behind each robot. This is the overall goal of the H2020 EU project Co4Robots. The work will focus on decentralizing the control and coordination of heterogeneous robots that should interact with each other and with humans.


Consortium meeting & Integration week at PAL Robotics

The first developments with TIAGo were a success! All the Co4Robots partners met in Barcelona from 16th to 20th of November to work intensively on the project. After the consortium meetings on the first days, a TIAGo robot was used to perform some tests on the first milestone of the project: collaborative grasping between the robot and a person.

It was a pleasure to receive the consortium partners from KTH, Bosch, NTUA, UGOT and FORTH at PAL Robotics’ offices, sharing different experiences and building this project together! We look forward to the next steps of Co4Robots!


ROSCon & IROS: Thank you, Vancouver!

This year, ROSCon consolidated itself more than ever as the landmark event for the Robot Operating System (ROS) developer community – an impression reinforced by a record-breaking number of partners and sponsors, PAL Robotics among them. As usual, the ROSCon organizers have conveniently made all the conference talks available online! These are very useful state-of-the-art resources for all developers, including those who couldn’t make it to Vancouver.

PAL Robotics’ software developments and TALOS Space Robotics Challenge

Our CTO Luca Marchionni outlined PAL Robotics’ software development and how we use ROS tools in his ROSCon talk. One of the highlights was Whole Body Control, a software system customized by our team to develop high-level applications by abstracting away the complexity of a robot, especially when it has a large number of degrees of freedom (DoF). This is the case for robots such as TALOS, REEM-C or TIAGo.

In fact, this software was used extensively by Team Olympus Mons, a team made up of current and former PAL members – including Marchionni – who won 3rd place at the NASA Space Robotics Challenge. Thanks to our software’s modularity, PAL Robotics spent only two days porting the developments achieved for the Space Robotics Challenge to the simulation of our latest humanoid, TALOS, as Marchionni showed in a video during the talk. And very soon we will see the actual TALOS humanoid robot performing those tasks!

TIAGo adventures at IROS & ROSCon

When we bring some of our TIAGo robots to an event, they always have such a productive time! After visiting Vancouver and listening to the ROSCon talks, TIAGo also spent five days enjoying the 30th anniversary of IROS.

IROS was a good occasion to share our latest developments; our CEO gave a talk within the frame of the RSJ-IAC Lunch in front of an attentive audience. Some of the latest advances we presented are related to the European projects we are part of, such as Co4Robots, EnrichMe, SocSMCs, RobMoSys or GrowMeUp.

In short, IROS & ROSCon remain an absolute must for learning about the latest cutting-edge developments in robotics. The PAL Robotics team got to speak with many researchers and companies that are doing a great job. Our feeling was that the whole robotics community is working hard to get these robotic platforms ready to enhance people’s quality of life. It was outstanding, and we are extremely glad to be part of it.

IROS has just ended and we are already looking forward to next year’s IROS 2018 in Madrid. Meanwhile, do not miss our next adventures over in Birmingham for Humanoids 2017!


SocSMCs: making robots interact without words

When we shake hands with someone or help them move furniture, our body is calculating much more than we realise, without us ever being consciously aware of it. The direction in which our hand is moving, how much force is needed, how we keep our balance in the meantime… These are automatic decisions our body takes every day, and while it does so, we don’t notice a thing.

Despite its complexity, this is easy for people. With robots, however, it is a whole different matter. Robots need all these calculations to be programmed in order to behave in a way we find “normal” or “socially acceptable”, even at what we would consider a very basic level of motion control. They have neither intuition nor social education as a basis for their actions. But such interactive behaviour is badly needed if we want robots to adapt and help in our lives.

One of the EU research projects we are involved in, SocSMCs, focuses on studying socially relevant action-effect contingencies in human-human and human-robot interaction scenarios. With the goal of achieving socially competent robotics technology, some of the SocSMCs tests are being carried out with PAL Robotics’ humanoid robot REEM-C.

First steps: a friendly and responsive handshake

One of the first capabilities developed with REEM-C for SocSMCs is a proper handshake: the humanoid grasps the other person’s hand with a controlled force and follows their movements.

The challenge is to create robots that work in human environments in a way that feels more natural to people. The robot should be able to understand the non-verbal cues that humans use all the time and react accordingly.

What is REEM-C learning now?

SocSMCs has more plans for our humanoid robot – such as taking a group selfie or helping someone move a piece of furniture without speaking. REEM-C is currently learning the steps to accomplish the latter, which are:

  1. See and recognize the object (the piece of furniture)
  2. Approach the object together with the person, using autonomous navigation
  3. Grasp the object and lift it in coordination with the person
  4. Move the object without being explicitly told where to go, “guessing” the human’s intentions from its sensors (see the sketch below)
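One common way for a robot to “guess” where its human partner wants to go is admittance control: the forces the person applies through the shared object are measured, for example with a wrist force/torque sensor, and turned into a compliant motion. The snippet below is a minimal illustrative sketch of that idea – not the actual SocSMCs controller – and the gains and interfaces are assumptions.

```python
import numpy as np

# Minimal admittance-control sketch (illustrative only, not the SocSMCs code):
# forces the partner applies through the shared object are mapped to a
# compliant velocity, so the robot "follows" the person's intention.
M = np.diag([8.0, 8.0, 8.0])     # virtual mass [kg] (assumed gain)
D = np.diag([25.0, 25.0, 25.0])  # virtual damping [Ns/m] (assumed gain)
dt = 0.01                        # control period [s]

def admittance_step(velocity, measured_force):
    """One Euler step of M * dv/dt + D * v = F_ext at the grasp point."""
    acceleration = np.linalg.solve(M, measured_force - D @ velocity)
    return velocity + acceleration * dt

# Example: the person pulls 10 N along x; the commanded velocity ramps up
# towards the compliant steady state F / D = 0.4 m/s in that direction.
v = np.zeros(3)
for _ in range(100):
    v = admittance_step(v, np.array([10.0, 0.0, 0.0]))
print("commanded end-effector velocity [m/s]:", np.round(v, 3))
```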

Studying these cases will help when applying similar developments to many other situations in which a robot needs to play an assistive, cooperative role in human-robot interaction.


ROSCon & IROS 2017: Towards a horizon of human-friendly robots

“Friendly people, friendly robots”. This is the theme of IROS 2017, held in Vancouver next week, reflecting that many stakeholders – PAL included – foresee a future with robots all around us, enhancing our daily life and helping us grow as humans.

TIAGo is joining our team traveling to Canada! Our endearing robot is so excited to meet everyone at this major robotics event! Find us at the IROS exhibition area and test TIAGo’s collaborative capabilities. Its features are specially designed to work hand in hand with humans and to assist in both industrial and domestic environments, always with a cute smile on its face!

Learn more about PAL Robotics’ philosophy and activities at the RSJ-IAC Lunch, with a presentation by the company’s CEO, Francesco Ferro (Sept. 25th – 12:30h). There you will have the opportunity to discuss the future of robotics with our team.

ROSCon 2017: Joining efforts to boost the robotics state of the art

We are fully convinced that integrating robotics into human environments can only be achieved through joint efforts, so that we do not have to reinvent the wheel every time. ROS is a robotics middleware that provides a common framework for people working in robotics. Our commitment to ROS as a common framework for the robotics community has led us to sponsor ROSCon 2017. We believe the conference will boost software development by letting everyone share their experience with the most common challenges in robotics and their approaches to finding creative and elegant solutions.

Our background in designing ROS-powered robots will be presented at ROSCon by PAL Robotics’ CTO, Luca Marchionni (Sept. 22nd – 14:30). All of our robots, from the small mobile bases to the human-sized bipeds, use the ROS framework, and our team constantly reviews and improves its status. One of the secrets to building our modular, flexible, configurable and testable robots is following some of the ROS guidelines.

The control software architecture, based on OROCOS and ros_control, will be presented together with the ros_controllers we’re currently using. We will focus, in particular, on our approach to Whole Body Control as an efficient redundancy-resolution controller that makes it possible to generate real-time motions on anthropomorphic robots.
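To give a rough idea of what redundancy resolution means – and this is only an illustrative numeric sketch, not PAL Robotics’ Whole Body Control implementation, which runs in C++ on top of OROCOS and ros_control – a primary task such as moving the hand can be tracked with the pseudoinverse of its Jacobian, while secondary objectives use whatever freedom is left in the nullspace:

```python
import numpy as np

def redundancy_resolution(J_task, task_vel, q_dot_secondary):
    """
    One step of task-priority differential kinematics: track the primary
    task velocity and use the remaining degrees of freedom (the nullspace
    of the task Jacobian) for a secondary objective, e.g. staying close
    to a comfortable posture.
    """
    J_pinv = np.linalg.pinv(J_task)                 # pseudoinverse of the task Jacobian
    N = np.eye(J_task.shape[1]) - J_pinv @ J_task   # nullspace projector
    return J_pinv @ task_vel + N @ q_dot_secondary

# Toy example: 3 joints, a 2D task (more joints than task dimensions = redundancy).
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.3]])
desired_task_vel = np.array([0.1, 0.0])            # move the hand along x
posture_vel = -0.5 * np.array([0.0, 0.2, -0.1])    # gently pull joints toward a reference posture
q_dot = redundancy_resolution(J, desired_task_vel, posture_vel)
print("joint velocity command:", np.round(q_dot, 4))
```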

For those of you attending either ROSCon or IROS 2017, our team will be happy to welcome you to our stand and show you what our robots are able to do.


The European Robotics League is coming to Barcelona!

Robots from all over Europe will descend on Barcelona for the European Robotics League’s Local Tournament, 20 – 24 November. It’s the first time the Spanish city has hosted the European Robotics League (ERL), and it will be held at PAL Robotics’ office.

We’re incredibly excited to announce that the call for participation is now open! The deadline for applications is 30 September, and successful teams will be announced on our website by mid-October.

Important dates:

  • 30 September 2017 – Deadline for application and request for travel support
  • 11 October 2017 – Qualification announcement
  • 20 November 2017 – Competition begins
  • 24 November 2017 – Competition ends

What is the ERL?

Made up of a series of Major and Local competitions, held at certified testbeds across Europe, the ERL is a key touchpoint in the robotics industry’s calendar. The Service Robots branch of the competition requires teams to program robots that can navigate a typical home environment and complete a set of task and functionality benchmarks.

Keep an eye on the Local Tournament’s website to see what our testbed will look like.

As a Platinum Sponsor of the ERL – Service Robots Competition, we have lent TIAGo Steel robots at just €650 a month to four lucky teams who applied to use the robot to participate in the competition. Take a look at our blog to see who the lucky winners were!

The TIAGo Steel is a collaborative robot with a mobile base, 14 degrees of freedom, 6kg payload arm and gripper. Sold for around €49,000, the robot is an ideal platform for research into sectors such as healthcare and assisted-living.

We will also provide these teams with a free TIAGo Steel to use during the Tournament. As a result, it is set to be one of the best attended Tournaments of the 2017/2018 ERL Season.

How to apply

Teams that want to apply must send an email to erl.service@robotics-league.eu with the following information:

  • Team name
  • Team leader’s name
  • Total number of team members (including the team leader)
  • Affiliation
  • Contact information (e-mail and phone number of the team leader)
  • Website
  • Qualification material (see below)

Applicants are required to submit the following qualification material (download the form):

  • Team description paper (max. 4 pages) explaining the team’s approach to the technical challenges, including a brief description of software and hardware architectures, the relation to the team’s research, and a list of relevant publications
  • Video showing evidence of some performance in the tasks and/or functionalities of the Tournament
  • Completed travel support request form, if needed

Teams will be selected for participation (and travel support) based on:

  • Quality of application paper and research, as measured from the TDP and list of relevant publications
  • Quality of test performed in simulation, if required (due to a high number of applications)
  • Available funds for travel support (maximum of 2K€ per team per year)
  • Financial needs (as declared by the team’s request for travel support)

For more information, please visit the Local Tournament’s website, or contact:

Carlos Vivas – erl@pal-robotics.com

 


The intern guide to SLAM part II: loop closure & new framework

Next up in our SLAM series is intern Tessa Pannen, who is studying for a Computational Engineering Science MA at the Technische Universität Berlin. As she comes to the end of her six-month internship, we quizzed Tessa about the successes and challenges of her new verification process for loop closure and her framework for SLAM.

This may be an impossible task, but how would you describe SLAM in less than 50 words?

SLAM – simultaneous localisation and mapping – is a technique robots use to record their surroundings using sensors. They are then able to draw a map and estimate their location within that map. Loop closure is needed to validate the robot’s estimated position in the map and correct it if necessary.

Can you tell us a bit more about the loop closure process?

Loop closure is the ability of the system to recognise a place the robot has previously visited. In visual SLAM, we use RGB cameras to collect information and evaluate whether a robot has “seen” a place before.

A camera integrated in the robot continuously takes pictures of its surroundings and stores them in a database, similar to a memory. Because a robot can’t “see” the way humans do, in order to compare the images it has to break them down into prominent features such as corners or lines. As their properties can be stored as binary descriptors, these features can be easily compared by an algorithm.

We then extract a limited number of promising candidate images from the database for the loop closure. These candidates are very likely to show the same location as the current image, because they contain a high number of features with the same or similar properties. We can then check which one most likely shows the place the robot currently “sees” by comparing several geometrical relations between the features of each candidate image and the current image.
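As a rough illustration of that pipeline – binary features, descriptor matching, then a geometric consistency check – here is a minimal OpenCV sketch. The feature type (ORB), the thresholds and the two-view fundamental-matrix check are assumptions made for the example, not necessarily what the project uses:

```python
import cv2
import numpy as np

def geometric_consistency(current_img, candidate_img, min_inliers=25):
    """Rough loop-closure check: match binary features between two images and
    verify them with an epipolar-geometry (fundamental matrix) RANSAC fit."""
    orb = cv2.ORB_create(nfeatures=1000)          # binary (ORB) features: corners + descriptors
    kp1, des1 = orb.detectAndCompute(current_img, None)
    kp2, des2 = orb.detectAndCompute(candidate_img, None)
    if des1 is None or des2 is None:
        return False

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if len(matches) < min_inliers:
        return False

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Keep only matches consistent with a single camera motion between the views.
    _, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
    return mask is not None and int(mask.sum()) >= min_inliers

# Usage: the images would come from the robot's RGB camera / keyframe database.
# is_loop = geometric_consistency(cv2.imread("current.png", 0), cv2.imread("candidate.png", 0))
```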

Take a look at our video if you’re wondering what loop closure looks like.

How does your work differ from existing research?

In the first part of my internship, I tried a different approach to the candidate geometric check by comparing the features of three images instead of two. This basically means adding another dimension to the geometric relation between the images, so a 2D matrix becomes three-dimensional. Part of my research involved finding a stable way to compute this transformation using only the matched feature points in the images.


In the second part of the internship, I joined my supervisor in working on a SLAM framework that we hope will manage all the tasks of SLAM by the time it’s finished, namely mapping, localisation and loop closure. I’d never contributed to such a complex system before, so it was incredibly exciting.

For this task, I implemented several new ways of sourcing a list of loop closure candidates, based on the likelihood of the current and all previous positions instead of feature comparison – a method known as the Nearest Neighbor Filter.
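The snippet below sketches that idea in its simplest form – it is only a toy example with assumed names and a Gaussian position model, not the project’s implementation: previously visited poses are ranked by how likely they are given the robot’s current pose estimate and its uncertainty, and the nearest ones become loop-closure candidates.

```python
import numpy as np

def nearest_neighbour_candidates(current_pose, current_cov, past_poses, k=5):
    """
    Rank previously visited poses by Mahalanobis distance to the current
    pose estimate and return the indices of the k most likely loop-closure
    candidates. past_poses is an (N, 2) array of earlier x/y positions.
    """
    cov_inv = np.linalg.inv(current_cov)
    diffs = past_poses - current_pose
    d2 = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)  # squared Mahalanobis distances
    return np.argsort(d2)[:k]

# Toy usage: the robot thinks it is near (2, 1) with some positional uncertainty.
poses = np.array([[0.0, 0.0], [2.1, 0.9], [5.0, 5.0], [1.8, 1.2]])
idx = nearest_neighbour_candidates(np.array([2.0, 1.0]),
                                   np.diag([0.2, 0.2]), poses, k=2)
print("candidate keyframes:", idx)
```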

What are the next steps in your research?

Tests, tests, and more tests! The only way we’ll know how the new approaches influence the whole loop closure process is by running rigorous tests.

I know you can’t give away too much, but are the results in line with what you expected?

The computation of the transformation works well for simple test images like the one above. We don’t yet know if it’s robust enough to support the loop closure, since we still need to run the tests. It might improve its performance, it might not. But doing research is about trying new things, and there’s no guarantee of success. Keep an eye on the blog – hopefully we will be able to publish another exciting new video!

What have been the highlights so far?

I love the satisfaction you get each time you reach another milestone in your project and see your code slowly developing and actually working. I hope we can make the loop closure run on a real robot by the end of my internship – that’s the big highlight I’m looking forward to.

Visualization of the loop closure process is important in order to find bugs in the code. But when it finally works, it’s hugely exciting to see a simulated robot driving in a loop, highlighting the areas it recognises, and I think: “wow, it’s actually working! I did this!”

What do you think has been the biggest challenge?

Because I wasn’t familiar with any concepts of computer vision, I had to brush up on my maths skills, especially geometry! There was a lot of dry theory I had to absorb in the first few weeks. I also needed to familiarise myself with the existing code of the SLAM project and understand what’s happening in the bigger picture. It can be challenging to dive into a big project without losing the overview.

How do you think what you’ve learned so far will help in the final year of your degree?

I’ve definitely learned to organise myself better. I now plan what I want to do carefully before I write a single line of code. I also take more notes, which makes it easier to reconstruct my own thoughts and helps me stay on track and not get lost in side tasks. Most importantly, I’ve learned to recognise when I need help and to ask for it. Sometimes when you’re stuck, you have to swallow your pride and ask for help if you don’t want to lose more time on a problem.

Here’s a video of the loop closure process:


TIAGo Steel loans for European Robotics League: winners announced!

Back in May, we announced our intention to lend TIAGo Steel robots to teams who wanted to enter the European Robotics League – Service Robots competition (ERL).

We had a fantastic number of participants, and are thrilled to see so many researchers, students and companies excited to test their software applications on our robot.

Drum roll, please…

Unfortunately, there can only be… four winners! Congratulations to the four lucky teams who have been successful in their bids to borrow TIAGo Steel robots.

Particular congratulations to Team Homer from the University of Koblenz – Landau, who submitted an excellent application that demonstrated both their breadth of experience and the extent to which they would benefit from competing in the tournament. As a result of their incredibly thorough application paper, prior ERL experience and clear passion for their academic specialisms, they have been awarded the free robot.

All robots are available to rent for up to 12 months, at just €650 a month (plus tax). Conditions of the loan include mandatory participation in at least one ERL – Service Robots tournament and a minimum loan contract of three months.

The TIAGo Steel robots are now ready for collection.

Meet the teams

Institut de Robòtica i Informàtica Industrial, CSIC-UPC.

A strong interdisciplinary team from the Institut de Robòtica i Informàtica Industrial, led by Sergi Foix Salmerón, Guillem Alenyà Ribas and Juan Andrade Cetto, has more than 10 years’ robotics experience. Made up of researchers and students from the institute’s Mobile Robotics & Perception and Manipulation groups, the team’s specialties include localisation, mapping, planning and learning algorithms, and computer vision applied to human-robot interaction.

They’re looking forward to the challenge of integrating and testing the different algorithms created by both groups on a single platform (TIAGo), and hope that participation in the tournament will allow them to push the boundaries of social robotics.

A consortium between the Disruptive Hub of everis-NTT and the Polytechnic University of Catalonia (UPC)

The application from a consortium of Everis employees and students from UPC’s Robotics Master Program combined the fresh outlook of students from academia with the experience of a company with more than 100K employees.

The students from UPC have a range of specialties, including automatic control systems, computer vision, robotics computer technology and communication in processing elements, that stand them in good stead to compete in the ERL. They will be led by Dr Cecilio Angulo from UPC and Dr Jordi Albo-Canals from everis-NTT, who both have previous experience participating at Robocup at Home.

Universidad Carlos Tercero de Madrid

The RoboticsLab research group, together with the Student Robotics Association (ASROB), from the Universidad Carlos Tercero de Madrid, are the third team to be loaned a TIAGo Steel robot.

They will be led by Dr. Juan González Victores and Dr. Santiago Martínez de la Casa, who have expertise in areas ranging from SLAM and 3D vision processing to grasping and manipulation planning.

The University of Koblenz – Landau

A passionate group of students who have competed in multiple Robocup and ERL competitions over the past decade, Team Homer from the University of Koblenz – Landau demonstrated the experience, innovation and passion required to receive the free loan of a TIAGo Steel robot.

The team was set up to offer students the opportunity to learn about visual computing and computer science through practical participation in competitions like the ERL.

The TIAGo Steel robot

The TIAGo Steel robot is one iteration of our mobile manipulator TIAGo. The robot has a mobile base with a combination of lasers, sonars and actuated motors, a maximum speed of 1 m/s and batteries that provide up to 10 hours of autonomy.

The Steel version of TIAGo has a parallel gripper that can be swapped for a five-finger hand or a force-torque sensor. The arm has a workspace of 86 cm and a maximum payload of 3 kg without the end-effector.

With 12 degrees of freedom, a lifting torso (35cm) and 100% ROS-compatible open-source software, TIAGo is a versatile robot with modular capabilities that make it perfect for research into healthcare, assisted living or light industry.

ERL in Barcelona

As Platinum Sponsors of the European Robotics League – Service Robots competition, we are hosting one of the local tournaments here in sunny Barcelona! Keep an eye on our blog for further details, but what we can reveal at the moment is:

Date: 20-27 November

Location: PAL Robotics office (10 minutes from the beach!)

If you have any questions about TIAGo, the ERL Local Tournament in Barcelona – or anything else! – send us an email: business@pal-robotics.com


Behind the scenes of Robocup: hacking TIAGo

If you saw the modified TIAGo that competed at Robocup 2017 in Japan, you probably noticed that the robot looked a little different. That’s because it was modified by the Technical University of Munich, specifically for the Robocup@Home Open Platform League, and aptly renamed TIAGo@Home.

Keep reading to find out how the Alle@Home team hacked our robot.


Directional microphone

Although TIAGo comes equipped with a stereo microphone, the Alle@Home team added a large directional microphone on top of the robot’s head in order to pick up sound and enhance the robot’s speech recognition capabilities. Robocup can be pretty noisy, and the team wanted to make sure their robot could hear and respond to commands.

New computational capabilities

Our TIAGo robots always come with a laptop tray, but it’s up to clients what they do with it! Whether they use it to programme the robot, expand their world on Minecraft or rest a much-needed coffee – we’re not here to judge…

The Alle@Home team used the laptop tray of TIAGo@Home to add a new computer that enhanced the robot’s computational capabilities. In a (technical) nutshell, they used deep learning algorithms running over CUDA on an NVIDIA GPU to detect people and objects moving around the competition arena.
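The post does not say which network the team ran, so the following is only a generic sketch of that pattern – a pretrained object detector executed on a CUDA GPU with PyTorch/torchvision – rather than the Alle@Home pipeline:

```python
import torch
import torchvision

# Illustrative only: the network, thresholds and input source are assumptions,
# shown here to convey the general "pretrained detector on a CUDA GPU" pattern.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval().to(device)

def detect(image_tensor, score_threshold=0.7):
    """Return boxes/labels/scores for one RGB image tensor (C, H, W) in [0, 1]."""
    with torch.no_grad():
        out = model([image_tensor.to(device)])[0]
    keep = out["scores"] > score_threshold
    return out["boxes"][keep], out["labels"][keep], out["scores"][keep]

# Usage: the image tensor could come from the robot's RGB-D camera stream.
# boxes, labels, scores = detect(torchvision.io.read_image("frame.png") / 255.0)
```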

Artificial Skin

The team also added Artificial Skin – technology developed by scientists at the Technical University of Munich – to TIAGo@Home’s gripper.

The Artificial Skin was created to increase robots’ multi-modal sensitivity, so that they can better understand and interpret their environment. Comprised of a large number of small hexagonal boards, the skin features energy-saving microprocessors and sensors to detect changes in speed, temperature and touch.

The skin boosted TIAGo@Home’s sensory capabilities, for example by helping the robot to sense proximity to other objects in three-dimensional space. These capabilities were crucial at Robocup, as they helped the robot detect the proximity of objects in grasping tasks and finely command the gripper in order to grasp and place them effectively.
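As an illustration of how proximity readings can refine grasping – purely a sketch with hypothetical callbacks, not the actual skin or gripper API – a control loop might slow the fingers down as the skin reports a nearby object and stop just before contact:

```python
def close_gripper_with_proximity(read_proximity, command_speed,
                                 slow_dist=0.05, stop_dist=0.005,
                                 max_speed=0.04):
    """
    Illustrative loop: slow the gripper down as the skin cells report a
    nearby object, and stop just before contact. `read_proximity` and
    `command_speed` are hypothetical callbacks, not a real skin API.
    """
    while True:
        distance = read_proximity()          # closest distance reported by the skin cells [m]
        if distance <= stop_dist:
            command_speed(0.0)               # contact imminent: hold position
            break
        # Scale the closing speed linearly between stop_dist and slow_dist.
        scale = min(1.0, (distance - stop_dist) / (slow_dist - stop_dist))
        command_speed(max_speed * scale)
```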

Customized 3D printed gripper

The specific challenges facing the Alle@Home team called for specific hardware hacks.

In order to grasp and move target objects, the team decided to 3D print new gripper fingers. Although TIAGo comes with an interchangeable five-finger hand or parallel gripper, the new gripper enabled TIAGo@Home to move the objects with more accuracy and precision in the specific Robocup@Home tasks.

USB hub

In order to connect the new microphone to the robot, the Alle@Home team added a USB hub. The hub is powered externally by TIAGo’s expansion panel, which also provides a 12V connector delivering 5A, a CAN service port and Ethernet ports.

A world of no limits

TIAGo is an open platform with a huge array of possibilities for expansion. The robot’s modularity and flexibility, as well as its 100% ROS-compatible open-source software, mean that there are virtually no limits to the applications you can incorporate.

The Alle@Home team added software and hardware hacks to suit their own requirements, and successfully made it through the first rounds of Robocup. Congratulations, Alle@Home!

How would you hack your TIAGo? Let us know: tiago@pal-robotics.com

If you can’t afford your own robot, fear not. Have a play on our free simulation instead.

 


Humanoid robots, imitation learning and torque control

This week, we interview Ales Ude, Head of the Department of Automatics, Biocybernetics and Robotics at the Jozef Stefan Institute in Slovenia. As founder of the department’s Humanoid and Cognitive Research Lab, he has a particular interest in imitation learning – the way robots interact with and learn from humans.

In Ales’ words, “Imitation learning is not just about a robot repeating what it sees from humans, but extrapolating from this knowledge in order to generate new behaviours. We’re collecting libraries of human movements, transferring these to humanoid robots and generating new behaviours based on this data.”
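To make that idea concrete – and this is a deliberately crude sketch, not the lab’s actual methods – one of the simplest ways to “extrapolate” from recorded demonstrations is to blend several of them, weighted by how close their goals are to a new target, so the robot produces a motion it has never actually been shown:

```python
import numpy as np

def generate_motion(demos, goals, new_goal):
    """
    Blend recorded demonstration trajectories to produce a new one.
    demos: list of (T, D) arrays of positions over time (all same shape).
    goals: (N, D) array with the final target of each demonstration.
    new_goal: (D,) target the robot has never been shown.
    """
    # Weight each demonstration by how close its goal is to the new target.
    dists = np.linalg.norm(goals - new_goal, axis=1)
    weights = np.exp(-dists**2 / 0.05)
    weights /= weights.sum()
    # Weighted average of the demonstrations, then shift to end at the new goal.
    blended = sum(w * d for w, d in zip(weights, demos))
    return blended + (new_goal - blended[-1])

# Toy usage: two 1-D demonstrated reaches towards 0.3 m and 0.5 m,
# generalised to a goal at 0.4 m that was never demonstrated.
t = np.linspace(0.0, 1.0, 50)[:, None]
demos = [0.3 * t, 0.5 * t]
goals = np.array([[0.3], [0.5]])
trajectory = generate_motion(demos, goals, np.array([0.4]))
print("generated end point:", trajectory[-1])
```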

Ales is currently working with our humanoid biped, TALOS. Launched in April 2017, the robot’s torque sensors, electrical power, 32 degrees of freedom and walking speed of 3 km/h make it one of the most advanced biped robots in the world.

Watch our interview to find out why Ales chose to work with TALOS, the highlights of his lab’s research over the last few years and how he thinks robotics will change our lives in the future.

 
