Behind the scenes of RoboCup: hacking TIAGo

If you saw the modified TIAGo that competed at RoboCup 2017 in Japan, you probably noticed that the robot looked a little different. That’s because it was modified by the Technical University of Munich, specifically for the RoboCup@Home Open Platform League, and aptly renamed TIAGo@Home.

Keep reading to find out how the Alle@Home team hacked our robot.


Directional microphone

Although TIAGo comes equipped with a stereo microphone, the Alle@Home team added a large directional microphone on top of the robot’s head in order to pick up sound and enhance the robot’s speech recognition capabilities. RoboCup can be pretty noisy, and the team wanted to make sure their robot could hear and respond to commands.

New computational capabilities

Our TIAGo robots always come with a laptop tray, but it’s up to clients what they do with it! Whether they use it to programme the robot, expand their world on Minecraft or rest a much-needed coffee – we’re not here to judge…

The Alle@Home team used the laptop tray of TIAGo@Home to add a new computer that enhanced the robot’s computational capabilities. In a (technical) nutshell, they ran deep learning algorithms over CUDA on an NVIDIA GPU to detect people and objects moving around the competition arena.

Artificial Skin

The team also added Artificial Skin – technology developed by scientists at the Technical University of Munich – to TIAGo@Home’s gripper.

The Artificial Skin was created to increase robots’ multi-modal sensitivity, so that they can better understand and interpret their environment. Comprised of a large number of small hexagonal boards, the skin features energy-saving microprocessors and sensors to detect changes in speed, temperature and touch.

The skin boosted TIAGo@Home’s sensory capabilities, for example by helping the robot to sense proximity to other objects in three-dimensional space. These capabilities were crucial at RoboCup, as they helped the robot detect the proximity of objects in grasping tasks and finely control the gripper in order to grasp and place them effectively.

Customized 3D printed gripper

The specific challenges facing the Alle@Home team called for specific hardware hacks.

In order to grasp and move target objects, the team decided to 3D print new gripper fingers. Although TIAGo comes with an interchangeable five-finger hand or parallel gripper, the new gripper enabled TIAGo@Home to move the objects with more accuracy and precision in the specific RoboCup@Home tasks.

USB hub

In order to connect the new microphone to the robot, the Alle@Home team added a USB hub. The hub is powered externally by TIAGo’s expansion panel, which also provides a 12V connector supplying 5A, a CAN service port and Ethernet ports.

A world of no limits

TIAGo is an open platform with a huge array of possibilities for expansion. The robot’s modularity and flexibility, as well as its 100% ROS-compatible open-source software, mean that there are virtually no limits to the applications you can incorporate.

The Alle@Home team added software and hardware hacks to suit their own requirements, and successfully made it through the first rounds of RoboCup. Congratulations, Alle@Home!

How would you hack your TIAGo? Let us know:

If you can’t afford your own robot, fear not. Have a play on our free simulation instead.



Humanoid robots, imitation learning and torque control

This week, we interview Head of the Department of Automatics, Biocybernetics and Robotics at the Jozef Stefan Institute in Slovenia, Ales Ude. As founder of the department’s Humanoid and Cognitive Research Lab, he has a particular interest in imitation learning – the way robots interact with and learn from humans.

In Ales’ words, “Imitation learning is not just about a robot repeating what it sees from humans, but extrapolating from this knowledge in order to generate new behaviours. We’re collecting libraries of human movements, transferring these to humanoid robots and generating new behaviours based on this data.”

Ales is currently working with our humanoid biped, TALOS. Launched in April 2017, TALOS boasts torque sensors, fully electrical actuation, 32 degrees of freedom and a walking speed of 3 km/h, making it one of the most advanced biped robots in the world.

Watch our interview to find out why Ales chose to work with TALOS, the highlights of his lab’s research over the last few years and how he thinks robotics will change our lives in the future.



2017: The year of the stock-taking robot

In the face of growing competition from e-commerce, brick-and-mortar stores are having to innovate. With lower costs, speed and ease-of-use on their side, online retailers are in prime position to take the lead in the retail world. So, what are stores doing about it?

Early adopters of automation and AI (think computerized check-outs and online product recommendations), retailers have been quick to incorporate new technological innovations into their existing systems and processes.

From customer service to inventory-taking, we take a look at some of the ways robots are helping to extend the shelf life of shops.

The retail robot

In warehouses around the world, robots are already hard at work, busy delivering heavy packages faster and more easily than their human counterparts.

To take just one example, last year Target added autonomous robots created by Symbotic to one of its biggest U.S. distribution centres. Using ledges to travel up and down aisles, the robots can move and track cases autonomously, helping Target retrieve, record and restock items more quickly.

Robots have also been employed in delivery. Amazon famously made their first drone delivery (of a TV streaming stick and bag of popcorn) in the UK in December 2016, and it’s widely recognised that drones will play a key part in retail logistics in the future.

But robots are also making their presence felt in stores. In an increasingly fast-paced, digital world, the quality of customer service can make or break a retail brand. As a result, retailers are turning to robots to assist customers in shops, harnessing big data to provide real-time product information and even offer personalised recommendations.

From welcoming customers and directing them to a requested item to offering a checkout-free experience, the potential for service robots at all stages of the physical retail cycle is endless.

Time to start taking stock

Another area of retail that’s embracing automation is stock control and inventory. Over the last year, new RFID technology and a growing need to streamline and automate inventorying has pushed the issue to the fore – and PAL Robotics’ StockBot may be the solution.

Over-stretched retail staff struggle to find the time to perform regular inventories, particularly in chains with multiple stores spread across countries or even continents. This leaves retailers in the precarious position of not knowing exactly what they have in each store – or whereabouts in the store their products are.

According to a report by Supply Chain Digest, the out-of-stock rate experienced by consumers is 17.8% – that means close to 18 customers out of 100 are leaving stores unsatisfied. Keeping track of stock can be a major headache for retailers, but luckily it’s an issue robotic platforms like StockBot are ready and waiting to assist with.

Benefits of automated inventory

The benefits of automation are far-reaching. Employing a robot to undertake inventory not only streamlines and speeds up the process, it also results in a more accurate picture of misplaced, lost and stolen items – and frees up staff to attend to customers.

In a pilot undertaken in a 1,500 m² store, StockBot reduced the time it took to perform a full inventory by 80% (from five hours to one hour), with an accuracy of 99.10%. Thanks to RFID sensors, an autonomous mobile base and a pioneering combination of lasers and cameras, StockBot is cheaper, faster and more accurate than a human performing the same task.

Importantly, automated inventory also means automated data capture. Collecting data on a daily basis has the potential to transform a business. Big data can provide retailers with a better understanding of shoppers’ habits and trends, as well as enabling data-driven strategies and more accurate long-term forecasts.

With 12 hours of continuous battery life and fully autonomous operation, StockBot can be deployed overnight so that retailers wake up to a full report – including the items’ positions – in their inbox before 9am. Imagine waking up to coffee, croissants and captured data – the perfect combination.

StockBot at the ready

Robots have the potential to revolutionise all areas of retail, and have been doing so for the last few years. From customer service to logistics, robotic platforms have been transforming the way retailers sell their products. Now, it’s the turn of the stock-taking bot.

If you’re interested in finding out how StockBot could help your business, we’d love to hear from you. Send an email to


The definitive intern guide to SLAM: place recognition & loop closure

SLAM – simultaneous localisation and mapping – is a technique robots use to build a map of their environment, ascertain where they are within the map and assess whether they’ve been to a particular location before. It’s an incredibly important technique within navigation research – so important, in fact, that we currently have a team focusing on the topic.

We’ve interviewed two PAL Robotics interns, who are spending their six-month placements helping to develop and test new approaches to SLAM. First up is Elena Rampone, a Robotics Master’s student from the University of Genoa (Italy), who has kindly volunteered to answer our questions and explain how her research into loop closure and place recognition may help a robot recognise its environment.

This may be an impossible task, but how would you describe SLAM in less than 50 words?

A robot navigating in an environment needs to keep track of its movements by estimating its current position, building an internal map of the environment and recognising if it has already visited a particular place. This is SLAM. The estimations are performed by analysing data acquired by the robot’s sensors.

What exactly are you working on?

My project can be broken down into two steps. The first consists of adapting the company’s place-recognition framework, based on the Bag of Words algorithm, to 3D point clouds. Point clouds provide a 3D depiction of the world and are characterised by a compact descriptor, which allows us to efficiently compare them.

The whole system is ROS-based and tested using the KITTI dataset, a widely-used outdoor dataset of point clouds acquired by a Velodyne sensor.


[Image: a point cloud]

When the robot moves, the descriptors characterising new observations of the environment are inserted into a database that stores information about all the places the robot has visited. The system continuously checks whether a newly observed cloud is similar to a previous one by comparing its descriptor with those present in the database and then selecting the corresponding clouds.
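As a rough illustration of this lookup, here is a minimal sketch in Python. The descriptor length, the cosine-similarity measure and the 0.9 threshold are all illustrative assumptions, not the actual framework (which builds Bag-of-Words descriptors from point cloud features):

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two descriptor vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class PlaceDatabase:
    def __init__(self, threshold=0.9):
        self.descriptors = []   # one compact descriptor per visited place
        self.threshold = threshold

    def query_and_insert(self, descriptor):
        """Return indices of candidate loop-closure places, then store the
        new descriptor so future observations can match against it."""
        candidates = [i for i, d in enumerate(self.descriptors)
                      if cosine_similarity(descriptor, d) >= self.threshold]
        self.descriptors.append(descriptor)
        return candidates

db = PlaceDatabase(threshold=0.9)
db.query_and_insert(np.array([1.0, 0.0, 0.0]))          # place A
db.query_and_insert(np.array([0.0, 1.0, 0.0]))          # place B
print(db.query_and_insert(np.array([0.9, 0.1, 0.0])))   # revisit near A → [0]
```

The candidate indices returned here would then be handed to the geometric verification step described below.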


[Image: loop closure]

After the system selects the candidate clouds, it is necessary to verify whether the robot has returned to a previously visited location and, if so, estimate the relative transformation between the two clouds in order to update the position of the robot (loop closure).

The second step involves using the Iterative Closest Point (ICP) algorithm – a method that iteratively matches points between two clouds in order to find the relative transformation that minimizes an error measure. Any candidate clouds that correctly converge with ICP are selected as correct loop closures. The estimated pose will then be used to update the robot’s position.
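The ICP step can be sketched in a few lines of NumPy. This is a generic, minimal version (brute-force nearest neighbours, SVD/Kabsch transform estimation) run on a toy cloud, not the optimised implementation used in the project:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch/SVD method); src and dst are matched (N, 3) point sets."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(src, dst, iters=30):
    """Iteratively match each src point to its nearest dst point and
    re-estimate the rigid transform, as described above."""
    cur = src.copy()
    for _ in range(iters):
        dists = np.linalg.norm(cur[:, None] - dst[None, :], axis=2)
        matched = dst[dists.argmin(axis=1)]   # brute-force NN, for clarity
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return best_rigid_transform(src, cur)     # composite transform

# Toy check: recover a known small rotation about z plus a translation.
rng = np.random.default_rng(0)
cloud = rng.normal(size=(100, 3))
angle = 0.05
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
R_est, t_est = icp(cloud, cloud @ Rz.T + np.array([0.1, -0.05, 0.0]))
```

In a real pipeline the residual error after convergence would decide whether the candidate is accepted as a loop closure.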

What are the next steps in your research?

I’ve already been interning at PAL Robotics for five months, so I’m coming to the end of my project. For the final few weeks I plan to keep testing system performance by using different methods to extract point cloud features, in order to identify the one that works best. I’m testing both the place-recognition and loop closure elements to ensure the end result is robust and – most importantly – useful for the PAL Robotics team.

I know you can’t give too much away, but are the results in line with what you expected?

With the help of open source libraries, I was able to reach a suitable trade-off between recognition accuracy and execution speed by comparing many different configurations in a semi-automated fashion. The results will allow me to deploy the solution on a TIAGo robot for real-world testing.

What have been the highlights so far?

My first major highlight was when I tested the ICP algorithm on a sequence of clouds and saw that it was correctly estimating the relative displacement between them and could also align them. But the most important moment was when the place-recognition system started working, as it was a huge milestone in my internship.


ICRA 2017: reflections on the world’s largest robotics event

[Image: TIAGo & Edgar selfie]

The PAL Robotics team have just returned from yet another fantastically organised ICRA – the IEEE’s International Conference on Robotics and Automation. This year, it was held at the Marina Bay Sands Expo and Convention Center in Singapore.

It’s hard to find an event that matches ICRA in terms of sheer size and scale. With 11 tracks, 900 paper presentations and over 40 exhibitors, it is a truly mammoth conference that mirrors the increasingly widespread interest in robotics. Hundreds of researchers from around the world gathered to discuss the latest developments, technologies and research that will enable robots to become a significant and functioning part of our everyday lives.

A PAL Robotics presentation


This year, one of our PhD students, Jeremie Deray, presented a paper outlining a new method for loop closure – a technique that enables robots to recognise places they’ve already been, in order to localise themselves in the environment and improve the quality of their map.

Jeremie, along with fellow students from the Institut de Robotica, explained that their research focused on applying an approach typically used to recognise images in computer vision to robots’ 2D laser readings. They exploited the graph topology of the environment by enforcing neighbouring place constraints, which in turn increased performance and recognition rate. Jeremie and the team proved that their proposed method is comparable to or better than existing algorithms.
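The neighbouring-place idea can be sketched as follows: a candidate match between two places is accepted only if the adjacent places along the two trajectories also look alike. The similarity matrix and threshold below are invented for illustration; the paper’s actual formulation over 2D laser readings is more involved:

```python
import numpy as np

def filter_with_neighbours(sim, threshold=0.7):
    """Given sim[i, j] = appearance similarity between scan i of the
    current run and scan j of the map, keep a match (i, j) only when the
    preceding pair (i-1, j-1) along both trajectories is also similar."""
    matches = []
    n, m = sim.shape
    for i in range(1, n):
        for j in range(1, m):
            if sim[i, j] >= threshold and sim[i - 1, j - 1] >= threshold:
                matches.append((i, j))
    return matches

# Invented similarity matrix: the diagonal (a consistent revisit) survives,
# while the isolated off-diagonal lookalikes (perceptual aliasing) do not.
sim = np.array([
    [0.9, 0.1, 0.8],
    [0.2, 0.9, 0.1],
    [0.8, 0.1, 0.9],
])
print(filter_with_neighbours(sim))   # → [(1, 1), (2, 2)]
```

The corner entries of 0.8 look similar in isolation but are rejected because their neighbours do not agree – which is exactly how the topology constraint raises the recognition rate.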

It was a proud moment for the PAL Robotics team – and, of course, for Jeremie!

Our European projects

We wouldn’t be able to attend events like ICRA without the support we receive from the European Union. We are involved in a wide range of EU-funded projects, working with universities and companies across the continent in order to research how robots can improve our quality of life.

These range from GrowMeUp and EnrichMe, which are researching how robots can help older people maintain active, independent lives for longer, to Factory in a Day, which is looking at ways to improve the competitiveness of European SMEs by decreasing robot installation time and cost. We are also involved in a European project called Co4Robots, which aims to identify methods by which robots can communicate in order to complete complex tasks.

Keynote highlights

With so many presentations, keynote talks and plenary sessions, it was impossible to attend them all – let alone pick a favourite. Aside from Jeremie’s presentation (of course!), the keynote speech by Katja Mombaur on the difficulties of developing humanoid biped robots and the plenary session by Bill Huang on autonomous navigation in trucks were both outstanding.

We also attended workshops on perception, planning and control for legged robots, and participated in discussions about the growing use of event-based cameras in robotics. As we have two biped robots, TALOS and REEM-C, understanding how to improve the way they navigate around large human environments is crucial. Biped robots need to know where they are and be able to localise themselves in their environment, as well as maintain their balance while moving over and around obstacles.

As Katja Mombaur explained in her keynote presentation, developing biped robots that can adapt to our existing environments is an incredibly difficult task – and it’s one that keeps the PAL Robotics team very busy!


TIAGo to the rescue: how robots can help an ageing society


This May, for a period of three weeks, we’ve had the pleasure of hosting two fellow roboticists, Guido van der Hart and Nicky Mol, from one of our partner companies based in the Netherlands: Heemskerk Innovative Technology (HIT).

Helping maintain independence

Over at HIT, Guido and Nicky keep themselves busy by trying to anticipate some of the problems that will arise as a result of the ageing of Western societies. One of the issues they foresee is that as our populations age, there will be fewer healthcare workers to care for a larger group of patients.

Deploying service robots, like TIAGo, that can perform simple manipulation, cleaning or logistics tasks can alleviate the workload of care professionals, allowing them to focus on their primary task of care-giving. Importantly, this may also allow patients to maintain a greater degree of autonomy and independence in their day-to-day lives, leading to reduced frequency and duration of hospitalization.

Working towards semi-automation

Due to the nature of the environments the robots have to operate in, achieving full automation in the healthcare industry is incredibly complex. In order to accelerate the deployment of robots in hospitals and homes, HIT is therefore working on the near-term implementation of semi-autonomous robots.

To be classed as semi-autonomous, robots have to be aware of their capabilities and, in particular, of the actions they’re not able to perform without assistance. When autonomous capabilities fall short, the robot would need to inform an operator located at a global control centre. It is envisioned that in such a control centre, multiple operators could be monitoring and assisting hundreds of robots all around the world.

Enabling remote control

In order to work towards this goal, an early prototype of TIAGo was shipped to the Netherlands, where a team of HIT engineers have been developing a cockpit interface to remotely control the robot online, from anywhere in the world. The recent addition of a screen and microphone to the prototype will enable an operator to communicate with patients at the remote location.

Because TIAGo is equipped with a force/torque sensor on its wrist, the forces that are encountered upon interaction with the remote environment can be presented to the operator through a joystick that is able to display forces. In this way, the operator can actually feel what they are doing, which is incredibly useful when performing tasks that require contact with the environment.

Navigating time delays

One of the main technological challenges the team encountered was keeping the system stable in the presence of the time delays that occur when controlling the robot online. When the robot makes contact with its environment, a chain reaction of delayed impact forces causes the operator to bounce off the environment. Not only does this make the robot very hard to control, but it can also cause damage both to the robot and the objects around it.
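The effect is easy to reproduce in a toy simulation: a mass pressed against a stiff virtual spring stays bounded when the contact force is felt immediately, but oscillates with growing amplitude once that force arrives late. All the constants below are illustrative, not parameters of TIAGo or HIT’s controller:

```python
import numpy as np

def simulate(delay_s, k=100.0, c=1.0, dt=0.001, t_end=5.0):
    """Peak |position| of a unit mass pressed into a stiff virtual wall,
    where the wall's spring force is fed back after `delay_s` seconds."""
    steps = int(t_end / dt)
    lag = int(delay_s / dt)
    x = np.zeros(steps + 1)
    x[0] = 0.1                       # initial penetration into the wall
    v = 0.0
    for i in range(steps):
        x_delayed = x[max(i - lag, 0)]
        a = -k * x_delayed - c * v   # delayed stiffness + local damping
        v += a * dt                  # semi-implicit Euler integration
        x[i + 1] = x[i] + v * dt
    return float(np.max(np.abs(x)))

stable = simulate(delay_s=0.0)     # force felt immediately: motion decays
unstable = simulate(delay_s=0.1)   # delayed force pumps energy in: it grows
print(stable, unstable)
```

With zero delay the damping term removes energy and the peak stays at the initial penetration; with a 100 ms round trip the delayed spring force arrives out of phase and behaves like negative damping – the “bouncing off the environment” described above.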

During the time they spent working with the prototype of TIAGo, the team at HIT managed to implement a clever control method to resolve these challenges. We will go into more detail in an upcoming blog post, so stay tuned!


During their stay, Guido and Nicky were able to get their cockpit operational on the newest version of TIAGo. They also managed to perform a series of manipulation tasks while remotely operating the robot.

Having successfully demonstrated their work and traded knowledge with our software developers and engineers, they will now return to the Netherlands to continue developing their technology. Maybe, one day, there will be TIAGos helping people in need all around the world…


ERL tournament: TIAGo Steel robots for loan

We’re incredibly excited to announce the call for the loan of three TIAGo Steel robots to teams who want to enter the ERL Service Robots 2017 tournament. We will also give one additional robot for free to the team we believe demonstrates exceptional expertise and innovation in their application paper and simulation test.

The deadline for applications is 30 June, and the teams granted loan equipment will be announced on PAL Robotics’ website by the end of the month. The robots will be available for collection by mid-July 2017 and will cost €650 per month, plus tax.

If you have any questions about the application process, terms and conditions or capabilities of TIAGo, please email:

What is the ERL Service Robots tournament?

The ERL Service Robots tournament is one of three tournaments organized by the European Robotics League (ERL) to address the societal challenge of an ageing population, strengthen the European robotics industry and push the state of the art in autonomous systems for emergency response.

The tournament is made up of a series of competitions that take place across Europe. One competition will be organized by us, held in Barcelona during European Robotics Week, 20-26 November 2017. If any of the teams would like to compete in this tournament, we will provide a free TIAGo for them to use during the competition.

What is the application process?

Potential teams must send an application paper to (PDF preferred) by 30 June 2017. The paper must contain the following:

  • Team leader’s contact information. For university teams, contact information of an academic supervisor should also be provided.
  • CVs (max. 1 page) for each team member.
  • Financial Summary (max. 1 page on income and expenditure).
  • Executive Summary (max. 1 page).
  • Description of the motivations leading the team to ask for the loan.
  • Description of the team’s specialties and expertise (max. 2 pages; past achievements, research projects, etc.), and/or videos of the team’s achievements (using real robots or simulators).
  • Details of which scenarios the team plans to participate in, how they will complete the different tasks and the intended algorithms that will be used.

What are the judging criteria?

The teams will be selected based on the quality of their application paper (quality and degree of innovation of the proposed algorithms, expertise of the team, etc.). Teams must demonstrate minimum experience with the ROS middleware, and might also be asked to complete a simple simulation test with TIAGo.

We especially encourage teams with previous experience in robotics to apply. These teams will have to prove their capabilities in autonomous navigation and control.

Why TIAGo?

[Image: TIAGo Steel]

TIAGo Steel is a mobile manipulator with a 54 cm footprint, a 3 kg payload arm and 14 degrees of freedom (DoF). The platform uses standard open interfaces (ROS), has one day of endurance and a high degree of modular flexibility, and can be programmed remotely.

We’re offering this platform because its autonomous navigation, manipulation and perception abilities make it perfect for research into new robotic applications in the service sector. As it is a robust and reliable platform, we hope teams will be able to focus on the robot’s cognition, intelligence and autonomy, rather than engineering issues.

For full specifications, please email

What are the conditions of the loan?

Rent will be €650 a month, excluding taxes, and the rental fee will be deducted from the cost of a new TIAGo Steel (€49,900) if the teams wish to buy the platform at the end of the project. PAL Robotics will also provide a TIAGo to the four competing teams at the local Barcelona ERL Service Robots tournament, to avoid the expenses and issues that might arise during transportation.

Once the team has been selected, a loan contract will be signed stipulating that the end-user will accept full responsibility for returning the items in good condition at the end of the competition, or upon the request of PAL Robotics. As a result, it is mandatory that the end-user obtains insurance to cover damage or loss of the equipment.

Furthermore, the end-user commits to rent the robot for at least three months and to participate in the ERL Service Robots tournament in Barcelona.

All that’s left to say is… good luck!

If you have any questions, just shoot us an email –


Five minutes with NASA Space Robotics Challenge finalists, Team Olympus Mons


In a galaxy far, far away (PAL Robotics’ office), a team of engineers have been dealing with the aftermath of a dust storm on Mars… There’s never a dull moment here at PAL, and now aligning the orientation of a communications dish, deploying a new solar panel and fixing a leak in a Martian habitat can be added to the team’s ever-growing list of skills.

A team of current and former PAL Robotics employees have somehow found the time to enter NASA’s Space Robotics Challenge – and after undertaking a series of challenging preliminary rounds, they’re now one of the 20 finalists. With little experience of NASA’s software, and up against 91 other teams comprising some of the brightest minds in academia, industry and government from around the world, reaching the finals in January was a huge achievement in itself.


The final competition will be held in a virtual environment that’s built to represent life on Mars, and teams must program a virtual robot, modeled after NASA’s humanoid Robonaut 5 (R5), to complete a series of tasks in a simulation. With only a month left to go, the team are working nights and weekends to complete the three challenges ahead of June 11.

You can follow their progress on Twitter.

We managed to grab five minutes with team lead, Victor Lopez, who filled us in on some of the highlights and challenges of the team’s foray into space.

  1. Why did you decide to get involved?

We’re all hugely passionate about robotics, so any chance to try new skills and use new software is massively exciting. The competition has also been a great learning experience because it has given team members a chance to work on issues (e.g. navigation, walking or grasping) that are outside their usual remit. We’re all geeks! So in many ways, this is the challenge of a lifetime. Who wouldn’t want to enter a competition run by NASA?

  2. Why is your team called Olympus Mons?

Although we’re all current and former PAL Robotics employees, we’re now working across a wide variety of companies and fields, so we wanted something neutral – and space related, of course! Olympus Mons is the highest volcano on Mars and the largest in the solar system. It is two and a half times the height of Everest (above sea level), which we felt was a very appropriate name, as it often feels like a mountain of a challenge!

  3. What’s been your favourite moment so far?

Every time we reach a key milestone on a task it’s an exciting moment, as we’re one step closer to being able to successfully complete all three. But the moment we found out we’d qualified for the finals was particularly special – we hadn’t used R5’s software before, so we’re really proud to have got through to this stage. Our team is split into small groups, each working on a specific task or element, and it’s also been great to catch up with old friends and continue collaborating together.

  4. And the biggest challenge?

In order to simulate a realistic environment, NASA have added a 20 second delay to all communications – both to the robot and back to us – which makes everything harder to control. If the robot falls or performs an unexpected action, we won’t know until 20 seconds later. As a result, we’ve had to give the robot a great deal of autonomy and plan for all eventualities!

Because we’re not allowed to run our own ROS controllers on the robot, another big challenge has been to adapt the software we run on TALOS and REEM-C to go through the R5’s controllers. Learning how to use different tools and systems has definitely pushed us outside our comfort zones.

  5. What are the next milestones?

During the first week of June, we’ll have the opportunity to practice on the simulation. The final competition is the following week, so hopefully everything goes to plan – or you will see some very tired, coffee-fueled PAL employees at the beginning of June! We have our own Twitter account, so keep an eye on our feed for Olympus Mons news over the next few weeks.



Letting TIAGo loose in The Hague for RoboBusiness

From kilts to clogs, TIAGo certainly seems to be living it large in 2017! We had a great time at RoboBusiness (thank you RoboValley for the kind invitation), and we’re very excited to see what lies in store over the coming months and years for the newly merged RoboBusiness and TUS Expo.

Factory in a Day

TIAGo wasted no time collecting Factory in a Day’s Lego bag and proudly parading it around the RoboBusiness conference hall! A lively Lego competition later ensued in the PAL Robotics office (winners still to be announced).

Part-funded by the EU, Factory in a Day aims to improve the competitiveness of European manufacturing SMEs by removing the primary obstacle to robot automation: installation time and cost. The project, which finishes in September 2017, has spent the last four years making huge strides towards its aim of reducing system integration time to just one day, and we are proud to be a partner.

Robotics and healthcare

Robotics is starting to play an increasingly important role in the healthcare industry, and several of the European projects we’re involved in are rooted in healthcare.

EnrichMe is hoping to improve the quality of life for older people by means of technologies that enable independent health monitoring, complementary care and social support.

GrowMeUp is looking at ways to provide an affordable robotic system that can learn older peoples’ needs and habits over time – as their abilities and capabilities degenerate – so they’re able to maintain active, independent lives for longer.


Our CEO, Francesco Ferro, spoke in a panel on the subject with some of the most prominent roboticists in Europe: Maja Rudinac from Robotic Care Systems, Joris Jaspers from UMC Utrecht, Ivo Broeders from the Meander Medical Centrum at the University of Twente and Marcelo Ang from the National University of Singapore. The discussion was well attended and thought-provoking.


Making new friends (and catching up with old ones)

TIAGo never fails to make new friends at events, and RoboBusiness could well be the robot’s new personal record.


First up was local Dutch politician Thierry Baudet. Newly elected to the House of Representatives in The Hague, Baudet took a brief break from political duties to visit RoboBusiness – and particularly enjoyed learning about our robots’ capabilities and potential use cases.



Although TIAGo managed to acquire Lego from Factory in a Day and even persuaded the nice humans at Droneland to practice their drone landing skills, the robot was unable to convince Tesla to lend PAL Robotics a new car – not even for a test drive around the World Forum conference centre!


We also bumped into old friends at Harmonic Drive, manufacturers of robotic reducers, who we’ve collaborated with over the years to make some of the most advanced robotic arms in the world (including TIAGo’s!).


All in all, it was another great event for TIAGo and the PAL team. Thank you to everyone who came to visit our stand, and particularly to RoboValley for hosting us.

Next stop: Singapore.


Ethics in robotics: an interview with Dr. Michael Anderson

Dr. Michael Anderson, from the University of Hartford, researches the principles and practices that inform the ethical behavior of autonomous systems. He is using our mobile manipulator TIAGo to undertake research involving machine ethics, and to develop and test some of his ethical theories.

Learning to trust Artificial Intelligence

“Artificial Intelligence might serve as a backup of human intelligence when Earth comes to its ultimate demise,” Dr. Anderson argues.

Technology might help us preserve something that has taken billions of years of natural design to achieve and may even turn out to be unique in the universe.

To garner trust in Artificial Intelligence, and thus permit its continued development, ethical values must be incorporated within artificially intelligent agents.

Incorporating ethical principles in limited domains

“We build principles, ethical principles that drive a robot’s behavior.”

Dr. Anderson says that the incorporation of such principles in fully autonomous robots functioning within an unconstrained world presents a very complicated problem, and suggests that we begin our efforts in simpler domains.

Currently, robots are being constructed to take part in limited domains, such as taking care of the elderly or taking inventory inside stores. These contexts are composed of a limited collection of actions and ethical duties, making it easier for the robot to make decisions.

The role of machine learning

“When ethicists discuss certain specific cases there’s often agreement. We use those cases to learn what the principles are underneath that agreement.”

Dr. Anderson gives an example: “Imagine that a robot has to charge, and at the same time someone asks it to play ball, or deliver a medicine that would prevent a great deal of harm to a person.

Every action that the robot takes satisfies or violates a collection of ethical duties. The robot would need to consider: what are the duties involved?”

Dr. Anderson and his team use machine learning to take the cases in which ethicists agree, analyze how these duties are balanced and then use the resulting abstractions to decide the ethically correct action to take in each situation.
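One hedged way to picture this pipeline in code: represent each action as a vector of duty satisfactions (+1) and violations (-1), then fit duty weights from cases where ethicists agree on which action is right. The duties, cases and learning rule below are invented for illustration and are not Dr. Anderson’s actual model:

```python
import numpy as np

# Hypothetical duties; each action satisfies (+1), ignores (0) or
# violates (-1) each of them.
DUTIES = ["prevent_harm", "respect_autonomy", "stay_operational"]

# Training cases where ethicists agree:
# (duty vector of the agreed-correct action, duty vector of the rejected one).
CASES = [
    # delivering medicine (prevents harm) beats going off to charge
    (np.array([1, 0, -1]), np.array([-1, 0, 1])),
    # honouring a patient's refusal (autonomy) beats forcing a reminder
    (np.array([0, 1, 0]), np.array([0, -1, 0])),
]

def learn_weights(cases, lr=0.1, epochs=50):
    """Perceptron-style ranking: nudge duty weights until each
    agreed-correct action outscores its rejected alternative."""
    w = np.zeros(len(DUTIES))
    for _ in range(epochs):
        for good, bad in cases:
            if w @ good <= w @ bad:   # misranked: adjust towards the case
                w += lr * (good - bad)
    return w

w = learn_weights(CASES)

def choose(actions):
    """Pick the action whose weighted duty balance is highest."""
    return max(actions, key=lambda a: w @ np.array(a))

# New dilemma: go and charge vs deliver medicine.
print(choose([(-1, 0, 1), (1, 0, -1)]))   # → (1, 0, -1): deliver medicine
```

Because the learned weights balance all duties at once, no single duty acts as an absolute trump – which is the intuition behind the prima facie duty approach discussed next.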

The prima facie duty approach to ethics

Dr Anderson’s method is far from Asimov’s “Three Laws of Robotics”.

“Although Asimov’s Laws themselves may hold some validity, they are represented as a hierarchy where the first law is always the most important. Furthermore, it is not clear that the duties represented by these laws are complete.

We use instead what is called the prima facie duty approach to ethics, where no single duty always overrides the others, since there are situations in which another duty might take precedence. Any number of duties can be taken into consideration on an equal footing.”

Setting ethical limits

Where should ethical limits lie for robots? Dr. Anderson asserts that robots should only appear in situations where there is agreement on the ethics involved.

“You should not put a robot in a situation in which the ethics are not yet clear. If we don’t understand the ethics involved, a robot should not be there – the ethics should come first.”

More details of Dr. Anderson’s research can be found on his website.
