Robot skin: The robot that can 'feel' real pain

Is it ethical to create a robot for the purpose of pain?

By Olivia Higgins

A demonstration of a robot’s skin sensors. The red lights indicate sensations.

[Source: Technical University of Munich, 2019]

When we think of robots being able to ‘feel’ pain, it might seem like a pretty cool, albeit bizarre, notion. We might also be forgiven for not taking the pain too seriously - it is just a robot, after all. However, creating a robot that can mimic pain - as close to genuine human distress as possible - may be key to helping robots better empathise with us humans, and make them more compassionate companions in the process. This is particularly important given the rise of collaborative tasks in human-robot interaction over the past 15 years. By creating a ‘skin’ and software that evoke the right levels of pain, a robot may be better equipped to interpret, decode and adapt to human behaviour.

In our previous article, we discussed how human-like robots can appear creepier as they increasingly resemble humans - known as the Uncanny Valley Effect (UVE) - and how the UVE can be used to develop robots that are less unsettling and can better assimilate into society. In this article, we delve deeper into this assimilation by looking at the creation of robots that can feel pain, and what this means for robot rights and AI consciousness.

Synthetic pain: Understanding pain in robots

The child robot can produce over 100 human expressions, with pain being a primary research focus.

(Source: Youtube/Hisashi Ishihara, 2021)

Back in 2019, the Technical University of Munich’s (TUM) robotics team revealed a sensor-equipped robot skin that visually displays different ‘sensations’ (pictured above, first photo).

Less than two years later, a team of scientists led by Minoru Asada of the Japan Robot Association took this a step further, presenting Affetto (pictured above, second photo) at the American Association for the Advancement of Science meeting in Seattle. Affetto is the first child robot with synthetic skin equipped with a new tactile sensor that allows it to ‘feel’ pain. Unlike the TUM robot, which lights up its sensors when touched, Affetto’s tactile sensor, embedded in synthetic skin, is multi-sensory and registers a range of more advanced sensations.

The system works by applying an electrical charge to Affetto’s skin, sending an impulse for his programmed ‘nervous system’ to react to. This redesigned synthetic skin is revolutionary compared with previous pain systems because it detects all types of pressure changes on the skin. For example, if we were to stroke his arm gently or punch him hard, Affetto would react differently depending on the level of ‘pain’ he registered as a result of our action.
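
To make the idea concrete, here is a minimal, hypothetical sketch in Python of how a pressure-to-reaction pipeline like this might be structured. The pressure thresholds, units and expression names below are our own illustrative assumptions, not details published by the Affetto team.

```python
# Illustrative sketch only: maps a tactile reading to a graded "pain" level
# and a facial reaction. All thresholds and names are assumed for illustration.
from dataclasses import dataclass


@dataclass
class TouchEvent:
    pressure_kpa: float  # hypothetical pressure reading from the skin sensor
    duration_s: float    # how long the contact lasted


def pain_level(event: TouchEvent) -> float:
    """Return a pain intensity in [0, 1]; gentle touches map to 0, hard hits to 1."""
    GENTLE_THRESHOLD = 5.0   # below this, treated as a gentle touch (placeholder value)
    PAIN_SATURATION = 50.0   # at or above this, maximum pain response (placeholder value)
    if event.pressure_kpa <= GENTLE_THRESHOLD:
        return 0.0
    scaled = (event.pressure_kpa - GENTLE_THRESHOLD) / (PAIN_SATURATION - GENTLE_THRESHOLD)
    return min(1.0, scaled)


def react(event: TouchEvent) -> str:
    """Choose a facial expression based on the computed pain level."""
    level = pain_level(event)
    if level == 0.0:
        return "smile"           # gentle stroke: a pleasant reaction
    if level < 0.5:
        return "wince"           # moderate pressure: mild discomfort
    return "pained grimace"      # hard contact: strong pain expression


if __name__ == "__main__":
    print(react(TouchEvent(pressure_kpa=2.0, duration_s=1.0)))   # -> smile
    print(react(TouchEvent(pressure_kpa=60.0, duration_s=0.2)))  # -> pained grimace
```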

Affetto reacts in pain as an electrical charge is applied to his skin.

(Source: Youtube/Hisashi Ishihara, 2021) 

According to the scientists: “…the primary goal is to construct a symbiotic society with artificially intelligent robots, and a robot that can feel pain is a key component of how we can achieve that assimilation.”

Dr. Hisashi Ishihara, one of the lead designers of the robot, added that both empathy and human characteristics should be a priority if robots and humans are to live in harmony over the long term. Dr. Ishihara believes that robots will be more effective at social bonding with humans when we increase their sensitivity and expressivity in physical form, and that one day humans will create robots that are indistinguishable from humans. To achieve this, however, a proper pain framework needs to be established across the entire tech and robotics field; without it, he added, there will always be a mismatch in human-robot interactions and communications.

Dr. Ishihara said: “Many android robots can show smiling faces but we can feel something is wrong.”

His point relates back to the UVE, in which a perfectly fine, harmless robot can make us feel unsettled. The perceived threat arises less from the fact that the robot is smiling and more from its perceived lack of empathy. It could be argued that the smile actually makes this perception worse, causing the human to reject the initial social interaction with the robot. As such, the team seeks to improve overall human-robot interaction by making the robot more approachable and empathic through a richer range of sensations.

So, with these seemingly mechanical beings gaining empathy and skin sensations, is it odd that robots are now being created just to feel pain?

Protecting robots: The rise of robot rights

“In Japan, we believe all inanimate objects have a soul, so a metal robot is no different from a human in that respect - there are less boundaries between humans and objects,” expressed Minoru Asada, head researcher of the Japan Robot Association.

The same holds for David J. Gunkel, a leading professor on robot rights, who believes that we need to fundamentally rethink how we view robots. In his book, Robot Rights, he pushes to expand the philosophical debate on the matter.

In a recent article in the MIT Press Reader, Gunkel argues that many of the mainstream arguments against robot rights are founded on faulty assumptions and problematic reasoning. Firstly, there is the mischaracterisation that ‘robot rights’ would mean granting robots human rights. Secondly, and perhaps more importantly, there is the misconception that because ‘robot rights’ are not equatable to ‘human rights’, they are invalid. Both points of view, and others besides, are unjustified, according to Gunkel.

David J. Gunkel posted this photo to his Twitter timeline to advocate for robot rights.

(Source: Twitter/David J. Gunkel, 2020)

Coming back to the key question of this article - whether it is ethical for a robot to be made for one purpose, to feel pain - Gunkel asserts that “rights are not one kind of thing, they are manifold and complex.”

Tony Prescott, Professor of Cognitive Robotics at the University of Sheffield, said in response to Robot Rights that “robots are a new kind of entity – not quite alive, and yet something more than machines.”

While Robot Rights dissects the question of whether robots should have rights from every angle, Gunkel and his fellow advocates argue that a necessary, open conversation needs to be had - one that forces us to rethink how we approach AI ethics and our core perceptions of human-robot interaction - setting the stage for what may become one of the most important ethical debates of this century.

A GIF of the child robot being activated for pain.

(Source: GIF/Hisashi Ishihara, 2021)

Developing a new framework to understand robot consciousness

Other scientists and roboticists, such as Muhammad Anshar, who completed his Ph.D. on synthetic pain, believe that before we try to understand pain, there should be an industry standard for understanding robot consciousness. A self-awareness framework is a key part of Anshar’s research at the University of Technology Sydney’s MagicLab.

PhD graduate Muhammad Anshar interacts with his prototype robot using his new pain framework.

[Source: University of Technology Sydney]

His work focussed on developing an appropriate concept of pain for robots using a new framework - a simplified ‘pain matrix’ method for AI activation - in the hope that it would pave the way for a more standard industry model for regulating pain and detecting robot consciousness.
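
Anshar’s published framework is conceptual rather than a piece of code, but to illustrate the general idea, here is a minimal, hypothetical sketch of what a simplified ‘pain matrix’ could look like in software: normalised sensor channels are projected through a weighting matrix onto pain-related dimensions. The channel names, dimensions and weights are assumptions made purely for illustration, not part of his published method.

```python
# Hypothetical sketch of a simplified "pain matrix": sensor channels (rows)
# are projected onto pain-related dimensions (columns). Weights are placeholders.
import numpy as np

CHANNELS = ["pressure", "temperature", "strain"]
DIMENSIONS = ["intensity", "urgency", "awareness"]

PAIN_MATRIX = np.array([
    [0.8, 0.6, 0.9],   # pressure
    [0.5, 0.7, 0.4],   # temperature
    [0.6, 0.4, 0.5],   # strain
])


def evaluate_pain(readings: np.ndarray) -> dict:
    """Project normalised sensor readings (one value per channel, each in [0, 1])
    through the matrix to obtain a score for each pain dimension."""
    scores = readings @ PAIN_MATRIX
    return dict(zip(DIMENSIONS, scores.tolist()))


if __name__ == "__main__":
    # e.g. moderate pressure, no heat, slight strain
    print(evaluate_pain(np.array([0.6, 0.0, 0.2])))
```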

If a successful standard is eventually formed, it could bring a number of benefits for human-robot interaction, such as improving communication between humans and machines, with machines able to empathise better with their human companions. An improved pain framework could also push industry leaders in social robotics to produce more ethically, having to consider both human and robot rights.

The rise of advocacy groups and experts leading this new robot rights debate allows for a more open discussion of pain theory and of what it means to be a machine, and helps drive the paradigm shift towards a future in which robots are seamlessly integrated with humans.

In the next part of this article, we will assess AI consciousness and what it means for current companies.

Edited by Jamie R.C.