Can Robots Feel Your Pain?

Not yet, but empathic androids are the next wave in robotics.

Perhaps no quality seems more human than our ability to empathize with others. Yet today scientists and engineers around the world are developing an oxymoron for the 21st century: the empathic robot. 

It sounds like science fiction, but for years researchers around the world (especially in Japan and Korea) have been trying to build autonomous, human-like machines that could serve as domestic servants and provide around-the-clock care to the elderly or terminally ill—services that will become invaluable, they imagine, as the world’s elderly population skyrockets in the coming decades. Electronic caregivers will need to be able to detect emotional signs of distress or anxiety—in order, for instance, to provide companionship and offer gentle reminders to take medication.

“[Robots] are increasingly being designed to serve as pets, nurses, office assistants, tour guides, teachers, domestic servants, and even emotional companions,” says Kwan Min Lee of the University of Southern California, who studies communication between humans and machines. These new applications for robots have caused an important shift in the study of human-robot interaction. “Rather than viewing robots as mere tools or senseless machines, researchers are beginning to see robots as social actors that can autonomously interact with humans in a socially meaningful way.”

To do that, machines will not only need to be able to understand which emotions a human is feeling. They’ll also need to respond with an emotionally appropriate behavior—be it through facial expressions, body posture, gaze direction, voice, or touch. These are all methods the robots Kismet (pictured) and Leonardo use to communicate emotion with their human companions. Designed and built by Cynthia Breazeal and her colleagues at the Massachusetts Institute of Technology Media Lab, Kismet and his cute, furry successor Leonardo are touted as the world’s first empathic androids. Kismet uses three digital cameras, three microphones, 21 encoders, and a host of mechanisms to “interact physically, affectively, and socially with humans in order to learn from them,” explains Breazeal in her article “Toward Sociable Robots.” Kismet is programmed with “stimulation drives” that cause it to seek out companionship and play with toys. It can even get bored: When no one talks to it, the robot head’s eyes sweep the room, looking for someone to play with.

The design of Kismet’s social “brain” was influenced by University of Cambridge psychiatrist Simon Baron-Cohen’s work on autism, in which he identified four brain modules—Intentionality Detector, Eye Direction Detector, Shared Attention Mechanism, and Theory of Mind Mechanism—that are necessary for everyday social interaction.

“Kismet was a breakthrough in the design of social robots, in that unlike previous robots, it was the first robot equipped with those modules needed for normal human social interaction,” says Lee. For example, Kismet’s equivalent of an “eye direction detector” module allows him to follow a human’s gaze, focusing his attention on the same object the human is looking at—and then react appropriately to that stimulus.

The designs of sociable robots like Kismet have also been shaped by University of California, San Francisco, social psychologist Paul Ekman’s Facial Action Coding System, a taxonomy of human facial expressions. Breazeal and colleagues drew on scientific maps showing how emotion is expressed in the human face and voice, then endowed Kismet with “face actuators” and an “articulatory-based speech synthesizer.” Kismet’s raised brows, for example, are designed to mimic human “attentional activity” that expresses both fear and surprise. When described in technical jargon, Kismet sounds rather cold—but lab experiments by Lee and other researchers, including Cory Kidd at MIT and Cliff Nass at Stanford, have shown that people can warmly interact with sociable robots and even derive emotional satisfaction from mechanical company.

Though robotics is now drawing on psychological research into human social and emotional intelligence, Lee believes that one day the knowledge will flow in the other direction. “In the future, I believe studies on social robots will give us many new insights on the nature of our social brain,” he says. “Social robots can be used as an excellent simulation tool to investigate the nature of human emotion, empathy, and social interaction.”


Source: https://greatergood.berkeley.edu/article/item/can_robots_feel_your_pain