Can we empathize with social robots?
iCub, a child-like humanoid designed by the RobotCub Consortium, taken at VVV 2010
(Image by jiuguangw)
We increasingly rely on intelligent machines in our homes, workplaces and transit systems. Smartphones, desktop, notebook and tablet computers, industrial robots, autopilots in large passenger airplanes, and soon-to-be self-driving autos have quickly become our new normal.
For the most part these intelligent machines seem comfortably alien and subservient to us. They're built to do what we command even when their power greatly exceeds what our brains can do.
However, we do have some nightmarish dystopian fantasies like those in the Terminator and Matrix film series. Imagine an internet-based fusion of the most powerful computer programs into a global Mind that exerts control over us through the very technology we depend on. Much of the dread in these fantasies is of a world in which people will be conquered by impersonal machines, by soul-less things we can fear but not respect.
Yet the computer revolution is not just about rapid growth in the raw calculating power of machines. It has lately gone in a new direction that challenges the popular belief that even the most intelligent machine is merely that--a thing rather than a person. This is the dawning of the age of social robots.
One striking sign of this new age was Spike Jonze's film Her (2013), a best-picture nominee and winner of Academy, Writers Guild and Golden Globe awards for best original screenplay. It tells the story of Theodore, a lonely man in the near future who buys an operating system designed to evolve through conversation with its user. The system, which names itself Samantha, speaks to Theodore with a soothing feminine voice, and the bond between them grows from friendship to romantic love.
Although current conversational software is still not as fluent and resourceful as Samantha, it is progressing rapidly. There is also a flurry of research on enabling computers to infer emotional states from human facial expressions, tones of voice and gestures, and to respond appropriately.
For $500 you can buy Jibo, billed as "the world's first family robot." The YouTube video is worth a look. Popular Mechanics had an excellent article on Jibo: "This Robot Means the End of Being Alone" (11/18/14). Here is some of the author's description:
"The machine itself is a 6-pound tabletop device, 11 inches tall. . . small, rounded, spritely, mischievous. There is a dark, 5.7-inch touchscreen where its face would be, and on it an orb that is remarkably adept at implying human facial expressions. Hidden away in the head are stereo speakers and microphones, as well as tracking cameras that will allow Jibo to identify and [visually] follow people around the room," reminding them of an appointment and relaying a message.
The gap between Her's Samantha and Jibo will be filled sooner rather than later by a mobile robot with humanoid body parts and increasingly sophisticated perceptual and conversational skills that enable it to recognize and display emotional states. Even if this creature doesn't actually ask us, its presence in our midst will pose a momentous question: do we recognize it as having thoughts and feelings?
If we answer yes, then how should we treat it? This theme received very intelligent and poignant treatment in Ridley Scott's "Blade Runner" (1982) and Steven Spielberg's "A.I. Artificial Intelligence" (2001).
Many people believe that a robot, however accomplished and amiable, is just a thing, not a person. Like the humans in Blade Runner and A.I., such people cannot forget, no matter how valuable and fulfilling their relationships with humanoid robots may be, that behind the smiling robotic face there is only electronic machinery. That's not a self.
Of course the self "behind" a human face is not the brain most of us first meet up with in high school biology. Scientists have learned much about which sorts of brain events give rise to specific kinds of mental events. Nevertheless, the relationship between what happens in brain tissue and what we experience as sensations, feelings and thoughts remains a mystery. Brains are measurable objects in a public world, whereas selves and their feelings are not.
Interestingly, the dominant scientific model of brain function is the computer, the very sort of machine that puts a smile on the face of our robotic conversational partner. As I pointed out in a previous op-ed, researchers are trying to develop "cognitive neural prosthetics," computational devices that would replace damaged brain tissues and restore their functions.
How could a future human with neural prostheses and other synthetic body parts (e.g. hips or knees) reject the very idea of a robotic self?