Exclusive to OpEdNews: Op Eds, 11/1/16

Sensation in an artificial hand: another step toward our posthuman future

By Brian Cooney

Researcher Rob Gaunt prepares Nathan Copeland for a brain-computer interface sensory test. (Image by UPMC/Pitt Health Sciences Media Relations)
The summary in Science Daily of a recent breakthrough in neurotechnology was arresting: "Imagine being in an accident that leaves you unable to feel any sensation in your arms and fingers. Now imagine regaining that sensation, a decade later, through a mind-controlled robotic arm that is directly connected to your brain."

The patient (Nathan Copeland) is a quadriplegic who lost all motor and sensory function in his arms and legs in an accident 12 years ago. Sensory input from his limbs can't reach his brain because of damage to his spinal cord. He volunteered for a Brain Computer Interface (BCI) experiment at the University of Pittsburgh Medical Center (UPMC).

A team of researchers led by Robert Gaunt implanted arrays of microelectrodes in Copeland's brain in two areas that are involved in sensory input from, and motor control of, the hand. (A microelectrode is small enough to be able to penetrate the membrane of a single neuron, and can either record or induce the cell's characteristic electrochemical activity.)

The robotic arm was not attached to Copeland's body except through the wiring of the BCI. When the experimenter manipulated the fingers of the robotic arm, a blindfolded Copeland, using a schematic drawing of a human hand, was able to identify which fingers or pairs of fingers were being touched and pressed. He said the sensations felt "natural."

Gaunt's group was building on the work of another UPMC team that had enabled a human subject to control the movements of a prosthetic arm via a BCI. That team had developed algorithms by which a computer connected to the implanted microelectrodes could translate patterns of brain activity into electronic signals that moved the fingers and other parts of a robot hand and arm. In December of 2014 they reported that a quadriplegic patient (Jan Scheuermann) with "complete loss of upper limb motor control" was able to move "the almost human hand of a robot arm with just her thoughts to pick up big and small boxes, a ball, an oddly shaped rock, and fat and skinny tubes."

The precision Scheuermann showed in moving the robot arm was remarkable because her sensory feedback was limited to vision. When we move our arms and hands to grip an object, we normally feel what's going on--we have what are called "somatic" sensations. We see what we're doing, but we also get cutaneous sensations such as touch, pressure, temperature and pain, as well as proprioceptive sensations from muscles, tendons and joints that tell us the relative positions of adjacent body parts and the strength of the effort we're making.

That is why, according to the Gaunt team's report, Scheuermann's "prosthetic limb movements were often slower than able-bodied movements, . . . as might be expected when somatosensation is absent and vision is the sole source of sensory feedback." Their limited goal was to enable their subject to experience touch and pressure in the robotic hand Scheuermann had moved. (They were not yet trying to elicit proprioceptive sensation.)

What the Gaunt team achieved was both very limited and yet momentous in its implications. It was limited in three ways: (1) the robotic arm was not a functioning prosthesis attached to the subject's body; (2) the range of somatic sensations was limited to touch and pressure; and (3) the sensations were not experienced as feedback from the subject's own controlling of the robotic arm.

To appreciate what was momentous in the Gaunt team's experiment, we need to analyze the difference between visual awareness of our own bodies, and how we experience them through somatic sensation. This will help us understand what Jennifer Collinger, a member of the UPMC team working with Jan Scheuermann, said in a personal communication to Sliman J. Bensmaia (Univ. of Chicago) in 2015: "Thought-controlled neuroprostheses without somatosensory feedback are experienced by patients as disembodied robots, despite the fact that they can control them by mere thought."

When I look at my hand in front of me, what do I see in it that makes me experience it as mine? The complete perception of my hand includes cutaneous and proprioceptive sensation. If I had voluntary control of my hand but only visual sensation of it, would I experience the hand as part of myself, or would it be just a thing I could move at will--something "disembodied"?

If I accidentally press the head of a pin, I could say that it hurt my finger or that it hurt me, since I hurt where my finger hurts. That finger, like any other skin area where a pin or needle penetrates, is part of me. Cutaneous sensations, by mapping onto the visual image of my body, are a major component of the experience of embodiment.

Somatic sensations belong to the category of feelings: types of experience that have in common a sense of self. Somatic sensations are localized feelings, linked to specific areas of our bodies; other feelings, such as sadness or joy, have little or no bodily location. Vision, on the other hand, is not a feeling. It is only the combination of seeing and feeling my body that yields the experience of embodiment.

These recent achievements in medical technology aim to compensate people who have lost arms or legs, or who have lost sensorimotor function in those limbs through spinal damage. But they are moving us into a bionic future in which we will be able not just to replace what is broken, but also to create bodies with enhanced capabilities. The transition will feel natural, since somatosensory feedback will enable us to experience these bodies as ourselves. The implications of this posthuman phase are dizzying.



I'm a retired philosophy professor at Centre College. My last book was Posthumanity-Thinking Philosophically about the Future (Rowman & Littlefield, 2004). I am an anti-capitalist.
