
Bionic Research Background

For decades, scientists have been interested in developing a technique for translating brain activity into motor output; in other words, deciphering the brain's electrical patterns and converting them into coherent commands.

[Image: Miguel Nicolelis and Jose Carmena with a robot arm (Credit: Duke University)]

The first attempts at a brain-computer interface were made in the late 1960s and early 1970s. Human subjects were able to control the generation of certain brain waves, called alpha waves, that were picked up by an electroencephalogram (EEG), a noninvasive apparatus used to measure the brain's electrical activity, giving insight into certain states of mind, voluntary intentions, and visual stimuli.
Similar experiments were later conducted successfully with beta, mu, and theta rhythms. However, achieving greater accuracy in reading brain patterns required a more invasive method capable of recording signals from individual brain cells, or neurons. In the early 1970s, pioneering experiments by Eberhard Fetz and his colleagues at the University of Washington demonstrated that monkeys could be trained to control neural activity picked up by an electrode inserted adjacent to the neuron being monitored. A few years later, Edward Schmidt of the United States' National Institutes of Health raised the possibility that voluntary motor commands could be extracted from neural patterns and used to help severely paralyzed patients operate a prosthetic device.

It took nearly two decades after Schmidt's proposition for the technical bottlenecks to be overcome and for technologies to be developed that enabled the recording of neural patterns from multiple sites. These technologies allowed the demonstration that neuronal activity patterns could be recognized using pattern-recognition algorithms and, later, the first demonstration that neuronal population recordings in rats and rhesus monkeys could actuate a robotic device with a single degree of freedom. In a groundbreaking 1999 experiment from the laboratory of Miguel Nicolelis at Duke University, primate arm reaching was reproduced by reading brain patterns.

The past decade has produced a wealth of knowledge and scientific breakthroughs at a stunning pace, giving hope that brain-machine interfaces (BMIs) will be put into clinical use in the very near future.

Brain-Machine Interface – An Inside Look

The majority of motor functions in our body are driven by electrical currents originating in the brain's motor cortex and conducted through the spinal cord and peripheral nerves to the muscles, where the electrical impulse is converted to motion by the contraction and relaxation of specific muscles. For example, to bend the arm at the elbow joint, the biceps muscle contracts and the triceps relaxes. This seemingly simple movement is the result of the cumulative activity of many brain cells in the area of the cortex in charge of arm movement. Following a cognitive decision to bend the arm, these neurons generate an electric impulse that travels through the peripheral nerves, causing the correct muscles to contract or relax.

[Image: A neural network in the brain, demonstrating the complexity of neural connections (Credit: Willamette University, Oregon)]

The term used for neuronal activity is "action potential." Action potential occurs when an electric impulse
shoots through the long shaft of the neuron, called the axon. Each neuron has one axon but is connected to
many other neurons through chemical connections called synapses, and can influence other neurons or be
influenced itself by the activity of adjacent neurons, creating an extremely complex network of neural cells.

The action potential in a neuron can be measured by inserting an extremely thin electrode adjacent to the axon, where the passing electric current can be detected. The electrode counts the number of action potentials the neuron fires in one second, its firing rate, thus measuring its activity.

Most neuroscientists agree that the rate or frequency of the firing constitutes a sort of code for brain activity.
For instance, if a certain group of neurons fires action potentials at a high frequency together, the result is
the movement of a limb.
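
To make the rate-coding idea concrete, here is a minimal sketch in Python (the spike timestamps and the one-second window are made-up illustrative values, not data from any experiment) that estimates a neuron's firing rate by counting its action potentials within a time window:

```python
import numpy as np

def firing_rate(spike_times, window_start, window_end):
    """Estimate a neuron's firing rate (spikes per second) within a time window."""
    spikes = np.asarray(spike_times)
    in_window = (spikes >= window_start) & (spikes < window_end)
    return in_window.sum() / (window_end - window_start)

# Hypothetical spike timestamps (in seconds) recorded from a single electrode.
spike_times = [0.02, 0.09, 0.15, 0.31, 0.40, 0.41, 0.55, 0.72, 0.80, 0.95]

rate = firing_rate(spike_times, 0.0, 1.0)
print(f"Estimated firing rate: {rate:.1f} spikes per second")  # 10.0 spikes per second
```

A decoder built on this idea compares such rates, measured from many neurons at once, against the rates observed during known movements.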

Noninvasive BMIs

BMIs can be divided into two main groups: invasive and noninvasive. Noninvasive BMIs rely on reading the brain's activity without actually piercing the brain surface. The EEG is one of the earliest noninvasive BMIs, measuring the combined activity of massive groups of brain neurons through voltage differences between different parts of the brain. The EEG is performed by placing approximately 20 electrodes on the scalp; these electrodes are connected by wires to an amplifier, through which the signal is converted to a digital reading, which can then be filtered by a computer to remove any artificial interference. Once connected to the EEG, the subject can be shown different stimuli, and the brain's electrical patterns in response to the stimuli can be studied.

[Image: Mind-controlled gaming (Credit: Emotiv)]

Some EEG BMIs rely on the subject’s ability to develop control of their own brain activity using a feedback
system, whereas others use algorithms that recognize EEG
patterns that appear with particular voluntary intentions. Virtual-reality systems have been used to supply patients with efficient
feedback systems, and subjects have been able to navigate
through a virtual-reality setting by imagining themselves
walking or driving. These systems can also be used for gaming
as previously described on TFOT.
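
As a simplified illustration of how such pattern-recognition algorithms can work, here is a minimal Python sketch (the 256 Hz sampling rate, the mu and beta frequency bands, the synthetic signals, and the nearest-centroid classifier are illustrative assumptions, not a description of any particular EEG system) that extracts band-power features from a one-second EEG trial and assigns it to the closest stored class:

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Average spectral power of the signal in the band [low, high] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def features(trial, fs=FS):
    """Two-feature vector: mu-band (8-12 Hz) and beta-band (13-30 Hz) power."""
    return np.array([band_power(trial, fs, 8, 12),
                     band_power(trial, fs, 13, 30)])

def classify(trial, centroids):
    """Assign the trial to the class whose stored feature centroid is nearest."""
    feats = features(trial)
    return min(centroids, key=lambda label: np.linalg.norm(feats - centroids[label]))

# Synthetic example: 'rest' trials dominated by a 10 Hz mu rhythm, 'imagined
# movement' trials with suppressed mu power (a common motor-imagery signature).
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
rest = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(FS)
imagery = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(FS)

centroids = {"rest": features(rest), "imagined movement": features(imagery)}
new_trial = 0.25 * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(FS)
print(classify(new_trial, centroids))  # expected: imagined movement
```

Real systems use many channels, more robust features, and classifiers trained on many trials per subject, but the principle of matching a trial's spectral signature to learned patterns is the same.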

EEG-based BMIs have been implemented to help patients suffering from body paralysis, such as the motor-neuron disease ALS. By generating certain brain patterns that are then read by the EEG, patients are able to control a computer cursor and indicate their intentions, and thereby communicate with the external world. EEGs are also reported to have enabled severely disabled tetraplegic patients to grasp an object using a paralyzed hand. In these cases, the patient generated certain brain waves that were detected by an EEG and converted into external electrical muscle stimulation, which allowed the contraction of the muscles and movement of the paralyzed limb.

[Image: EEG, a noninvasive method of establishing a BCI. Subjects are hooked into a virtual-reality setting while their brain activity is monitored by an EEG, and train using the biofeedback setting to manipulate the virtual reality using their thoughts alone (Credit: Rochester Institute of Technology)]

EEGs have many shortcomings, owing to extensive overlap of electrical activity in the brain as well as electrical artifacts. To achieve better resolution, electrodes can be inserted between the skull and the brain, without piercing the brain tissue, reportedly yielding a higher-resolution picture of brain activity. Although noninvasive BMI techniques can improve the quality of life of some disabled patients by allowing them a limited and slow capacity for communication, they are unlikely to hold the solution for allowing patients to perform complex tasks that involve multiple degrees of freedom, such as controlling a robotic arm. Such tasks will more likely be achieved through invasive techniques.

Invasive BMIs

While noninvasive BMIs provide only a vague picture of a subject's intentions and state of mind by reading the activity of massive groups of neurons, invasive BMIs can read activity at a much higher resolution, down to the level of a single neuron.

The main challenge in creating a BMI is deciphering the firing "code" and converting it to a meaningful
movement. An electrode inserted into the appropriate region of the brain cortex can measure the voltage of
many adjacent cells, and with the use of electronic filters, it can isolate the voltage of a single cell. It is then
potentially possible to determine the "preferred direction" of a certain cell in a human's brain by measuring
the firing rate as the subject moves his or her hand in different
directions. The preferred direction of a cell is the direction the
hand is moving when the firing rate of that cell is at its
highest.

[Image: Diagram representing the calculation of a population vector; in D, the preferred direction of a single cell is marked, and the total frequency is summed up, giving the population vector (Credit: Yale University)]

In clinical trials involving monkeys, the problem with this type of reading was that it varied greatly from trial to trial, even though the movement executed was virtually identical. A monkey may move a computer cursor using a BMI exactly the same distance, from left to right, and a given neuron being measured may fire at rate X, but the next time the movement is performed, the neuron could fire at rate Y. Despite this, averaging across many trials reveals fairly consistent firing patterns. By measuring the preferred direction of many cells in the vicinity of the electrode and averaging over the population of neurons, it is possible to extrapolate a "population vector": the sum of the preferred directions of many cells, which gives a preferred direction and signal intensity for the cell population and can, in turn, imply the intended movement of the limb.
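
To make the population-vector calculation concrete, here is a minimal Python sketch (the preferred directions, baseline rates, and observed firing rates are made-up numbers for illustration only): each cell contributes a unit vector along its preferred direction, weighted by how far its firing rate rises above baseline, and the weighted vectors are summed.

```python
import numpy as np

def population_vector(preferred_angles_deg, firing_rates, baseline_rates):
    """Estimate intended 2-D movement direction from a population of neurons.

    Each cell contributes a unit vector along its preferred direction,
    weighted by how far its firing rate is above (or below) baseline.
    """
    angles = np.deg2rad(np.asarray(preferred_angles_deg))
    weights = np.asarray(firing_rates) - np.asarray(baseline_rates)
    # Sum of the weighted preferred-direction vectors.
    vx = np.sum(weights * np.cos(angles))
    vy = np.sum(weights * np.sin(angles))
    direction_deg = np.degrees(np.arctan2(vy, vx)) % 360
    intensity = np.hypot(vx, vy)
    return direction_deg, intensity

# Hypothetical population of four cells with preferred directions 90 degrees apart.
preferred = [0, 90, 180, 270]   # degrees
baseline  = [10, 10, 10, 10]    # spikes per second at rest
observed  = [22, 18, 4, 6]      # spikes per second during the movement

direction, strength = population_vector(preferred, observed, baseline)
print(f"Predicted movement direction: {direction:.0f} deg, intensity {strength:.1f}")
# Roughly 34 degrees: up and to the right, dominated by the cells firing above baseline.
```

The direction of the summed vector suggests the intended movement direction, and its length reflects the signal intensity of the population.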
It is not currently possible to aim electrodes at the exact same location in each subject's brain; furthermore, the brain's architecture and cell layout differ from subject to subject. Therefore, in order to move a robotic limb by thought alone, there must be a dual process of learning. On one side, a computer, using a learning algorithm, adjusts itself to the subject's brain activity; on the other side, the subject exploits the brain's plasticity and learning abilities so that the cells surrounding the electrodes learn to generate movement. This process
is similar to tool usage in primates. Primates have the almost unique ability to assimilate the use of tools to
achieve their goals. When an ape uses a new tool to reach for a banana, or when a human learns to drive a
car or play the piano, a learning process and visual and tactile stimuli are involved, and the motor cortex
remaps the firing of neurons to gain proficiency.
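
The computer's half of this dual learning process can be sketched, in deliberately simplified form, as an adaptive decoder. In the Python sketch below, the linear decoder, the learning rate, the simulated Poisson firing rates, and the hidden "true" mapping are all illustrative assumptions rather than any published BMI algorithm; during calibration the intended movement is assumed known (for example, from a cued target), and the decoder's weights are nudged toward whatever mapping best converts the recorded firing rates into that intention.

```python
import numpy as np

rng = np.random.default_rng(1)

n_neurons = 20
baseline = 10.0                                            # resting firing rate, spikes/s
true_mapping = rng.standard_normal((2, n_neurons)) * 0.3   # hidden "encoding" of velocity
weights = np.zeros((2, n_neurons))                         # decoder starts knowing nothing
learning_rate = 0.002

for trial in range(5000):
    rates = rng.poisson(lam=baseline, size=n_neurons).astype(float)
    modulation = rates - baseline                          # firing relative to baseline
    # During calibration the intended velocity is assumed known (cued target);
    # here it is simulated from the hidden mapping.
    intended_velocity = true_mapping @ modulation
    decoded_velocity = weights @ modulation                # what the decoder produced
    error = intended_velocity - decoded_velocity
    # Gradient step: nudge the decoder toward the subject's intention.
    weights += learning_rate * np.outer(error, modulation)

print("Remaining decoder error:", np.linalg.norm(true_mapping - weights))
```

In a real closed-loop system the subject adapts at the same time as the algorithm, which is why the brain and the decoder are often described as co-adapting.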

The Current Status of BMIs

As far back as 2003, 53-year-old Jesse Sullivan, an electrical technician who lost both arms in a work accident, was the first amputee to be fitted with a bionic arm able to perform simple tasks (video). The nerves that normally innervated the arm muscles were surgically relocated to the pectoralis muscle in Sullivan's chest, where electrodes were able to pick up nerve impulses being sent out to the no-longer-existing arm muscles. A computer then converted the nerve impulses to gross arm movements. Similar, newer devices that use electrical activity in the remaining arm muscles to operate a bionic hand are commercially available from the United Kingdom-based company Touch Bionics.

[Image: Jesse Sullivan, double amputee from Tennessee, tests a new DARPA arm (Credit: Rehabilitation Institute of Chicago)]

Jesse Sullivan's bionic arm is controlled by peripheral nerve endings connected to his chest muscles, but this solution doesn't work for patients without nerve endings, such as patients with cervical spine injuries. Furthermore, movements are limited by surgical abilities to rewire nerves. The greater challenge is to be able to control a prosthetic device using direct readings from the brain.

One of the most stunning achievements in BMI research was teaching primates to control a robotic actuator by reading their neurons' firing patterns. The animals eventually stopped moving their own limbs, performing the tasks using the mechanical actuator alone (video).

Brain control of robotic arms has reached such an advanced level that primates can be trained to control a robotic arm and feed themselves a banana using thought alone (video).

Algorithms that improve the interpretation of neuronal signaling are constantly being developed, along with better neuronal-recording capabilities. Scientists are able to read up to 100 electrodes simultaneously, allowing higher accuracy in recording the brain's intentions. Several companies are working on miniaturizing the computational hardware needed for reading, interpreting, actuating, and learning across the complex neuro-bionic interface, and on reducing the weight and bulk of the batteries needed.
Commercial companies are working on similar technologies for
more severely disabled patients, like those who suffer from
ALS. The United States company Cyberkinetics has initiated
clinical trials in human patients for a brain probe that would
allow patients to communicate through a computer by moving a
cursor on the screen using only thoughts (video).

[Image: Scheme of connecting a monkey's brain to a robotic arm (Credit: Duke University Department of Neurobiology)]

It is obvious that the main challenge in the future will be the long-term functionality of the electrode, which will have to be tackled by biological solutions. There is also a need to develop a fully implantable array of electrodes that can read from multiple neurons (more neurons mean more accuracy) and to minimize the size and bulkiness of the equipment, making it safe for use without risking brain damage or infection. A solution currently being tested on animals is the use of wireless communication between the brain probe and the computer, which will probably be the most practical alternative to the inconvenience of being plugged into an array of wires.

Once these major obstacles are overcome, we could be looking at BMIs and bionic limbs in clinical practice in the very near future.

The Future of BMI

One of the next challenges in the field of BMI prosthetics is making them feel like normal limbs. A normal limb has a sense of touch and proprioception, the process by which sensory feedback to the brain conveys the location and position of the body's muscles, allowing us to be aware of the arm's position without having to look. This is accomplished by an array of receptors in the muscles and joints, as well as mechanical receptors in the skin, which enable us to know when we are touching an object. The next generation of prosthetic arms will have proprioception and "feeling," generating feedback pulses to the brain or to nerve endings that will give their bearers an almost natural feel for their bionic limbs.

[Image: Utah's electrode array (Credit: University of Utah)]

It seems that today, more than ever, BMIs that can operate bionic prosthetics are within our grasp. The
Defense Advanced Research Projects Agency (DARPA) has set an ambitious goal of releasing a fully
functioning bionic arm for Food and Drug Administration (FDA) approval by 2009. This arm will have far
more degrees of freedom than any other available prosthetic, and in 2011 DARPA is planning to release a
prosthetic that has nearly all the motion ability and dexterity of a normal limb, including touch and
proprioception. Theoretically, an amputee using this arm will be able to play the piano.
A future type of BMI for patients with paralyzed limbs or spinal cord injuries will send efferent motor impulses directly to the muscles of the limb. Unlike the situation with amputees, in spinal cord injuries the muscles are functional but nerve impulses aren't able to reach them. A muscle-stimulating BMI will be able to bypass the severed point and directly innervate the muscles through small electric currents. Robotic arms and hands approaching the agility and sensitivity of the human hand already exist and have been covered recently by TFOT (see here and here).

[Image: Normann artificial vision (Credit: John A. Moran Eye Center, University of Utah)]

BMI technologies are not confined to prosthetics and paralyzed limbs. In the future, BMIs may allow blind
people to see using an artificial picture-capturing device, much like a camera. Several methods for visual
prosthetics have already been used successfully with patients. These methods use a computer chip
implanted on the retina that is fed by a miniature camera on a patient's glasses. The chip stimulates the
optic nerves, transmitting a picture to the brain. Devices used today allow patients to see vague shapes or
distinguish light from dark, but future devices, such as the Cortical Visual Prosthesis being developed by the
Illinois Institute of Technology, will allow improved synthetic vision.
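
As a rough sketch of the signal path from camera to implant, the following Python snippet (the 10x10 electrode grid, the brightness-to-current scaling, and the current range are purely illustrative assumptions; real retinal and cortical implants use far more sophisticated image processing) downsamples a grayscale camera frame onto a coarse electrode grid and converts each block's average brightness into a stimulation level:

```python
import numpy as np

def frame_to_stimulation(frame, grid=(10, 10), max_current_ua=50.0):
    """Map a grayscale camera frame onto a coarse electrode grid.

    Each electrode's stimulation current is set proportional to the average
    brightness of the image region it covers (values are illustrative only).
    """
    h, w = frame.shape
    gh, gw = grid
    pattern = np.zeros(grid)
    for i in range(gh):
        for j in range(gw):
            block = frame[i * h // gh:(i + 1) * h // gh,
                          j * w // gw:(j + 1) * w // gw]
            pattern[i, j] = block.mean() / 255.0 * max_current_ua
    return pattern

# Hypothetical 240x320 grayscale frame from the glasses-mounted camera.
frame = np.random.default_rng(2).integers(0, 256, size=(240, 320))
print(frame_to_stimulation(frame).round(1))
```

The coarseness of the grid is one reason current devices convey only vague shapes or light-versus-dark contrast; denser electrode arrays are what future devices aim to provide.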

The John A. Moran Eye Center at the University of Utah has developed a chip that interfaces with the visual cortex, but it could also be applied to other BMI applications. The chip contains an array of electrodes that can be individually stimulated, are small enough to be inserted into brain tissue without much damage, and at the same time are strong enough to withstand the insertion procedure. Some of these devices have been successfully implanted in blind people, with positive results. Future generations of these devices will lead to improved resolution and, ultimately, the restoration of sight to the blind.

What we are witnessing today is only the tip of the iceberg of the great potential BMIs hold for medical, military, recreational, and other purposes in the future. BMI research is on the threshold where science meets science fiction. There will surely be exciting news emerging from this field in the very near future.

[Image: Professor Eilon Vaadia (Credit: Hebrew University)]

Interview with Professor Eilon Vaadia

TFOT recently interviewed Professor Eilon Vaadia from the Department of Physiology in the Hebrew
University of Jerusalem’s Faculty of Medicine to understand more about current and future research in the
field of BMI.

Q: Could you please briefly describe your research and future goals?

A: We are working on:

• Understanding how the brain controls movements.
• How the brain learns new motor skills.
• How the brain copes with perturbations; for example, how do you know the force required to lift a carton of milk if you don't see how full or heavy it is?
• Using BMI to study the above.
• Using BMI in future clinical applications.

Q: Much of BMI research is conducted on primates. For some people, this may seem cruel or
unethical. Do you feel that using primates is necessary, and are there any alternatives?

A: It is absolutely essential [to test BMI research on primates] if we think one day we will use it in humans. The research attempts to see how the brain learns, and simulation or machines cannot help in this. It is trivial to interface a "neural network" with computers, but it does not teach us how to do it in a real brain. The only alternative is humans, and I am very much against experiments on humans. (Some doctors or scientists may want to do it, or are already doing it.) On the other hand, mankind will have to decide if improving and saving human life justifies animal research. Ethics and morals should be reexamined and tested from time to time.

Q: What are the advantages of using an invasive BMI to operate prosthetics over the already-available noninvasive bionic limbs that utilize the severed nerve endings?

A: Whenever possible, “bionic limbs” of equal performance are the preferred solution. BMI is only useful
when other solutions don’t provide the desired goal.

Q: What are the major bottlenecks that need to be passed before we can see a commercially
available BMI operating a prosthetic limb?

A: The most important one is the quality of implantable devices to monitor electrical activity of neurons with
high spatial and temporal resolutions. The rest is “just” engineering that requires lots of funds and hard work.

Q: Future prosthetic limbs will allegedly allow their bearers a sense of proprioception and touch.
How will this be made possible?

A: Various solutions, though long-range research is required. The major way is through smart (adaptive, responsive, biological-like) electric stimulation of nerves, the spinal cord, and the brain.

Q: Operating a robotic actuator by thought could theoretically be done either by "decoding" the
brain's activity patterns for voluntary action or by a process of learning, in which the subject
"rewires" his neural patterns to adjust to the system. Which of these mechanisms do you believe is
the most dominant?

A: The answer is long and at the heart of our research. I think we need a lot of both: good decoding
algorithms that are highly adaptive and can learn, and methods to facilitate plasticity and appropriate
changes of neuronal activity. From what we see till now, the brain is a good collaborator — it changes itself
quite rapidly to adapt to the machine.

Q: Do you believe that implementing BMI technology in humans will be easier or harder than in
animals, considering the easier communication and better learning skills with humans?

A: Indeed it may be easier in humans, especially when there are better, more stable technologies for
monitoring neuronal activity.

Q: What is your vision for BMIs in the next 10 to 50 years, and how far do you believe the technology will reach in the more distant future?

A: Depending on a revolution in monitoring brain electrical activity, and revolutions in our understanding of
neural code, the limit will have to be dictated by ethics rather than technology. In the very long range,
humans could drive cars, cook, read while asleep, and transmit thoughts to others by BMI, if humans want to
go this way. The ethical issues are extremely important.
