Close your eyes and touch your finger to your nose. Underlying this seemingly simple action is a sophisticated algorithm that uses complex sensory feedback to continuously adjust the firing rates of the motor neurons coordinating the movements. For Jose Carmena, the goal is to understand this feedback loop well enough to enable a physically handicapped person to control a robot arm.

EECS Professor Jose Carmena. (Photo by Peg Skorpinski)
Carmena works on "brain-machine interfaces," or BMIs: devices that intercept neural signals and use them to directly control a computer cursor or a prosthetic limb. The field is still in its infancy, but the hope is that one day, people will be able to operate artificial limbs as naturally as they do real ones. Ultimately, such devices could even become the human-computer interface of the future, but for now, most BMI research focuses on primates.

Carmena wants to develop BMIs that not only enable patients to perform complex tasks, such as tossing a ball or grasping and lifting objects of different sizes and weights, but also give them proprioception, a sense of where the limb is in space relative to the rest of the body. Ideally, "the patient will 'feel' what the robotic arm is grasping and where the robotic arm is in space," says Carmena. "This is not what we have today, but it's where we are moving."

Before coming to Berkeley in 2005, Carmena was part of a group at Duke University that demonstrated that macaque monkeys could use a BMI to reach and grasp with a robot arm. The researchers first trained the monkeys to perform three types of tasks using their own limbs to operate a joystick. They then implanted arrays of hundreds of electrodes in the frontal and parietal lobes of the monkeys' brains. Using linear models, the researchers simultaneously extracted a variety of motor parameters, such as hand position, velocity, and gripping force, from the activity of the monkeys' neurons while they performed each task, and then altered the apparatus so that the robot arm was driven directly by the models' output rather than by the joystick. Eventually, the monkeys realized that their physical movements had no effect. "They would sit back and rest their arms" and perform the tasks with their brains alone, says Carmena.
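To give a rough sense of this kind of decoding, the sketch below fits a linear model by least squares that maps binned neural firing rates to motor parameters, then reads the "arm" commands from the model's output alone. It is a simplified, hypothetical illustration on synthetic data: the variable names, the plain Poisson firing-rate simulation, and the single-time-bin model are assumptions made here for clarity, not the Duke group's actual pipeline, which among other things used a short history of each neuron's activity rather than a single bin.

# Hypothetical sketch, not the Duke group's code: decode motor parameters
# from binned firing rates with a linear model fit by least squares.
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_neurons, n_params = 5000, 100, 5   # params: x/y position, x/y velocity, grip force

# Synthetic stand-ins for recorded data: Poisson spike counts per time bin,
# and motor parameters that depend linearly on them plus noise.
true_w = rng.normal(size=(n_neurons, n_params))
rates = rng.poisson(lam=5.0, size=(n_bins, n_neurons)).astype(float)
kinematics = rates @ true_w + rng.normal(scale=0.5, size=(n_bins, n_params))

# Fit the decoder on the first 80% of bins ...
split = int(0.8 * n_bins)
w, *_ = np.linalg.lstsq(rates[:split], kinematics[:split], rcond=None)

# ... then drive the "robot arm" from the decoder's output alone.
predicted = rates[split:] @ w
for k, label in enumerate(["pos x", "pos y", "vel x", "vel y", "grip"]):
    r = np.corrcoef(predicted[:, k], kinematics[split:, k])[0, 1]
    print(f"{label}: correlation with actual = {r:.3f}")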

Remarkably, operating a BMI affects the brain's structure. Carmena and colleagues demonstrated that the influence of a particular neuron on a movement adapts over time, which raises an issue Carmena finds particularly intriguing: whether it's possible to program a brain. "We've shown that we can decode signals from the brain and that it adapts in response to feedback, but can we encode information from the prosthetic device back into it?" he asks. "This is unexplored territory."

One of Carmena's current projects is to improve the technology used in BMIs. Recently, a group of researchers at Brown University implanted electrodes in the brain of a 25-year-old volunteer patient, enabling him to open email and move a cursor on a computer screen. But before BMIs can be routinely implanted in humans, researchers must solve a problem with the implanted electrodes: current arrays work for no more than a year because, over time, small movements of the recording electrodes damage the neurons and weaken the signal.

Together with Robert Knight, director of Berkeley's Helen Wills Neuroscience Institute, Carmena is exploring a different, less invasive electrode technology in which the electrodes sit beneath the skull, resting on the surface of the brain without penetrating it.