Encoding Sensory Feedback for a Neuro-Prosthesis
Subramaniam Venkatraman, Kristofer Pister and Jose M. Carmena
Brain-controlled prostheses have the potential to improve the quality of life of a large number of paralyzed people by allowing them to enact their voluntary motor intentions by thought alone. We are pursuing the scientific and technical developments required to make this technology accessible to users. This work demands new neural interfaces as well as a better understanding of how the brain performs sensory and motor tasks. Techniques for analyzing the large quantities of data recorded from neurons, and for decoding user intentions from those data, also need to be improved.
Excellent results have been demonstrated, including a monkey controlling a robotic arm [1] and transmitting information at a high data rate [2]. Hence, we now have the ability to "read" from the brain and decode its intentions to a limited extent. We are interested in exploring the use of cortical microstimulation to provide the user with information about the spatial location of the prosthetic device (proprioception) as well as tactile information from the end effector. In other words, this technology will allow us to "write" to the brain and thus provide feedback to the user of a prosthetic arm. Successful encoding of this sensory feedback should lead to realistic sensations and thereby increase performance accuracy while controlling a neuro-prosthesis.
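As an illustration of one simple way such tactile feedback might be encoded, the sketch below maps a sensed fingertip force to a microstimulation pulse train using a linear rate code. The function name, parameter values, and the linear force-to-frequency mapping are assumptions for illustration, not the authors' actual encoding scheme.

```python
import numpy as np

def force_to_pulse_times(force, duration=0.1, f_min=20.0, f_max=300.0,
                         force_max=5.0):
    """Map a sensed fingertip force (newtons) to stimulation pulse times.

    A simple linear rate code (illustrative values only): pulse
    frequency scales with normalized force between f_min and f_max Hz.
    """
    level = np.clip(force / force_max, 0.0, 1.0)
    rate = f_min + level * (f_max - f_min)      # pulses per second
    n_pulses = int(rate * duration)
    # Evenly spaced pulse onsets over the stimulation window (seconds)
    return np.linspace(0.0, duration, n_pulses, endpoint=False)

light = force_to_pulse_times(0.5)   # light touch -> sparse pulse train
firm = force_to_pulse_times(4.0)    # firm grasp -> dense pulse train
```

A richer encoder would also need to respect the dynamic range and safety limits of cortical microstimulation, but even this sketch captures the core idea: graded sensor readings from the end effector are translated into graded patterns of stimulation.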
[1] J. M. Carmena et al., "Learning to Control a Brain-Machine Interface for Reaching and Grasping by Primates," Public Library of Science Biology, Vol. 1, 2003, pp. 193-208.
[2] G. Santhanam, S. I. Ryu, B. M. Yu, A. Afshar, and K. V. Shenoy, "A High-Performance Brain-Computer Interface," Nature, Vol. 442, 2006, pp. 195-198.