Apostolos P. Georgopoulos,
M.D., Ph.D., Director,
Center for Cognitive Neurosciences,
University of Minnesota
Recent news reports have described what sounds like a miracle: restoring the ability of a paralyzed woman “to feed herself chocolate and move everyday items using a robotic arm directly controlled by thought, showing a level of agility and control approaching that of a human limb”[1]. Neuroscientists have made this possible by defining the specific steps in the process:
(1) identifying the brain signals that contain the requisite movement information;
(2) developing suitable means to record those signals safely and continuously;
(3) extracting the essential motor command information by computer processing of the neural signals;
(4) designing, developing and constructing a prosthetic arm that performs nearly all of the functions of the human arm;
(5) interfacing the processed brain signals to the device, creating an integrated Brain-Machine Interface (BMI);
(6) finally, improving the quality and effectiveness of the BMI by training the brain to operate the prosthetic limb. This training phase also evaluates additional brain signals, and additional features built into the device, that improve brain-machine communication and enhance brain-machine integration. The resulting operations make the prosthetic device more accurate, faster, and more enduringly useful[1,2].
It is obvious that this is a multidisciplinary endeavor, encompassing such diverse fields as neurophysiology, material sciences, mechanical and electrical engineering, and computer science, to mention but a few. No wonder it took many years before tangible progress could be made towards a practical and useful application. Ultimately, the goal is to enable the long-term safe and useful application of the BMI technology to human subjects in need of a prosthetic arm.
To that end, a large effort has been expended in developing and testing BMIs in animals, mainly behaving monkeys. Although today the results of human testing are beginning to appear, the bulk of our knowledge on BMI development, application and evaluation has come from studies in monkeys.
Initially, achievements in this field were driven by the desire to replace an amputated limb with a prosthetic arm under brain control. It was an educated guess that signals recorded in the motor cortex, the area that commands and controls arm movement, would be the right ones for the job. However, there was a problem. Although electrical stimulation of the motor cortex had been known since the 19th century to evoke muscle contractions, this fact alone was not useful for prosthetic control, because the use of our arms is embedded in surrounding space (e.g., reaching somewhere) and the use of our hands is embedded in object manipulation (e.g., grasping a cup of coffee). Thus, having at our disposal motor cortical signals that activate muscles would not help in prosthetic control; what was lacking was knowledge of how such signals command movements of the arm in surrounding space.
Work on motor cortical activity in behaving monkeys, from 1966[4] to 1982, was preoccupied with its relations to muscle contraction or simple joint movements, without reference to extra-personal space. But in 1982 it was discovered that the activity of neurons in the motor cortex varied in an orderly fashion with the direction of arm movement in space, such that the activity of a given neuron was highest when the arm moved in a certain direction (the cell’s “preferred direction”) and decreased progressively as movements deviated farther and farther from that preferred direction[5].
This basic discovery marked the birth of directional tuning in the motor cortex and, later on, in all motor-related areas of the brain. In turn, motor directional tuning is at the root of all current arm neuroprosthetic control[1,2].
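Quantitatively, the tuning relation just described is well approximated by a cosine function of the angle between the movement direction and the cell's preferred direction. The following minimal sketch illustrates this; the baseline and modulation values are illustrative, not measured parameters:

```python
import math

# Cosine directional tuning: a cell's firing rate peaks when the
# movement direction matches its preferred direction and falls off
# smoothly with angular distance. b0 (baseline rate) and k (modulation
# depth) are illustrative values, in spikes/s.
def firing_rate(movement_dir_deg, preferred_dir_deg, b0=20.0, k=15.0):
    angle = math.radians(movement_dir_deg - preferred_dir_deg)
    return b0 + k * math.cos(angle)

# A cell preferring 90 deg fires most for movements at 90 deg and
# least for movements in the opposite direction (270 deg).
for d in (0, 90, 180, 270):
    print(d, round(firing_rate(d, preferred_dir_deg=90), 1))
```

Note the symmetry: movements at 0 and 180 deg (equidistant from the 90-deg peak) yield identical rates, which is why a single cell cannot uniquely specify direction.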
The directional tuning curve is symmetric around the preferred direction (the peak of the curve) and therefore cannot, by itself, uniquely specify the direction of movement, since cell activity is the same on either side of the peak. However, different motor cortical neurons have different preferred directions, covering, as an ensemble, the whole 3-D movement space; the movement information is contained in the population of motor cortical cells. This recognition of the information in the activity of populations of neurons led to the calculation of the neuronal population vector, which combined the directional tuning and the diversity of preferred directions to yield an outcome that accurately predicted the direction of intended arm movement in space. Thus, by referring motor cortical activity to space (via the directional tuning curve) and summing weighted contributions of individual cells (along their preferred directions), a spatial measure (the neuronal population vector) was obtained from purely temporal, non-spatial neuronal activity.
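The population-vector computation can be sketched in a few lines: each cell votes along its preferred direction, weighted by how far its firing rate rises above baseline, and the vector sum points in the intended movement direction. A 2-D toy example follows (the original analyses were 3-D; all rates and parameters here are simulated, not recorded data):

```python
import math

B0, K = 20.0, 15.0  # illustrative baseline and modulation depth, same for all cells

# Simulated cosine-tuned firing rate for a cell with preferred direction pd_deg.
def rate(move_deg, pd_deg):
    return B0 + K * math.cos(math.radians(move_deg - pd_deg))

# Population vector: weighted sum of unit vectors along preferred directions,
# with weights equal to each cell's rate change from baseline.
def population_vector(rates, pds_deg):
    x = sum((r - B0) * math.cos(math.radians(pd)) for r, pd in zip(rates, pds_deg))
    y = sum((r - B0) * math.sin(math.radians(pd)) for r, pd in zip(rates, pds_deg))
    return math.degrees(math.atan2(y, x)) % 360  # decoded direction in degrees

# 8 cells whose preferred directions tile the circle every 45 degrees.
pds = [i * 45 for i in range(8)]
true_direction = 40  # deg; note no single cell prefers this direction
rates = [rate(true_direction, pd) for pd in pds]
decoded = population_vector(rates, pds)
print(round(decoded, 1))  # recovers the 40-deg movement direction
```

No individual cell in this toy ensemble prefers 40 degrees, yet the weighted vector sum recovers it exactly — the essence of reading direction from a population rather than from single neurons.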
The neuronal population vector had another remarkable property, beyond its accuracy, which paved the way for its use in modern neuroprosthetics: it provided the correct directional information well before the movement was produced, by as much as 50-100 msec. An observer placed in the motor cortex could thus accurately predict the upcoming movement, millisecond by millisecond. It was further found that by joining population vectors, calculated at short successive time intervals, into a continuous tip-to-tail sequence, a “neural movement trajectory” could be computed and visualized[9,10].
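The tip-to-tail construction can be illustrated with the same toy cosine-tuned ensemble: compute a population vector in each time bin and append it to the end of the running path. All rates below are simulated, not recorded data:

```python
import math

B0, K = 20.0, 15.0          # illustrative baseline and modulation depth
PDS = [i * 45 for i in range(8)]  # 8 preferred directions tiling the circle

# Simulated cosine-tuned rates of all cells for one intended direction.
def rates_for(move_deg):
    return [B0 + K * math.cos(math.radians(move_deg - pd)) for pd in PDS]

# Population vector as (x, y) components rather than an angle.
def pop_vector(rates):
    x = sum((r - B0) * math.cos(math.radians(pd)) for r, pd in zip(rates, PDS))
    y = sum((r - B0) * math.sin(math.radians(pd)) for r, pd in zip(rates, PDS))
    return x, y

# Intended direction curving from 0 to 90 deg over successive time bins.
directions = [0, 30, 60, 90]
trajectory = [(0.0, 0.0)]
for d in directions:
    vx, vy = pop_vector(rates_for(d))
    px, py = trajectory[-1]
    trajectory.append((px + vx, py + vy))  # add each vector tip-to-tail

print([(round(x, 1), round(y, 1)) for x, y in trajectory])
```

The resulting sequence of points traces a curved neural trajectory that mirrors the curving intended movement, even though each time bin contributes only a momentary direction estimate.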
Directional tuning and the neuronal population vector essentially solved the problem of how to extract meaningful direction and trajectory information from the motor cortex and translate it into useful control signals. The next challenge was to develop safe recording devices suitable for high-quality, chronic recordings. Such devices have been developed during the past 20 years, and recordings from dense multi-electrode arrays can now be carried out in both monkeys and humans[1-3]. What remains uncertain is the long-term safety of the implants. By definition, such implants are foreign bodies which may elicit, over the years, local reactions in the brain tissue with unforeseeable effects. Although no such effects have been observed as yet, the prognosis for multi-year application remains to be determined.
A related concern is the maintenance of high-quality recordings over the long term. Such considerations have led to the development of other approaches for obtaining neural information, specifically recording electrical potentials from the cortical surface. This is a very fast-evolving field.
The development of multi-electrode arrays has been complemented by the fast-advancing computer processing of the recorded spike trains for use by prosthetic arm controllers. The progress here is nothing short of impressive. The demonstration of reaching and grasping in humans[1,2] and monkeys using motor cortical signals to control a prosthetic arm and hand is convincing and presages more such successful applications in the future.
An important trend is the effort to extract more and more information from the neural population signals to achieve and improve control of more complex functions of the prosthetic arm and hand.
A promising approach is the tapping of other cortical areas for that purpose, in addition to the motor cortex. This will provide the extra information but at the expense of additional implants with any potential associated risks of side effects. Finally, it should be noted that, following implantation and the availability of good quality chronic recordings, a gradual improvement of the BMI performance should be expected over time due to continuous adaptation and learning.
Ultimately, the limit of usefulness in BMI lies with the kind and quality of prosthetic arm that can be produced. Ideally such an arm would be as “anthropomorphic” as possible (i.e. it should be almost like a human arm, including the hand, with all associated functionalities).
An important advance emerged from the Revolutionizing Prosthetics program launched by the Defense Advanced Research Projects Agency (DARPA) in 2006. According to DARPA’s website, “after six years of development, the Revolutionizing Prosthetics program developed two anthropomorphic advanced modular prototype prosthetic arm systems, including sockets, which offer increased range of motion, dexterity and control options.” The use of such sophisticated arms in neuroprosthetics is eagerly awaited.
In summary, BMI is a fast-evolving field with great promise. Advances are continuously being made on all fronts, including recording methods, extraction of information from brain signals, refinement of prosthetic arms, and the communication between the brain and the prosthetic arm.
It is a triumph of neuroscience, engineering, technology, and medicine, and it has yet to reach its limits.
1. Collinger et al (2012) Lancet. doi: 10.1016/S0140-6736(12)61816-9 [Epub ahead of print]
2. Hochberg et al (2012) Nature 485: 372-375
3. Velliste et al (2008) Nature 453: 1098-1101
4. Evarts (1966) J Neurophysiol 29: 1011-1027
5. Georgopoulos et al (1982) J Neurosci 2: 1527-1537
6. Georgopoulos (1995) Trends Neurosci 18: 506-510
7. Schwartz et al (1988) J Neurosci 8: 2913-2927
8. Georgopoulos et al (1986) Science 233: 1416-1419
9. Georgopoulos et al (1988) J Neurosci 8: 2928-2937
10. Schwartz (1994) Science 265: 540-542
11. Andersen et al (2010) Annu Rev Psychol 61: 169-190
12. Taylor et al (2002) Science 296: 1829-1832