REVIEW

Brain-computer interface: the future in the present

Levitskaya OS1, Lebedev MA2

1 Cyber Myonics, Moscow, Russia

2 Department of Neurobiology, Duke University, Durham, North Carolina, USA

Correspondence should be addressed: Olga Levitskaya
ul. Marshala Biryuzova, d. 30, kv. 45, Moscow, Russia, 123060; olia_levits@mail.ru

Received: 2016-03-11 Accepted: 2016-03-25 Published online: 2017-01-05
Fig. 1. A BCI-based robotic arm capable of grasping objects
Extracellular activity of cortical neurons was recorded with multi-electrode arrays implanted into several cortical areas of the monkey. Signals were decoded using Wiener filters and transmitted to the robotic arm controller. On a screen, the monkey was presented with a cursor that changed its size depending on the gripping force the animal applied. The task was to reach toward a virtual object after it appeared on the screen and to grasp it. In one task the monkey controlled the robot with a hand-held joystick with two degrees of freedom, and the gripping force was set by how hard the animal squeezed the joystick. In another task the joystick was disconnected from the robot, and the robot was controlled directly by commands decoded from motor cortical activity (Carmena et al., [28]).
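The Wiener-filter decoding used in these experiments is, at its core, a multivariate linear regression from lagged neural firing rates onto kinematic variables. A minimal sketch on synthetic data follows; the array sizes, the `lagged_design` helper, and the Poisson/Gaussian noise model are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data standing in for recorded activity:
# T time bins of firing rates for N neurons, with a 2-D hand position
# generated from the rates by a known linear map plus noise.
T, N, LAGS = 500, 20, 5
rates = rng.poisson(5.0, size=(T, N)).astype(float)
true_w = rng.normal(size=(N * LAGS + 1, 2))

def lagged_design(rates, lags):
    """Stack the last `lags` bins of every neuron plus a bias column."""
    T, N = rates.shape
    X = np.ones((T - lags + 1, N * lags + 1))
    for k in range(lags):
        X[:, 1 + k * N:1 + (k + 1) * N] = rates[k:T - lags + 1 + k]
    return X

X = lagged_design(rates, LAGS)
position = X @ true_w + rng.normal(scale=0.1, size=(X.shape[0], 2))

# Fit the Wiener (linear least-squares) decoder and reconstruct position.
w, *_ = np.linalg.lstsq(X, position, rcond=None)
decoded = X @ w

r = np.corrcoef(decoded[:, 0], position[:, 0])[0, 1]
print(f"correlation between decoded and actual x: {r:.3f}")
```

In practice the filter weights are fit on a calibration block of joystick-controlled trials and then applied online to stream decoded kinematics to the robot.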
Fig. 2. Reproduction of kinematics of bipedal walking based on ensemble cortical activity
Activity of neuronal ensembles in the monkey sensorimotor cortex was recorded while the animals walked on a treadmill. Blue curves represent movements recorded by a video tracking system; red curves represent movements decoded from cortical activity (Fitzsimmons et al., [30]).
Fig. 3. Schematic of the first brain-computer-brain interface
The motor part of the loop sets the cursor in motion: the desired cursor position is decoded from motor cortical activity. The sensory part of the loop provides feedback by delivering artificial tactile signals to the somatosensory cortex through intracortical microstimulation (O'Doherty et al., [80]).
Fig. 4. Integration of brain activity of several subjects using a brain net
Each monkey sat in a separate room and watched a virtual arm on a screen; the task was to touch a virtual object with the arm (A). Signals from several cortical areas were recorded by a 700-channel invasive electrode array. After decoding, the signals were sent to the virtual arm, with the monkeys contributing equally to all coordinates (B), each monkey controlling a single coordinate (C), or each monkey controlling one plane (D). The shared tasks were performed more effectively than when a single animal controlled the virtual arm (Ramakrishnan et al., [81]).
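The pooling schemes in panels (B)-(D) reduce to simple combination rules over the per-subject decoded commands. A hypothetical sketch follows; the function name, shapes, and the two-subject example are illustrative, not taken from the paper:

```python
import numpy as np

def pool_commands(commands, mode="shared", assignment=None):
    """Combine per-subject decoded 3-D commands into one arm command.

    commands: array of shape (n_subjects, 3)
    mode: "shared"      -> equal average over all subjects (scheme B)
          "partitioned" -> each coordinate taken from an assigned subject
                           (schemes C/D); `assignment` is a length-3 list
                           of subject indices, one per coordinate.
    (Illustrative helper, not code from the study.)
    """
    commands = np.asarray(commands, dtype=float)
    if mode == "shared":
        return commands.mean(axis=0)
    if mode == "partitioned":
        return np.array([commands[s, i] for i, s in enumerate(assignment)])
    raise ValueError(mode)

# Two subjects issuing (x, y, z) commands:
cmds = [[1.0, 0.0, 2.0], [3.0, 4.0, 0.0]]
print(pool_commands(cmds, "shared"))                  # equal weighting
print(pool_commands(cmds, "partitioned", [0, 1, 0]))  # x, z from subject 0
```

The interesting empirical point is that such pooled control can outperform single-subject control, since averaging cancels part of each subject's decoding noise.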