Publication Videos

2024 | Brain control of bimanual movement enabled by recurrent neural networks (Deo et al. Sci Rep)
Simultaneous bimanual control of two cursors via RNN decoding. In this movie, participant T5 uses a BCI to control two cursors in real-time to targets on a computer monitor. An RNN converts neural activity into velocities for both cursors at each timestep.
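The decoding step described above can be sketched in miniature. The following is an illustrative toy, not the authors' trained network; the channel count, hidden size, bin width, gain, and random weights are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 192   # assumed number of neural features per time bin
N_HIDDEN = 64      # assumed hidden size; the paper's network differs
N_OUT = 4          # (vx, vy) for cursor 1 + (vx, vy) for cursor 2

# Random weights stand in for a trained decoder.
W_in = rng.normal(0, 0.1, (N_HIDDEN, N_CHANNELS))
W_rec = rng.normal(0, 0.1, (N_HIDDEN, N_HIDDEN))
W_out = rng.normal(0, 0.1, (N_OUT, N_HIDDEN))
b = np.zeros(N_HIDDEN)

def rnn_decode(spike_counts, dt=0.02):
    """Map a (T, N_CHANNELS) array of binned spike counts to cursor
    positions by emitting a 4-D velocity at every timestep and
    integrating it. Returns a (T, 4) array: [x1, y1, x2, y2]."""
    h = np.zeros(N_HIDDEN)
    pos = np.zeros(N_OUT)
    out = np.empty((len(spike_counts), N_OUT))
    for t, x in enumerate(spike_counts):
        h = np.tanh(W_in @ x + W_rec @ h + b)  # recurrent state update
        vel = W_out @ h                        # decoded velocities
        pos = pos + vel * dt                   # integrate to positions
        out[t] = pos
    return out

positions = rnn_decode(rng.poisson(1.0, (50, N_CHANNELS)))
print(positions.shape)  # (50, 4)
```

In a real closed-loop session the decoded positions would drive the on-screen cursors, and the participant's neural activity would respond to the resulting visual feedback.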
Sequential unimanual movement vs. simultaneous bimanual movement. The same as Supplemental Movie 1, except T5 uses two different movement strategies: (1) sequential unimanual (moving one cursor at a time), and (2) simultaneous bimanual (moving both cursors simultaneously). A separate RNN decoder is used for each movement strategy.
RNN vs. linear decoder for two-cursor control. The same as Supplemental Movie 1, except with only unimanual trials. An RNN decoder is compared to a linear decoder for online control of two cursors. This task was limited to unimanual trials to focus on the differences between decoders.
Online two-cursor control with raw and temporally altered training data. Same as Supplemental Movie 1, except with only unimanual trials: one cursor is cued on any given trial while the other cursor stays ‘locked’ in place. This version of the task was used to focus on the differences between decoders. One decoder was trained with raw training data and the other with temporally altered training data.

2021 | High-performance brain-to-text communication via handwriting (Willett et al. Nature)
Copying sentences in real-time with the handwriting brain-computer interface. In this video, participant T5 copies sentences displayed on a computer monitor with the handwriting-brain computer interface. When the red square on the monitor turns green, this cues T5 to begin copying the sentence.
Hand micromotion while using the handwriting brain-computer interface. Participant T5 is paralyzed from the neck down (C4 ASIA C spinal cord injury) and only generates small micromotions of the hand when attempting to handwrite. T5 retains no useful hand function.
Freely answering questions in real-time with the handwriting brain-computer interface. In this video, participant T5 answers questions that appear on a computer monitor using the handwriting brain-computer interface. T5 was instructed to take as much time as he wanted to formulate an answer, and then to write it as quickly as possible.
Side-by-side comparison between the handwriting brain-computer interface and the prior state of the art for intracortical brain-computer interfaces. In a prior study (Pandarinath et al., 2017) participant T5 achieved the highest typing speed ever reported with an intracortical brain-computer interface (39 correct characters per minute using a point-and-click typing system). Here, we show an example sentence typed by T5 using the point-and-click system (shown on the bottom) and the new handwriting brain-computer interface (shown on the top), which is more than twice as fast.

2021 | Effects of Peripheral Haptic Feedback on Intracortical Brain-Computer Interface Control and Associated Sensory Responses in Motor Cortex (Deo et al. IEEE Trans Haptics)
Experimental block of trials demonstrating neural control of a computer cursor with continuous skin-shear haptic feedback driven by cursor motion. This task is referred to in the article “Effects of Peripheral Haptic Feedback on Intracortical Brain-Computer Interface Control and Associated Sensory Responses in Motor Cortex” as the ‘Haptics’ condition during closed-loop cursor control. Specifically, the video depicts ‘Haptics’ Block 17 from trial day 688.

2021 | Home Use of a Percutaneous Wireless Intracortical Brain-Computer Interface by Individuals With Tetraplegia (Simeral et al. IEEE Trans Biomed Eng)
The video shows participant T5 using the wireless iBCI to point and click in the Windows native on-screen keyboard to free-type in the Windows Notepad app.
The video shows participant T10 using the wireless iBCI to point and click in apps (Pandora, NCAA) on a tablet computer running Windows. The camera view shows T10 lying on his side in bed (with a red neck pillow). The tablet is mounted a meter or so away on a flexible holder.

2020 | Replay of learned neural firing sequences during rest in human motor cortex (Eichenlaub et al. Cell Rep)
A Real-Time Video of Participant T10 Performing the Sequence Game, Related to Figure 1. In this clip of four consecutive sequences, the repeated sequence (blue-yellow-teal-red) occurs three times and a control sequence (red-blue-teal-yellow) occurs once.
A Real-Time Video of Participant T9 Performing the Sequence Game, Related to Figure 1. In this clip of four consecutive sequences, the repeated sequence (red-teal-yellow-blue) occurs three times and a control sequence (red-blue-yellow-teal) occurs once.

2020 | Hand Knob Area of Premotor Cortex Represents the Whole Body in a Compositional Way (Willett et al. Cell)
Real-Time, Discrete Neural Decoding of Attempted Movements from among 16 Possible Directional Movements Spanning the Wrists and Ankles.
Real-Time, Discrete Neural Decoding of Attempted Movements from among 32 Possible Movements Spanning the Hands, Arms, Feet, and Legs from Both Sides of the Body.

2019 | Neural ensemble dynamics in dorsal motor cortex during speech in people with paralysis (Stavisky et al. eLife)
Example audio and neural data from eleven contiguous trials of the prompted syllables speaking task.
The progression of neural population activity during the prompted words task is summarized with dimensionality reduction chosen to highlight the condition-invariant ‘kick’ after the go cue, followed by rotatory population dynamics.
The same neural trajectories as Video 2, but aligned to acoustic onset (AO), are shown from 3.5 s before AO to 1.0 s after AO.

2019 | Volitional control of single-electrode high gamma local field potentials by people with paralysis (Milekovic et al. J Neurophysiol)
T6 day 453 exploration block 1. The participant explored different actions and evaluated the quality of cursor control gained from each of the actions.
T6 day 453 test block 3. The participant controlled the cursor to reach the targets. The score (top left corner of the screen) increased when the cursor overlapped with the target.

2018 | Cortical Control of a Tablet Computer by People with Paralysis (Nuyujukian, Sanabria, & Saab et al. PLoS ONE)
Participant T6 – web browsing & email
Participant T9 – video search & streaming music
Participant T5 – chat & weather
Tasks of Interest – T6 piano & T9 calculator
Cross-coast iBCI chat between T6 and T9

2018 | Inferring single-trial neural population dynamics using sequential auto-encoders (Pandarinath et al. Nature Methods)
Generator initial states inferred by LFADS are organized with respect to kinematics of the upcoming reach.
LFADS reveals consistent rotational dynamics on individual trials.
Multisession LFADS finds consistent representations for individual trials across sessions.

2018 | Stable long-term BCI-enabled communication in ALS and locked-in syndrome using LFP signals (Milekovic & Sarma et al. J Neurophysiol)
Session T6 473, 3rd copy-phrase block using historical decoder.
Session T2 523, 1st copy-phrase block using historical decoder.
Session T6 473, 3rd copy-phrase block using same-day decoder.
Session T2 523, 1st copy-phrase block using same-day decoder.
Session T2 524, 1st free-spelling block without word completion.
Session T2 523, 1st free-spelling block with word completion.
Session T6 493, 1st free-spelling block with word completion.
Session T6 493 normalization block.
Session T6 473, 2nd open-loop calibration block.

2018 | Rapid calibration of an intracortical brain–computer interface for people with tetraplegia (Brandman et al. J Neural Eng)
Demonstration of rapid calibration with GP-DKF in T5 (0–60 seconds), followed by 30 seconds of the Grid Task. Trial day 30.
Demonstration of rapid calibration with GP-DKF in T8 (0–60 seconds), followed by 30 seconds of the Grid Task. Trial day 662.
Demonstration of rapid calibration with GP-DKF in T10 (0–60 seconds, 150–180 seconds), followed by 30 seconds of the Grid Task. Trial day 112.

2017 | Restoration of reaching and grasping movements through brain-controlled muscle stimulation in a person with tetraplegia: a proof-of-concept demonstration (Ajiboye & Willett et al. Lancet)
FES+iBCI single-joint movements done with virtual reality feedback (for comparison with virtual reality arm movements). Video taken 420 and 421 days after implant.
FES+iBCI multi-joint movements (elbow and hand) done with virtual reality feedback (for comparison with virtual reality arm movements). Decoder output, stimulation patterns, and joint angle measurements are shown. Video taken 337 days after implant.
Functional task video of drinking from a straw. The participant successfully completes a reach-to-grasp movement to acquire a cup of coffee. He brings the cup to his mouth and takes a drink, and then returns the cup. Video taken 392 days after implant; the participant had practiced this task for six sessions before the video was taken.
Participant performing a self-feeding task for the first time. Using a modified fork (a standard rehabilitation device), the participant successfully moves his arm and hand between a plate of mashed potatoes and his mouth, taking several bites.
FES+iBCI audio-cued single-joint movements done just before the functional task was attempted. Achievable single-joint movements were compared with the FES+iBCI system turned off vs. turned on. Video taken 402 and 421 days after implant.

2017 | Feedback control policies employed by people using intracortical brain–computer interfaces (Willett et al. J Neural Eng)
Real movements from an example block with participant T6 are compared to simulated movements generated with the piecewise feedback control model. The visual appearance of the game matches what T6 observed on the computer monitor.
Analogous to video 1 but for participant T7.
Analogous to video 1 but for participant T8.

2017 | High performance communication by people with paralysis using an intracortical brain-computer interface (Pandarinath & Nuyujukian et al. eLife)
Example of participant T6’s free-paced, free choice typing using the OPTI-II keyboard. T6 was prompted with questions and asked to formulate an answer de novo. Once presented with a question, she was able to think about her answer, move the cursor and click on the play button to enable the keyboard (bottom right corner), and then type her response. In this example, the participant typed 255 characters in ~9 min, at just over 27 correct characters per minute. One of two audible ‘beeps’ followed a target selection, corresponding to the two possible selection methods: T6 could select targets using either the Hidden Markov Model-based ‘click’ selection (high-pitched noises) or by ‘dwelling’ in the target region for 1 s (low-pitched noises). The plot at the bottom of the video tracks the typing performance (correct characters per minute) with respect to time in the block. Performance was smoothed using a 30 s symmetric Hamming window. The scrolling yellow bar indicates the current time of that frame. During the free typing task, T6 was asked to suppress her hand movements as best as possible. (During the quantitative performance evaluations, T6 was free to make movements as she wished.) This video is from participant T6, Day 621, Block 17.
Example of participant T6’s ‘copy typing’ using the OPTI-II keyboard. In the copy typing task, participants were presented with a phrase and asked to type as many characters as possible within a two-minute block. T6 preferred that the cursor remain under her control throughout the task – i.e., no re-centering of the cursor occurred after a selection. This video is from participant T6, Day 588, Blockset 2. Performance in this block was 40.4 ccpm.
Example of participant T6’s ‘copy typing’ using the QWERTY keyboard. Same as Video 2, but using the QWERTY keyboard layout. This video is from participant T6, Day 588, Blockset 4. Performance in this block was 30.6 ccpm.
Example of participant T5’s ‘copy typing’ using the OPTI-II keyboard. Same as Video 2, but for participant T5. This video is from participant T5, Day 68, Blockset 4. Performance in this block was 40.5 ccpm.
Example of participant T5’s ‘copy typing’ using the QWERTY keyboard. Same as Video 4, but using the QWERTY keyboard layout. This video is from participant T5, Day 68, Blockset 2. Performance in this block was 38.6 ccpm.
Example of participant T7’s ‘copy typing’ using the OPTI-II keyboard. Same as Video 2, but for participant T7. T7 selected letters by dwelling on targets only. In addition, T7 preferred that the cursor re-center after every selection (i.e., following a correct or an incorrect selection). These across-participant differences are detailed in Materials and methods: Quantitative performance evaluations (under ‘Target selection and cursor re-centering’). This video is from participant T7, Day 539, Blockset 3. Performance in this block was 10.6 ccpm.
Example of participant T7’s ‘copy typing’ using the ABCDEF keyboard. Same as Video 6, but using the ABCDEF keyboard layout. This video is from participant T7, Day 539, Blockset 1. Performance in this block was 16.5 ccpm.
Example of participant T6’s performance in the grid task. This video is from participant T6, Day 588, Blockset 3. Performance in this block was 2.65 bps.
Example of participant T5’s performance in the grid task. This video is from participant T5, Day 56, Blockset 4 (Block 28). Performance in this block was 4.01 bps.
Example of participant T5’s performance in the dense grid task (9 × 9). This video is from participant T5, Day 56, Blockset 4 (Block 30). Performance in this block was 4.36 bps.
Example of participant T7’s performance in the grid task. This video is from participant T7, Day 539, Blockset 2. Performance in this block was 1.57 bps.
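The bits-per-second figures in the grid-task captions above are “achieved bitrate,” which in this line of work is typically computed as log2(N − 1) bits per net correct selection (correct minus incorrect selections, floored at zero) divided by elapsed time. A minimal sketch of that computation; the grid size and selection counts below are hypothetical, chosen only for illustration:

```python
import math

def achieved_bitrate(n_targets, correct, incorrect, seconds):
    """Achieved bitrate (bits/s) for a grid selection task:
    log2(N - 1) bits per net correct selection, floored at zero."""
    net = max(correct - incorrect, 0)
    return math.log2(n_targets - 1) * net / seconds

# Hypothetical numbers: a 6x6 grid (36 targets) over a 2-minute block.
bps = achieved_bitrate(36, 100, 4, 120.0)
print(round(bps, 2))
```

Flooring at zero ensures that a block with more errors than correct selections scores 0 bits/s rather than a negative rate.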


2015 | Neural Point-and-Click Communication by a Person With Incomplete Locked-In Syndrome (Bacher et al. Neurorehabil Neural Repair)
Participant S3, Trial Day 1976, typing “the quick fox” under neural point-and-click control. This phrase was part of the sentence “the quick fox is lazy now”, which she also previously typed as part of a copy-spelling task.
Participant S3, Trial Day 1976, typing “keep hope” in response to the question “do you have anything you would like to say to the world?” as part of her final research session in the BrainGate clinical trial.

2015 | Clinical translation of a high-performance neural prosthesis (Gilja et al. Nat Med)
Composite video of representative Radial-8 and mFitts1 task trials from participant T6. For mFitts1, the cursor jumps to the center of the previous target at the start of each trial. For Radial-8, the cursor jumps only if the previous target was not acquired.
Composite video of representative Radial-8 and mFitts1 task trials from participant T7. For mFitts1, the cursor jumps to the center of the previous target at the start of each trial. For Radial-8, the cursor jumps only if the previous target was not acquired.
A free-pace free-choice typing task with the Dasher keyboard interface. See Supplementary Figure 5 for a summary of text written during this control session with participant T6.

2015 | Neural population dynamics in human motor cortex during movements in people with ALS (Pandarinath et al. eLife)
Neural population responses show rotational activity. Video shows the evolution of the neural state over time in the first jPCA plane for participants T6 and T7. Low-dimensional projections were calculated as in Figure 1. Each colored trace represents one of 8 conditions. All times are relative to target onset (0 ms).

2012 | Reach and grasp by people with tetraplegia using a neurally controlled robotic arm (Hochberg et al. Nature)
Neuronal ensemble control of the DLR robot arm and hand for three-dimensional reach and grasp by a woman with tetraplegia (S3), trial day 1959 (April 12, 2011). Two minutes of continuous video shows the participant using the BrainGate system to control three-dimensional movements and hand grasp. She was instructed to grasp the target. In this video, which represents some of her best neural control of the DLR arm, six targets were presented in sequence. She successfully grasped the target on trials 1, 3, 4, and 6, but only touched the target (which counted as a target acquisition, but not a grasp) on trials 2 and 5. The researcher in the background releases control of the system at the beginning of each block and is positioned to monitor the participant and robot arm. A small LED, located at the base of the DLR arm, was lit to indicate the brief periods when neural control of the limb was suspended. During this period, which occurred after each trial, the hand endpoint was positioned by the computer precisely at the software-anticipated target location, which then became the next trial’s start position (a method utilized to improve the collection of target path metrics). For clarity, a yellow dot (added to the original video) appears in the lower right corner of the screen whenever the small LED is lit; the dot is green at all other times, indicating full neural control of the limb.
Neuronal ensemble control of the DEKA prosthetic arm and hand by a woman with tetraplegia (S3), trial day 1974 (April 27, 2011). Two minutes and 54 seconds of continuous video showing the participant using the BrainGate system to control three-dimensional movements and hand grasp. In this video, which represents some of the best control displayed of the DEKA arm, eight targets are presented in sequence that the participant was instructed to grasp. She successfully grasped the target on all trials except trial 4, in which she successfully touched but did not grasp the target. The LED is lit to indicate the periods where either (a) neural control of the DEKA arm is suspended, as occurred after each trial, or (b) a grasp state command was decoded and 3D movement of the arm was briefly suspended during the grasping motion. The third trial demonstrates an instance in which she successfully acquired the target, but the system software did not register this correct acquisition because the actual target location was different than the computer’s estimate of its location. Therefore, a new target was not presented until the timeout was reached. This trial was nevertheless scored during video review as a successful grasp. A yellow dot (added to the original video) appears in the lower right corner of the screen whenever the small LED is lit; the dot is green at all other times, indicating full neural control of the limb.
Neuronal ensemble control of the DEKA prosthetic arm and hand by a gentleman with tetraplegia (T2), trial day 166 (November 22, 2011). Three minutes and 51 seconds of continuous video shows the participant using the BrainGate system to control three-dimensional movements and hand grasp. In this video, which is representative of his control of the DEKA arm, eight targets are presented in sequence that the participant was instructed to grasp. He successfully grasped the target on all trials except for trials 5 and 6, in which he successfully touched but did not grasp the target. The LED is lit to indicate the periods where either (a) neural control of the DEKA arm is suspended, as occurred after each trial, or (b) a grasp state command was decoded and 3D movement of the arm was briefly suspended during the grasping motion. A yellow dot (added to the original video) appears in the lower right corner of the screen whenever the small LED is lit; the dot is green at all other times, indicating full neural control of the limb.
BrainGate-enabled use of an assistive robot by S3 to drink a beverage using neurally-controlled 2-D movement and hand state control of the DLR robot arm, trial day 1959 (April 12, 2011). The video begins with the first successful reach, grasp, drink, and replace trial. Neural control of the movement of the DLR arm is enabled only within the plane of the table. After the participant successfully grasps the bottle under neural control (state command), it is raised directly upward off the table under pre-programmed computer control. 2D neural control, parallel to the tabletop plane, is then resumed. If a grasp command is issued when the arm is in a small subset of the workspace immediately near the participant’s mouth, the wrist pronates to allow her to sip from the straw (her usual method of drinking, as she does not have adequate motor control of her mouth to drink directly from a glass). After drinking the coffee, she issues another ‘grasp’ state command, which supinates the wrist to return the bottle to an upright position, at which point 2D neural control is resumed. When she has positioned the hand back over the table to the desired location, she issues a final grasp command, which lowers the bottle, releases the hand, and then withdraws the arm. After the first successful trial, there were two aborted trials (one due to a technical error by a researcher not preparing the hand to initiate a grasp in response to a proper command, the other due to the potential for pushing the bottle off the table, not shown); this was followed by the second and third successful trials, which occurred in succession. On the third trial, a researcher placed his hand near the bottle out of concern that it might be pushed off the table, but in fact the participant successfully grasps the bottle and then drinks from it. This was followed by a trial aborted again due to the potential for pushing the bottle off the table (not shown), and then a fourth successful trial.
The yellow dot in the lower right corner indicates times when the participant issued a grasp command; the dot remains yellow until 2D control is returned, which was dependent upon the phase of the task. 2D control was returned automatically after the bottle was picked up or placed back down on the table; 2D control was also returned if a grasp command was issued when the participant’s prior command was to supinate the hand after having just pronated it to take a drink.

2011 | Single-neuron dynamics in human focal epilepsy (Truccolo et al. Nat Neurosci)
Spiking activity on the microelectrode array (subject A, seizure 1). The movie shows the spiking rate of one single unit per electrode in the microelectrode array as a function of time. The largest unit recorded in each electrode was selected. Seizure onset, based on ECoG inspection, is at time zero. Electrodes at the darkest blue locations did not record activity that could be sorted into single units. Spiking rates are shown in spikes per second and were estimated based on 100-ms time bins. The ECoG at four locations is shown below. Location of electrodes OccS2 and GR50 are shown in Fig. 1, main text. The other two are nearby electrodes.

2011 | Neural control of cursor trajectory and click by a human with tetraplegia 1000 days after implant of an intracortical microelectrode array (Simeral et al. J Neural Eng)
One round red target at a time was presented on the computer screen and the participant moved the neural cursor to the target and performed a neural click to select it.

2008 | Primary Motor Cortex Tuning to Intended Movement Kinematics in Humans with Tetraplegia (Truccolo et al. J Neurosci)
Participants were asked to track a computer cursor moving on a monitor as if they were controlling its position by intending to move their dominant arm or hand, similar to controlling a cursor with a handheld mouse (videos 1 & 3). The information from this task was used to decode movement intention from neural activity in the next task, in which participants were instructed to imagine moving a circle-shaped cursor displayed on the screen to one of four peripheral targets (videos 2 & 4).

2006 | Neuronal ensemble control of prosthetic devices by a human with tetraplegia (Hochberg et al. Nature)
Center-Out task. The goal of this task is to move the neural cursor to the location of the target (“money bag”, the target preferred by MN). The cursor must be held over the target for 500 ms in order to register a success. Trial day 86. This was the best recorded day of Center-Out performance.
Video showing use of a computer interface with the neural cursor. Opening and closing simulated email using the neural cursor, while reading aloud, and drawing a circle with a neural cursor-enabled “Paint” program. The neural cursor is first used to open two simulated email messages. Selection is made only by passing the cursor over the icon. This is followed by three attempts to paint a circle with the neural cursor. The email and initial two circle-drawing attempts are continuous video. Approximately 8 minutes later, after several other tasks, the third circle-drawing attempt was made. Trial day 98. Mail icon selection was typical of his performance over the several days this task was provided. He was able to complete a loop on each day, from the first day requested; the third circle shown here was the most symmetric drawn.
Neurally-controlled television. MN uses the neural cursor to operate a television remote control via a computer interface; commands are sent to the television via an infrared system (Spitfire, Innotech Systems, Port Jefferson, NY). This is the first day that MN was shown this TV controller. Trial day 86.
Neural “Pong”. Continuous control of the one-dimensional “neural paddle” is shown. Trial day 70. This was the first day that he was shown the Pong task.
Neural “HeMan” game. The object is to capture the treasure chests while avoiding the square obstacles. Trial day 90. This was the third day that MN played the HeMan game.
Direct neural control of a prosthetic hand. MN was initially instructed to move a neural cursor “up” to open the hand, and “down” to close the hand. While at first he looked at both the video monitor and prosthetic hand, he disregarded the monitor after a few trials and regarded only the hand while manipulating it in real time. He is describing aloud what he intends the hand to do. Trial day 114. This was the first day that MN was shown the prosthetic hand.
Transport of an object from one location to another via direct neural control of a multi-articulated robot arm. MN was asked to grab a piece of candy with the robotic arm, then to place it into the hand of an operator. Neural control of the robot arm is achieved by directing the neural cursor over targets on the screen. Each target directs the activity of one of three independent motors which actuate 1) rotation about the “shoulder” joint; 2) flexion/extension about the “elbow” joint, and 3) opening and closing the “hand”. A piece of candy is placed between the grippers, and MN then closes the hand. The robot arm is then rotated about the shoulder, extended toward the destination, and the candy is released into the operator’s hand. Trial day 209. This was the first and only day that robot arm control was attempted.
Trial Participant #2 performing Center-Out task. The goal of this task is to move the neural cursor to the location of the target (“money bag”). The cursor must dwell over the target for 500 ms in order to register a success. Trial day 190. This appeared to be among the best recorded Center-Out performances for this participant.

Partner Institutions

Brown · Emory University · Harvard Medical School · MGH · Stanford School of Medicine · University of California, Davis · DVA