A Brain-Machine Interface Enables Bimanual Arm Movements in Monkeys.pdf


BRAIN-MACHINE INTERFACES

A Brain-Machine Interface Enables Bimanual Arm Movements in Monkeys

    Peter J. Ifft,1,2 Solaiman Shokur,2,3 Zheng Li,2,4 Mikhail A. Lebedev,2,4 Miguel A. L. Nicolelis1,2,4,5,6*

Brain-machine interfaces (BMIs) are artificial systems that aim to restore sensation and movement to paralyzed patients. So far, BMIs have enabled only one arm to be moved at a time. Control of bimanual arm movements remains a major challenge. We have developed and tested a bimanual BMI that enables rhesus monkeys to control two avatar arms simultaneously. The bimanual BMI was based on the extracellular activity of 374 to 497 neurons recorded from several frontal and parietal cortical areas of both cerebral hemispheres. Cortical activity was transformed into movements of the two arms with a decoding algorithm called a fifth-order unscented Kalman filter (UKF). The UKF was trained either during a manual task performed with two joysticks or by having the monkeys passively observe the movements of avatar arms. Most cortical neurons changed their modulation patterns when both arms were engaged simultaneously. Representing the two arms jointly in a single UKF decoder resulted in improved decoding performance compared with using separate decoders for each arm. As the animals' performance in bimanual BMI control improved over time, we observed widespread plasticity in frontal and parietal cortical areas. Neuronal representation of the avatar and reach targets was enhanced with learning, whereas pairwise correlations between neurons initially increased and then decreased. These results suggest that cortical networks may assimilate the two avatar arms through BMI control. These findings should help in the design of more sophisticated BMIs capable of enabling bimanual motor control in human patients.

    INTRODUCTION

The complexity and variety of motor behaviors in humans and other primates is greatly augmented by the remarkable ability of the central nervous system to control bimanual movements (1). Yet, this functionality has not been enacted by previous brain-machine interfaces (BMIs). BMIs are hybrid systems that directly connect brain tissue to machines to restore motor and sensory functions to paralyzed individuals (2). The advancement of BMIs has been driven by two fundamental goals: investigation of the physiological principles that guide

the operation of large neural ensembles (3) and the development of neuroprosthetic devices that could restore limb movements and sensation to paralyzed patients (4, 5). Previous BMIs mimicked only single-arm control and were represented by either a computer cursor (6–11), a robot (8, 12, 13), or an avatar arm (14), but did not enable simultaneous bimanual control of arm movements.

Studies in nonhuman primates have shown that the brain does not encode bimanual movements simply by superimposing two independent single-limb representations (15, 16). Cortical regions, such as the supplementary motor area (SMA) (17–19) and the primary motor cortex (M1) (16, 18, 19), exhibit specific patterns of activity during bimanual movements. These complex neuronal representations pose a major challenge for BMIs enabling bimanual control because they cannot be designed simply by combining single-limb modules. Here, we tested the ability of a bimanual BMI to enable rhesus monkeys to control two avatar arms simultaneously.

    RESULTS

Large-scale recordings and experimental paradigms

We set out to discover whether large-scale cortical recordings could provide sufficient neuronal signals to accurately control a bimanual BMI (4, 20). We implanted volumetric multielectrode arrays in two monkeys (768 microelectrodes in monkey C; 384 microelectrodes in monkey M) (Fig. 1A) as described previously (20). Neural signals were sorted using template matching algorithms within commercially available software (Plexon Inc.). In monkey C, we simultaneously sampled (Fig. 1, C, E, and F) from the SMA (73 to 110 units in the left hemisphere, 0 to 20 units in the right hemisphere; ranges for all experiments), M1 (176 to 218 units in the left hemisphere, 45 to 62 units in the right hemisphere), primary somatosensory cortex (S1) (9 to 64 units in the left hemisphere, 0 to 34 units in the right hemisphere), and posterior parietal cortex (PPC) (0 to 4 units in the left hemisphere, 22 to 47 units in the right hemisphere). In monkey M, we sampled from M1 (80 to 90 units in the left hemisphere, 195 to 204 units in the right hemisphere) and S1 (47 to 56 units in the left hemisphere, 127 to 149 units in the right hemisphere). The daily unit count neared 500 for each monkey, which constitutes the highest number of simultaneously recorded units in nonhuman primates to date (21). The high unit count for monkey M persisted for 48 months after the implantation surgery, and for monkey C persisted for at least 18 months after the surgery (recordings are still continuing in these two animals).

Using this large-scale BMI, both monkeys were able to directly control the simultaneous reaching movements performed by two avatar arms. Moreover, these monkeys learned to operate the bimanual BMI without producing overt movements of their own arms. A realistic, virtual monkey avatar (fig. S1A) was chosen as the actuator for the bimanual BMI because in previous studies (14, 22) and experiments in this study (Fig. 2), we observed that monkeys readily engaged in both active control and passive observation of the avatar movements. Each

1Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA. 2Center for Neuroengineering, Duke University, Durham, NC 27708, USA. 3Laboratoire de Systèmes Robotiques, École Polytechnique Fédérale de Lausanne, CH-1015 Lausanne, Switzerland. 4Department of Neurobiology, Duke University, Durham, NC 27708, USA. 5Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA. 6Edmond and Lily Safra International Institute of Neuroscience of Natal, Natal, Rio Grande do Norte, 59055-060, Brazil.
*Corresponding author. E-mail: [email protected]

RESEARCH ARTICLE

    www.ScienceTranslationalMedicine.org 6 November 2013 Vol 5 Issue 210 210ra154


monkey observed two avatar arms on a computer monitor from a first-person perspective (Fig. 1B). A trial began with the appearance on the screen of two square targets. Their position was the same in all trials, and they served as the start positions for the avatar hands. The monkey had to place the avatar hands over their respective targets and hold these positions for a delay, randomly drawn from a uniform distribution (400- to 1000-ms intervals, Fig. 1D). The two squares were then replaced by two circular targets in 1 of 16 possible configurations (right, left, up, or down relative to the start position for each hand). At this point, the monkey had to place both avatar hands over the targets indicated by the two circles and hold the targets simultaneously for a minimum of 100 ms to receive a fruit juice reward. In the unimanual version of the task, a single avatar arm had to reach for a single target.

The tasks were performed in three possible ways: joystick control, brain control with arm movements (BC with arms), and brain control without arm movements (BC without arms). Both monkeys learned to perform BC without arms, but through different learning sequences. Monkey C began with joystick control, during which the right and left avatar arms were controlled directly by movements of the two joysticks (Fig. 1F) (18). Monkey C then learned BC with arms, during which movements of the avatar arms were controlled directly by cortical activity, although the monkey was permitted to continue manipulating the joysticks. Finally, monkey C learned BC without arms, a mode of operation where decoded brain activity once again controlled avatar arm movements, but now overt limb movements were prevented by gently restraining both arms. Monkey M did not use the joystick in any task. Rather, this monkey's task training began by having it passively observe the avatar arms moving on the screen as an initial step before learning BC without arms. This type of BMI training has clinical relevance for paralyzed subjects who cannot produce any overt movements, and it has been used in several human studies (13, 23).

To set up BC with arms for monkey C, we followed our previously established routine (8, 10) of training the BMI decoder on joystick control data to extract arm kinematics from cortical activity. Daily sessions dedicated solely to joystick control lasted 20 to 40 min. Brain control sessions began with 5 to 7 min of the joystick control task, before switching to BC with arms for the final 20 to 40 min. Despite more complexities regarding independent control of two virtual limbs, the decoding accuracy for our bimanual BMI was sufficient for online control (movie S3) and matched the accuracy previously reported for less challenging unimanual BMIs (7, 8, 10, 24, 25).
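As a rough illustration of this decoding step, the sketch below fits a plain linear Kalman filter that maps binned firing rates to four hand coordinates. It is a simplified stand-in, not the authors' fifth-order unscented Kalman filter, which additionally models movement history and nonlinear neuronal tuning; the class name, state layout, and the synthetic usage are illustrative assumptions.

```python
import numpy as np

# Hypothetical minimal linear Kalman filter decoder: binned firing
# rates -> four hand coordinates [XL, YL, XR, YR]. Simplified
# stand-in for the paper's fifth-order unscented Kalman filter.
class KalmanDecoder:
    def fit(self, X, Z):
        """Fit model matrices from training data.
        X: (T, 4) kinematics; Z: (T, n_units) binned firing rates."""
        # State transition x_t = A x_{t-1} + w, with w ~ N(0, W)
        X0, X1 = X[:-1], X[1:]
        self.A = np.linalg.lstsq(X0, X1, rcond=None)[0].T
        self.W = np.cov((X1 - X0 @ self.A.T).T)
        # Observation model z_t = H x_t + q, with q ~ N(0, Q)
        self.H = np.linalg.lstsq(X, Z, rcond=None)[0].T
        self.Q = np.cov((Z - X @ self.H.T).T)
        return self

    def decode(self, Z):
        """Run the predict/update recursion over binned rates Z."""
        n = self.A.shape[0]
        x, P = np.zeros(n), np.eye(n)
        out = []
        for z in Z:
            # Predict the next state from the kinematic model
            x = self.A @ x
            P = self.A @ P @ self.A.T + self.W
            # Update the prediction with the neural observation
            S = self.H @ P @ self.H.T + self.Q
            K = P @ self.H.T @ np.linalg.inv(S)
            x = x + K @ (z - self.H @ x)
            P = (np.eye(n) - K @ self.H) @ P
            out.append(x.copy())
        return np.array(out)
```

Note the design choice this mirrors: both arms share one state vector, so neurons tuned to either or both limbs contribute to all four coordinates, which is how the paper's joint decoder can outperform two independent single-arm decoders.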

Bimanual joystick control

Monkey C was trained to perform both unimanual and bimanual joystick control tasks very accurately (greater than 97% of the trials were correct) (fig. S1, B to E, and movies S1 and S2). Cortical recordings collected from this monkey revealed widespread neuronal modulations that reflected

Fig. 1. Large-scale electrode implants and behavioral tasks. (A) Monkey C (left) and monkey M (right) were chronically implanted with eight and four 96-channel arrays, respectively. (B) The monkey is seated in front of a screen showing two virtual arms and uses either joystick movements or modulations in neural activity to control the avatar arms. (C) Four hundred forty-one sample waveforms from typical monkey C recording sessions, with the color of the waveform indicating the recording site [shown in (A)]. (D) Left to right: Trial sequence began with both hands holding a center target for a random interval. Next, two peripheral targets appeared, which had to be reached to and held with the respective hands to receive a juice reward. (E and F) Raster plot of spike events from 438 neurons (y axis) over time (x axis) for a single unimanual (E) and bimanual (F) trial. Target location and position traces of the trial are indicated to the right of the raster panel.


movement timing and direction (Figs. 1, E and F, and 3 to 5). Consistent with previous studies (15, 16, 18), cortical activity from multiple areas was different between unimanual and bimanual movements (Figs. 3 and 4). In motor areas M1 (Fig. 3, A, B, E, and F) and SMA (Fig. 3, C, D, G, and H), individual units (Fig. 3, A, C, E, and G) and neuronal populations (Fig. 3, B, D, F, and H) alike exhibited directionally selective modulations during both unimanual and bimanual performance.

For each configuration of the pair of targets, we characterized neuronal modulations as Δz, the difference between the movement epoch (from 150 to 600 ms after target appearance) firing rate and baseline rate, both expressed in normalized units (z scores). Normalization to z scores was applied to each unit's firing rate before any grouping or averaging of individual trials. Average modulation for all target positions was quantified as the absolute value of Δz averaged over all target positions (|Δz|). Directional selectivity was measured as the SD of Δz for different target positions, σ(Δz).
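In code, these two per-unit metrics (Δz per target configuration, then |Δz| and σ(Δz) across configurations) can be sketched as follows; the binning, slice arguments, and function names are illustrative assumptions, not taken from the authors' analysis pipeline.

```python
import numpy as np

def delta_z(rates, baseline, move):
    """Delta-z for one unit and one target configuration.
    rates: (n_trials, n_bins) binned firing rates; baseline and move
    are slice objects selecting the pre-target bins and the
    movement-epoch bins (150-600 ms after target appearance)."""
    # z-score the unit's firing rate before any averaging of trials
    z = (rates - rates.mean()) / rates.std()
    # movement-epoch rate minus baseline rate, in z-score units
    return z[:, move].mean() - z[:, baseline].mean()

def tuning_summary(dz_per_target):
    """Summarize Delta-z across target configurations:
    |Delta z|      -> overall modulation (mean absolute Delta-z),
    sigma(Delta z) -> directional selectivity (SD across targets)."""
    dz = np.asarray(dz_per_target, dtype=float)
    return np.abs(dz).mean(), dz.std()
```

For example, a unit whose Δz values are large but similar across the 16 target configurations would score high on |Δz| yet low on σ(Δz), i.e., strongly modulated but not directionally selective.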

The transition from unimanual to bimanual movements (table S1) induced several effects in M1 and SMA. First, we observed a prominent increase in |Δz| during bimanual movements by 76.7 and 34.6% for left M1 and right M1, respectively, and 35.8 and 37.9% for left and right SMA (P < 0.01, t test). M1 neurons exhibited a clear preference for the contralateral rather than the ipsilateral arm during unimanual performance, both in terms of overall modulations (28.3% increase in |Δz| for the contralateral versus ipsilateral arm; P < 0.01) and in terms of tuning depth [22.3% increase in σ(Δz)]. An opposite, ipsilateral preference was observed for SMA [19.1% decrease in |Δz| and 11.1% decrease in σ(Δz); P < 0.01]. For both M1 and SMA, directional tuning depth during the bimanual task was about equal for the left and right arm [left σ(Δz): 0.08; right σ(Δz): 0.09; P > 0.01]. Notably, SMA was the only area where more neurons were tuned to both arms after a transition from unimanual to bimanual movements (P < 0.01) (Fig. 4A). In addition to changes in overall modulations and directional tuning depth, bimanual control

Fig. 2. Comparison of bimanual behavioral training with cursor and avatar actuators. (A and B) Two 2D cursors (A) or two avatar arms (B) were controlled by joystick movements. In both environments, the target for each hand was a white circle. (C and D) Percentage of total trials containing a threshold amount of movements (avatar arm reached beyond 80% of distance from center to target), with the left arm (C) or right arm (D) shown in lower panels. The first 10 sessions of avatar and cursor bimanual training, conducted on alternate days, are shown separately in blue and red. *P