Behavioral validation of novel high resolution attention decoding method from multi-units & local field potentials

Bibliographic Details
Main Authors: Carine De Sousa, C. Gaillard, F. Di Bello, S. Ben Hadj Hassen, S. Ben Hamed
Format: Article
Language: English
Published: Elsevier, 2021-05-01
Series: NeuroImage, vol. 231, article 117853 (ISSN 1095-9572)
Subjects: Monkey; Prefrontal cortex; Attention; LFP; Machine learning; Decoding
Online Access: http://www.sciencedirect.com/science/article/pii/S1053811921001300
Author Affiliations: Institut des Sciences Cognitives Marc Jeannerod, CNRS UMR 5229, Université Claude Bernard Lyon I, 67 Boulevard Pinel, 69675 Bron Cedex, France (all authors); corresponding authors: Carine De Sousa and S. Ben Hamed
Source: DOAJ

Abstract: The ability to access brain information in real time is crucial both for a better understanding of cognitive functions and for the development of therapeutic applications based on brain-machine interfaces. Great success has been achieved in the field of neural motor prostheses, but progress is still needed in the real-time decoding of higher-order cognitive processes such as covert attention. Recently, we showed that we can track the location of the attentional spotlight using classification methods applied to prefrontal multi-unit activity (MUA) in non-human primates. Importantly, we demonstrated that the decoded (x,y) attentional spotlight parametrically correlates with the behavior of the monkeys, thus validating our decoding of attention. We also demonstrated that this spotlight is extremely dynamic. Here, in order to get closer to non-invasive decoding applications, we extend our previous work to local field potential (LFP) signals. Specifically, we achieve, for the first time, high decoding accuracy of the (x,y) location of the attentional spotlight from prefrontal LFP signals, to a degree comparable to that achieved from MUA signals, and we show that this LFP content is predictive of behavior. This LFP attention-related information is maximal in the gamma band (30–250 Hz), peaking between 60 and 120 Hz. In addition, we introduce a novel two-step decoding procedure based on the labelling of maximally attention-informative trials during the decoding procedure. This procedure strongly improves the correlation between our real-time MUA- and LFP-based decoding and behavioral performance, thus further refining the functional relevance of this real-time decoding of the (x,y) locus of attention. This improvement is more marked for LFP signals than for MUA signals. Overall, this study demonstrates that the attentional spotlight can be accessed from LFP frequency content in real time and can be used to drive high-information-content cognitive brain-machine interfaces for the development of new therapeutic strategies.
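
A minimal sketch of the kind of analysis described above, assuming Python with NumPy, SciPy and scikit-learn: gamma-band LFP power is extracted per channel and fed to a linear classifier that predicts the cued location, a simplification of the paper's (x,y) decoding. The sampling rate, array shapes, channel count, labels and classifier choice are all illustrative assumptions and the data are synthetic; this is not the authors' pipeline.

import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 1000                # assumed LFP sampling rate (Hz)
GAMMA = (60.0, 120.0)    # sub-band where attention information peaked in the study

def gamma_power(lfp, fs=FS, band=GAMMA):
    """Band-pass each channel in the gamma band and return mean power per trial and channel.

    lfp is assumed to have shape (n_trials, n_channels, n_samples).
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, lfp, axis=-1)
    return (filtered ** 2).mean(axis=-1)      # -> (n_trials, n_channels)

# Synthetic stand-in data: 400 trials, 48 channels, 500-sample epochs, 4 cued locations.
rng = np.random.default_rng(0)
lfp = rng.standard_normal((400, 48, 500))
cued_location = rng.integers(0, 4, size=400)  # stand-in for discretized (x, y) labels

X = np.log(gamma_power(lfp))                  # log power as a simple, common feature choice
clf = SVC(kernel="linear")
acc = cross_val_score(clf, X, cued_location, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")

The paper's two-step refinement, in which maximally attention-informative trials are labelled during the decoding procedure, is not reproduced in this sketch.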