
Decoding the dynamic representation of facial expressions of emotion in explicit and incidental tasks

Research output: Contribution to journal › Article

Open Access permissions: Open

Abstract

Faces transmit a wealth of important social signals. While previous studies have elucidated the network of cortical regions important for the perception of facial expression, and the associated temporal components such as the P100, N170 and EPN, it is still unclear how task constraints may shape the representation of facial expression (or other face categories) in these networks. In the present experiment, we used Multivariate Pattern Analysis (MVPA) with EEG to investigate the neural information available across time about two important face categories (expression and identity) when those categories are perceived under either explicit task contexts (e.g. decoding facial expression category from the EEG when the task is on expression) or incidental task contexts (e.g. decoding facial expression category from the EEG when the task is on identity). Decoding of both face categories, across both task contexts, peaked in time-windows spanning 91-170 ms (across posterior electrodes). Peak decoding of expression, however, was not affected by task context, whereas peak decoding of identity was significantly reduced under incidental processing conditions. In addition, errors in EEG decoding correlated with errors in behavioral categorization under explicit processing for both expression and identity, but under incidental processing only for expression. Furthermore, decoding time-courses and the spatial pattern of informative electrodes showed consistently better decoding of identity under explicit conditions at later time periods, with weak evidence for similar effects for decoding of expression at isolated time-windows. Taken together, these results reveal differences and commonalities in the processing of face categories under explicit vs. incidental task contexts and suggest that facial expressions are processed to a richer degree even under incidental processing conditions, consistent with prior work indicating the relative automaticity with which emotion is processed. Our work further demonstrates the utility of applying multivariate decoding analyses to EEG for revealing the dynamics of face perception.
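The time-resolved decoding approach described in the abstract can be illustrated with a minimal sketch. This is not the authors' analysis code: it uses simulated trial data and a simple cross-validated nearest-class-mean classifier applied in a sliding window over time, whereas the study worked with real EEG epochs and its own classifier and window choices. All array sizes, the injected effect window, and the classifier are illustrative assumptions.

```python
# Minimal sketch of time-resolved MVPA decoding (simulated data, NOT the paper's pipeline).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dataset: 80 trials x 32 channels x 100 time samples,
# two stimulus categories (e.g. two facial-expression categories).
n_trials, n_channels, n_times = 80, 32, 100
labels = np.repeat([0, 1], n_trials // 2)
X = rng.normal(size=(n_trials, n_channels, n_times))
# Inject a class difference in a mid-latency window (illustrative effect only).
X[labels == 1, :, 40:60] += 0.5

def decode_over_time(X, y, win=10, step=5, n_folds=4):
    """Cross-validated nearest-class-mean decoding accuracy per time window."""
    n_trials = X.shape[0]
    folds = np.arange(n_trials) % n_folds  # interleaved fold assignment
    accs = []
    for start in range(0, X.shape[2] - win + 1, step):
        # Flatten channels x window samples into one feature vector per trial.
        feats = X[:, :, start:start + win].reshape(n_trials, -1)
        correct = 0
        for f in range(n_folds):
            train, test = folds != f, folds == f
            # Class means estimated on training trials only.
            means = np.stack([feats[train & (y == c)].mean(axis=0) for c in (0, 1)])
            # Assign each test trial to the nearest class mean.
            d = ((feats[test, None, :] - means[None, :, :]) ** 2).sum(axis=2)
            correct += (d.argmin(axis=1) == y[test]).sum()
        accs.append(correct / n_trials)
    return np.array(accs)

acc = decode_over_time(X, labels)
print(acc.round(2))  # accuracy should peak in windows overlapping samples 40-60
```

Plotting such an accuracy time-course against chance level is what yields the peak-decoding windows reported in the abstract; the study additionally compared these curves across explicit and incidental task contexts.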

Details

Original language: English
Pages (from-to): 261-271
Number of pages: 11
Journal: NeuroImage
Volume: 195
Early online date: 30 Mar 2019
Publication status: Published - 15 Jul 2019
Peer-reviewed: Yes

Keywords

Research areas

• EEG, Multi-variate pattern analysis, Emotion, Vision, Categorization


ID: 153771028