The image features of emotional faces that predict the initial eye movement to a face

Bibliographic Details
Main Authors: S. M. Stuit, T. M. Kootstra, D. Terburg, C. van den Boomen, M. J. van der Smagt, J. L. Kenemans, S. Van der Stigchel
Format: Article
Language: English
Published: Nature Publishing Group 2021-04-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-021-87881-w
Description
Summary: Emotional facial expressions are important visual communication signals that indicate a sender's intent and emotional state to an observer. As such, it is not surprising that reactions to different expressions are thought to be automatic and independent of awareness. What is surprising is that studies show inconsistent results concerning such automatic reactions, particularly when different face stimuli are used. We argue that automatic reactions to facial expressions can be better explained, and better understood, in terms of quantitative descriptions of their low-level image features rather than in terms of the emotional content (e.g., angry) of the expressions. Here, we focused on overall spatial frequency (SF) and localized Histograms of Oriented Gradients (HOG) features. We used machine-learning classification to reveal the SF and HOG features that are sufficient for classifying the initial eye movement towards one of two simultaneously presented faces. Interestingly, the identified features serve as better predictors than the emotional content of the expressions. We therefore propose that our modelling approach can further specify which visual features drive these and other behavioural effects related to emotional expressions, which can help resolve the inconsistencies found in this line of research.
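
The pipeline outlined in the summary (quantify each face with overall SF and localized HOG descriptors, then train a classifier to predict which of two faces draws the first saccade) can be illustrated in a few lines of Python. The sketch below is a minimal, hypothetical rendering of that idea: the feature parameters, the linear support-vector classifier, and the variable names (faces_left, faces_right, first_saccade) are assumptions made for illustration, not the authors' published implementation.

    # Minimal sketch (assumed implementation): SF + HOG features feeding
    # a linear classifier that predicts the side of the first saccade.
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    def spatial_frequency_features(image, n_bins=16):
        # Radially averaged amplitude spectrum: a compact summary of the
        # face's overall spatial frequency (SF) content.
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
        cy, cx = np.array(spectrum.shape) // 2
        y, x = np.indices(spectrum.shape)
        radius = np.hypot(y - cy, x - cx).astype(int)
        bands = np.bincount(radius.ravel(), weights=spectrum.ravel())
        counts = np.bincount(radius.ravel())
        profile = bands / np.maximum(counts, 1)
        # Resample the radial profile to a fixed number of SF bins.
        idx = np.linspace(0, len(profile) - 1, n_bins).astype(int)
        return profile[idx]

    def face_features(image):
        # Localized HOG descriptors plus the SF profile for one face.
        h = hog(image, orientations=8, pixels_per_cell=(16, 16),
                cells_per_block=(1, 1))
        return np.concatenate([spatial_frequency_features(image), h])

    def fit_saccade_classifier(faces_left, faces_right, first_saccade):
        # faces_left / faces_right: grayscale face images shown on each
        # side of fixation; first_saccade: 0 if the initial eye movement
        # went to the left face, 1 if to the right (hypothetical data).
        X = np.array([np.concatenate([face_features(l), face_features(r)])
                      for l, r in zip(faces_left, faces_right)])
        clf = LinearSVC()
        accuracy = cross_val_score(clf, X, first_saccade, cv=5).mean()
        return clf.fit(X, first_saccade), accuracy

In this framing, cross-validated classification accuracy measures how well the image features alone, without any emotion labels, predict the initial eye movement, which is the comparison the summary describes.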
ISSN: 2045-2322