Data-driven analysis of facial thermal responses and multimodal physiological consistency among subjects


Bibliographic Details
Main Authors: Saurabh Sonkusare, Michael Breakspear, Tianji Pang, Vinh Thai Nguyen, Sascha Frydman, Christine Cong Guo, Matthew J. Aburn
Format: Article
Language: English
Published: Nature Publishing Group 2021-06-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-021-91578-5
Description
Summary: Facial infra-red imaging (IRI) is a contact-free technique complementing traditional psychophysiological measures to characterize physiological profiles. However, its full potential in affective research is arguably unmet due to the analytical challenges it poses. Here we acquired facial IRI data, facial expressions and traditional physiological recordings (heart rate and skin conductance) from healthy human subjects whilst they viewed a 20-min-long unedited emotional movie. We present a novel application of motion correction and the results of spatial independent component analysis of the thermal data. Three distinct spatial components are recovered, associated with the nose, the cheeks and respiration. We first benchmark this methodology against a traditional nose-tip region-of-interest technique, showing the expected similarity between the signals extracted by the two methods. We then show significant correlation of all the physiological responses across subjects, including the thermal signals, suggesting common dynamic shifts in emotional state induced by the movie. In sum, this study introduces an innovative approach to analysing facial IRI data and highlights the potential of thermal imaging to robustly capture emotion-related changes induced by ecological stimuli.
ISSN: 2045-2322
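
The pipeline summarized above (spatial ICA on motion-corrected thermal frames, benchmarked against a nose-tip region-of-interest signal) can be illustrated with a minimal sketch. The snippet below is not the authors' published code: it assumes a motion-corrected recording already loaded as a NumPy array `frames` of shape (n_frames, height, width), uses scikit-learn's FastICA as a stand-in for the paper's spatial ICA step, and the ROI coordinates are hypothetical.

```python
# Minimal sketch of spatial ICA on facial thermal video, assuming a
# motion-corrected recording `frames` of shape (n_frames, height, width).
# Illustrative reconstruction only, not the authors' published pipeline.
import numpy as np
from sklearn.decomposition import FastICA

def spatial_ica(frames, n_components=3, seed=0):
    """Decompose thermal video into spatially independent components.

    Returns spatial maps of shape (n_components, height, width) and
    their time courses of shape (n_frames, n_components).
    """
    n_frames, h, w = frames.shape
    X = frames.reshape(n_frames, h * w)   # frames as rows, pixels as columns
    X = X - X.mean(axis=0)                # remove each pixel's temporal mean
    ica = FastICA(n_components=n_components, random_state=seed)
    # Spatial ICA: decompose the pixels-by-time matrix, so the spatial
    # maps are the independent sources and the mixing matrix holds the
    # component time courses.
    maps = ica.fit_transform(X.T).T       # (n_components, n_pixels)
    timecourses = ica.mixing_             # (n_frames, n_components)
    return maps.reshape(n_components, h, w), timecourses

def nose_roi_signal(frames, roi):
    """Mean temperature inside a rectangular nose-tip ROI (hypothetical coords)."""
    top, bottom, left, right = roi
    return frames[:, top:bottom, left:right].mean(axis=(1, 2))

# Synthetic data standing in for a real thermal recording.
rng = np.random.default_rng(0)
frames = rng.normal(size=(1200, 64, 48))  # e.g. 20 min sampled at 1 Hz
maps, tcs = spatial_ica(frames)

roi_signal = nose_roi_signal(frames, roi=(30, 40, 18, 30))
# Benchmark step: correlate each component time course with the ROI signal.
for i in range(tcs.shape[1]):
    r = np.corrcoef(tcs[:, i], roi_signal)[0, 1]
    print(f"component {i}: r = {r:+.2f}")
```

With real data, the component whose spatial map covers the nose should correlate strongly with the nose-tip ROI signal, mirroring the benchmark described in the abstract; the inter-subject consistency analysis would then correlate these component time courses, along with heart rate and skin conductance, across subjects.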