Multi-Domain Feature Fusion for Emotion Classification Using DEAP Dataset

Bibliographic Details
Main Authors: Muhammad Khateeb, Syed Muhammad Anwar, Majdi Alnowami
Format: Article
Language: English
Published: IEEE 2021-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9321314/
Description
Summary: Emotion recognition in real time using electroencephalography (EEG) signals plays a key role in human-computer interaction and affective computing. Existing emotion recognition models, which use stimuli such as music and pictures in controlled lab settings and cover only a limited number of emotion classes, have low ecological validity. Moreover, identifying significant EEG features and electrodes is important for effective emotion recognition. In our proposed model, we use the DEAP dataset, consisting of physiological signals collected from 32 participants as they watched 40 movie clips (each of 60 seconds). The main objective of this study is to explore multi-domain (time, wavelet, and frequency) features and hence identify a set of stable features that contribute towards emotion classification covering a larger number of emotion classes. Our proposed model is able to identify nine classes of emotion, namely happy, pleased, relaxed, excited, neutral, calm, distressed, miserable, and depressed, with an average accuracy of 65.92%. Towards this end, we use a support vector machine as the classifier along with 10-fold and leave-one-out cross-validation. We achieve a significant emotion classification accuracy, which could be vital for developing affective computing solutions that deal with a larger number of emotional states.
ISSN: 2169-3536
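
For readers who want to see how the pipeline summarized above fits together, the following is a minimal Python sketch of multi-domain (time, wavelet, and frequency) feature extraction followed by SVM classification with 10-fold cross-validation. It is not the authors' implementation: random data stands in for the DEAP recordings, and the specific feature definitions, the db4 wavelet, the EEG band limits, and the SVM settings are illustrative assumptions.

# Sketch of the multi-domain feature + SVM pipeline described in the abstract.
# Assumptions: synthetic data replaces the DEAP epochs; feature choices,
# wavelet, band limits, and SVM parameters are illustrative, not the authors'.
import numpy as np
import pywt                                    # PyWavelets, wavelet-domain features
from scipy.signal import welch                 # frequency-domain (PSD) features
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

FS = 128  # DEAP preprocessed signals are downsampled to 128 Hz

def time_features(x):
    # Simple statistical descriptors of the raw epoch (time domain).
    return [x.mean(), x.std(), np.ptp(x), np.mean(np.abs(np.diff(x)))]

def wavelet_features(x, wavelet="db4", level=4):
    # Energy of each sub-band from a discrete wavelet decomposition.
    coeffs = pywt.wavedec(x, wavelet, level=level)
    return [np.sum(c ** 2) for c in coeffs]

def freq_features(x, fs=FS):
    # Band power in the classical EEG bands, estimated from a Welch PSD.
    f, psd = welch(x, fs=fs, nperseg=2 * fs)
    bands = [(4, 8), (8, 13), (13, 30), (30, 45)]   # theta, alpha, beta, gamma
    return [np.trapz(psd[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
            for lo, hi in bands]

def epoch_features(epoch):
    # epoch: (n_channels, n_samples) -> concatenated multi-domain feature vector
    feats = []
    for ch in epoch:
        feats += time_features(ch) + wavelet_features(ch) + freq_features(ch)
    return np.asarray(feats)

# Stand-in data: 180 epochs of 32 channels, 60 s at 128 Hz, 9 emotion labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((180, 32, 60 * FS))
labels = rng.integers(0, 9, size=180)

X = np.array([epoch_features(e) for e in epochs])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, labels, cv=cv)
print(f"10-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

On real DEAP data, the same cross_val_score call with cv=LeaveOneOut() would reproduce the second validation scheme mentioned in the summary; the scaler-plus-SVM pipeline ensures the per-domain features, which differ widely in scale, are standardized inside each fold.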