Multi-Domain Feature Fusion for Emotion Classification Using DEAP Dataset

Real-time emotion recognition using electroencephalography (EEG) signals plays a key role in human-computer interaction and affective computing. Existing emotion recognition models, which use stimuli such as music and pictures in controlled lab settings and cover a limited number of emotion classes, have low ecological validity. Moreover, identifying significant EEG features and electrodes is important for effective emotion recognition. In our proposed model, we use the DEAP dataset, consisting of physiological signals collected from 32 participants as they watched 40 movie clips (each of 60 seconds). The main objective of this study is to explore multi-domain (time, wavelet, and frequency) features and thereby identify the set of stable features that contribute to emotion classification over a larger number of emotion classes. Our proposed model identifies nine classes of emotions (happy, pleased, relaxed, excited, neutral, calm, distressed, miserable, and depressed) with an average accuracy of 65.92%. To this end, we use a support vector machine as the classifier along with 10-fold and leave-one-out cross-validation. We achieve a significant emotion classification accuracy, which could be vital for developing affective computing solutions that deal with a larger number of emotional states.


Bibliographic Details
Main Authors: Muhammad Khateeb, Syed Muhammad Anwar, Majdi Alnowami
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access
Subjects: Affective computing; electroencephalography; emotion classification; feature extraction; machine learning
Online Access: https://ieeexplore.ieee.org/document/9321314/
id doaj-fad2b975e61845778ef8ca331c68b150
record_format Article
spelling doaj-fad2b975e61845778ef8ca331c68b150 2021-04-05T17:36:56Z eng
IEEE Access, vol. 9, pp. 12134-12142, 2021-01-01. ISSN 2169-3536. DOI: 10.1109/ACCESS.2021.3051281 (IEEE document 9321314).
Multi-Domain Feature Fusion for Emotion Classification Using DEAP Dataset
Muhammad Khateeb, Department of Software Engineering, University of Engineering and Technology, Taxila, Pakistan
Syed Muhammad Anwar (https://orcid.org/0000-0002-8179-3959), Department of Software Engineering, University of Engineering and Technology, Taxila, Pakistan
Majdi Alnowami, Department of Nuclear Engineering, King Abdulaziz University, Jeddah, Saudi Arabia
Online access: https://ieeexplore.ieee.org/document/9321314/
Keywords: Affective computing; electroencephalography; emotion classification; feature extraction; machine learning
collection DOAJ
language English
format Article
sources DOAJ
author Muhammad Khateeb
Syed Muhammad Anwar
Majdi Alnowami
publisher IEEE
series IEEE Access
issn 2169-3536
publishDate 2021-01-01
description Real-time emotion recognition using electroencephalography (EEG) signals plays a key role in human-computer interaction and affective computing. Existing emotion recognition models, which use stimuli such as music and pictures in controlled lab settings and cover a limited number of emotion classes, have low ecological validity. Moreover, identifying significant EEG features and electrodes is important for effective emotion recognition. In our proposed model, we use the DEAP dataset, consisting of physiological signals collected from 32 participants as they watched 40 movie clips (each of 60 seconds). The main objective of this study is to explore multi-domain (time, wavelet, and frequency) features and thereby identify the set of stable features that contribute to emotion classification over a larger number of emotion classes. Our proposed model identifies nine classes of emotions (happy, pleased, relaxed, excited, neutral, calm, distressed, miserable, and depressed) with an average accuracy of 65.92%. To this end, we use a support vector machine as the classifier along with 10-fold and leave-one-out cross-validation. We achieve a significant emotion classification accuracy, which could be vital for developing affective computing solutions that deal with a larger number of emotional states.
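The multi-domain feature fusion the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the specific statistics, EEG band edges, Haar decomposition depth, and all function names here are illustrative assumptions, using only NumPy.

```python
import numpy as np

def time_features(x):
    # Time domain: simple statistical descriptors of the raw signal.
    return np.array([x.mean(), x.std(), np.abs(np.diff(x)).mean()])

def band_powers(x, fs, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    # Frequency domain: power in the classic theta/alpha/beta/gamma bands via FFT.
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

def haar_energies(x, levels=4):
    # Wavelet domain: detail-coefficient energies from a plain Haar decomposition.
    energies, approx = [], x.astype(float)
    for _ in range(levels):
        if approx.size % 2:                       # pad to even length if needed
            approx = np.append(approx, 0.0)
        detail = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        approx = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        energies.append(np.sum(detail ** 2))
    return np.array(energies)

def multi_domain_features(epoch, fs):
    # Fuse the three domains into a single per-channel feature vector.
    return np.concatenate([time_features(epoch),
                           band_powers(epoch, fs),
                           haar_energies(epoch)])

fs = 128                                          # DEAP EEG is downsampled to 128 Hz
t = np.arange(0, 1.0, 1.0 / fs)
epoch = np.sin(2 * np.pi * 10 * t)                # synthetic 10 Hz "alpha" oscillation
fv = multi_domain_features(epoch, fs)             # 3 + 4 + 4 = 11 features
```

In the pipeline the paper outlines, such vectors (computed per channel and concatenated across selected electrodes) would then be fed to a support vector machine, e.g. scikit-learn's `SVC` evaluated with 10-fold and leave-one-out cross-validation.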
topic Affective computing
electroencephalography
emotion classification
feature extraction
machine learning
url https://ieeexplore.ieee.org/document/9321314/