M1M2: Deep-Learning-Based Real-Time Emotion Recognition from Neural Activity

dc.contributor.author: Akter, Sumya
dc.contributor.author: Prodhan, Rumman Ahmed
dc.contributor.author: Pias, Tanmoy Sarkar
dc.contributor.author: Eisenberg, David
dc.contributor.author: Fresneda Fernandez, Jorge
dc.date.accessioned: 2022-11-10T18:44:15Z
dc.date.available: 2022-11-10T18:44:15Z
dc.date.issued: 2022-11-03
dc.date.updated: 2022-11-10T14:27:54Z
dc.description.abstract: Emotion recognition, or the ability of computers to interpret people's emotional states, is a very active research area with vast applications to improve people's lives. However, most image-based emotion recognition techniques are flawed, as humans can intentionally hide their emotions by changing their facial expressions. Consequently, brain signals are being used to detect human emotions with improved accuracy, but most proposed systems perform poorly because EEG signals are difficult to classify with standard machine learning and deep learning techniques. This paper proposes two convolutional neural network (CNN) models (M1, a heavily parameterized CNN, and M2, a lightly parameterized CNN) coupled with effective feature extraction methods. In this study, the most popular EEG benchmark dataset, DEAP, is used with two of its labels, valence and arousal, for binary classification. We use the Fast Fourier Transform to extract frequency-domain features, convolutional layers for deep features, and complementary features to represent the dataset. The M1 and M2 CNN models achieve nearly perfect accuracies of 99.89% and 99.22%, respectively, outperforming every previous state-of-the-art model. We empirically demonstrate that the M2 model requires only 2 seconds of EEG signal to reach 99.22% accuracy, and that it achieves over 96% accuracy on valence classification with only 125 milliseconds of EEG data. Moreover, the proposed M2 model achieves 96.8% accuracy on valence using only 10% of the training dataset, demonstrating the effectiveness of the proposed system. Documented implementation code for every experiment is published for reproducibility.
dc.description.version: Published version
dc.format.mimetype: application/pdf
dc.identifier.citation: Akter, S.; Prodhan, R.A.; Pias, T.S.; Eisenberg, D.; Fresneda Fernandez, J. M1M2: Deep-Learning-Based Real-Time Emotion Recognition from Neural Activity. Sensors 2022, 22, 8467.
dc.identifier.doi: https://doi.org/10.3390/s22218467
dc.identifier.uri: http://hdl.handle.net/10919/112565
dc.language.iso: en
dc.publisher: MDPI
dc.rights: Creative Commons Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: emotion recognition
dc.subject: CNN
dc.subject: EEG
dc.subject: machine learning
dc.subject: sensor
dc.subject: deep learning
dc.title: M1M2: Deep-Learning-Based Real-Time Emotion Recognition from Neural Activity
dc.title.serial: Sensors
dc.type: Article - Refereed
dc.type.dcmitype: Text
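The abstract above describes a pipeline of FFT-based frequency-domain features fed to a CNN. The authors publish their own documented implementation; the following is only a minimal Python sketch of that general approach, assuming DEAP's preprocessed 128 Hz, 32-channel EEG, a 2-second window as in the M2 experiments, and illustrative frequency-band boundaries and layer sizes that are not taken from the paper.

# Hypothetical sketch only: FFT band features from one EEG window, then a
# small 1-D CNN for binary valence/arousal classification. Band boundaries
# and architecture are illustrative assumptions, not the authors' M1/M2.
import numpy as np
import tensorflow as tf

FS = 128          # DEAP preprocessed sampling rate (Hz)
N_CHANNELS = 32   # DEAP EEG channels
WINDOW_S = 2      # 2-second window, matching the M2 experiments

def fft_band_features(window: np.ndarray) -> np.ndarray:
    """Mean FFT magnitude per channel in five EEG bands.

    window: (N_CHANNELS, FS * WINDOW_S) raw EEG samples.
    Returns: (N_CHANNELS, 5) array of band features.
    """
    freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / FS)
    mags = np.abs(np.fft.rfft(window, axis=-1))
    bands = [(4, 8), (8, 12), (12, 16), (16, 25), (25, 45)]  # assumed bands
    return np.stack(
        [mags[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) for lo, hi in bands],
        axis=-1,
    )

# Lightly parameterized CNN over the (channels x bands) feature map;
# layer widths are arbitrary placeholders for a sketch.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_CHANNELS, 5)),
    tf.keras.layers.Conv1D(32, 3, activation="relu", padding="same"),
    tf.keras.layers.Conv1D(64, 3, activation="relu", padding="same"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Example: score a random 2-second window (replace with real DEAP data
# and train the model before interpreting the output probability).
x = fft_band_features(np.random.randn(N_CHANNELS, FS * WINDOW_S))
p_high_valence = model.predict(x[np.newaxis, ...])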

Files

Original bundle
Name: sensors-22-08467.pdf
Size: 6.62 MB
Format: Adobe Portable Document Format
Description: Published version
License bundle
Name: license.txt
Size: 0 B
Format: Item-specific license agreed to upon submission