to allow more precise emotion classification from
EEG signals.
Traditional machine learning models such as SVM
and Random Forest have long formed the basis for
EEG-based classification, but their limited ability to
capture intricate temporal patterns has motivated a
shift toward deep models. LSTMs, Transformer
models, and CNNs have been shown to extract more
meaningful features and to achieve higher
classification accuracy. However, these models
demand large datasets and substantial computational
power, which makes them impractical for real-time
use.
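As a minimal illustrative sketch (not drawn from any of the reviewed works), the following PyTorch model shows the kind of compact CNN + LSTM architecture often applied to windowed EEG classification; the channel count (32), window length (128 samples, i.e. 1 s at 128 Hz), and four-class output are assumed placeholders.

import torch
import torch.nn as nn

class CnnLstmEEG(nn.Module):
    def __init__(self, n_channels=32, n_classes=4, hidden=64):
        super().__init__()
        # Temporal convolution over each window; EEG channels enter as Conv1d input planes
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # LSTM summarises the downsampled feature sequence over time
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, channels, samples)
        feats = self.conv(x)              # (batch, 64, samples // 4)
        feats = feats.transpose(1, 2)     # (batch, time, 64) for the LSTM
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])         # class logits

# Example: 8 windows of 32-channel EEG, 128 samples each
logits = CnnLstmEEG()(torch.randn(8, 32, 128))
print(logits.shape)                       # torch.Size([8, 4])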
Despite these advances, EEG-based emotion
decoding still faces significant challenges: a low
signal-to-noise ratio, large inter-subject variability,
and the lack of a standard EEG dataset. Cross-dataset
adaptation techniques, multimodal fusion, and hybrid
deep networks can help address these challenges.
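To make the inter-subject variability concrete, the short scikit-learn sketch below uses synthetic placeholder features and labels (not data from the cited studies) to run a leave-one-subject-out evaluation, the protocol commonly used to expose how performance drops when a classifier must generalize to an unseen subject.

import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 160))      # placeholder features (e.g. band power per channel)
y = rng.integers(0, 4, size=300)         # placeholder labels for 4 emotion classes
subjects = np.repeat(np.arange(10), 30)  # 10 subjects, 30 trials each

# Each fold holds out every trial of one subject, mimicking a new, unseen user
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, groups=subjects, cv=LeaveOneGroupOut())
print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f} across held-out subjects")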
Future research should be directed toward real-time
BCI deployment, in which lightweight, highly
efficient AI models run on portable and wearable
EEG devices. In addition, multimodal paradigms that
combine EEG with facial expressions, voice, and
peripheral signals such as galvanic skin response
(GSR) can boost emotion recognition accuracy.
Domain adaptation and transfer learning will play a
crucial role in building models that generalize well
across different EEG datasets and recording
conditions.
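As one possible illustration of the multimodal direction (an assumed setup, not a method taken from the cited studies), the sketch below fuses EEG features with a low-dimensional GSR feature vector at the feature level, projecting each modality separately before a shared classifier head; all dimensions are placeholders.

import torch
import torch.nn as nn

class FeatureFusionClassifier(nn.Module):
    def __init__(self, eeg_dim=160, gsr_dim=8, n_classes=4):
        super().__init__()
        # Project each modality separately, then classify on the concatenation
        self.eeg_proj = nn.Sequential(nn.Linear(eeg_dim, 64), nn.ReLU())
        self.gsr_proj = nn.Sequential(nn.Linear(gsr_dim, 16), nn.ReLU())
        self.head = nn.Linear(64 + 16, n_classes)

    def forward(self, eeg_feats, gsr_feats):
        fused = torch.cat([self.eeg_proj(eeg_feats), self.gsr_proj(gsr_feats)], dim=-1)
        return self.head(fused)

# Example: EEG band-power features (32 channels x 5 bands) plus 8 GSR statistics
model = FeatureFusionClassifier()
logits = model(torch.randn(4, 160), torch.randn(4, 8))
print(logits.shape)                       # torch.Size([4, 4])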
In summary, although EEG-based emotion
recognition is progressing, practical deployment in
real-world scenarios remains a persistent challenge.
Standardized datasets, refined deep learning models,
and real-time inference optimizations are key to
moving the field forward and to establishing EEG-
based emotion decoding as a practical approach for
affective computing and BCI applications.