and without crossing personal boundaries. Ensuring
transparency and user consent will be vital as
emotional AI becomes more integrated into daily life.
6 CONCLUSIONS
Emotion recognition has become a critical aspect of
human-robot interaction, with profound implications
for how robots understand and respond to human
needs. This review has examined the key
advancements in emotion detection technologies,
highlighting the significance of both traditional
methods and more innovative approaches such as
multimodal integration and deep learning techniques.
The use of multimodal data, including facial
expressions, speech, and physiological signals, has
substantially improved the accuracy and reliability of
emotion recognition systems, enabling robots to
adapt in real time and enhance the naturalness of
human-robot interactions.
Despite this progress, significant challenges remain. Real-time processing of complex, multimodal emotional data is still a technical hurdle, particularly in environments that demand rapid adaptation.
Additionally, understanding and responding to
emotions in a way that is culturally sensitive and
contextually appropriate poses ongoing difficulties.
Addressing these challenges will require not only
advances in algorithms and hardware but also a
deeper exploration of the psychological and social
dimensions of emotion in human-robot interaction.
Looking forward, the future of emotionally intelligent robots lies in further refining these
technologies to create more seamless, empathetic
interactions. This includes the development of more
adaptive and personalized emotion recognition
models that can operate across diverse environments
and user groups. Moreover, ethical considerations
must be an integral part of future research, ensuring
that the deployment of emotion-aware robots
promotes positive human experiences without
infringing on privacy or autonomy.