ht: 87.6% accuracy, 112 ms latency
With Sunglasses: 79.4% accuracy, 97 ms latency
5.2 User Experience Findings
• 78% of participants reported improved mood regulation.
• 63% preferred the system over static playlists.
• Average session duration increased by 22 minutes.
6 CONCLUSIONS
Our emotion-aware music system demonstrates a significant impact on
human-computer interaction, combining advanced AI techniques with music,
a universal language. By detecting emotion in real time and responding
with carefully curated playlists, the system delivers a truly tailored
listening experience that adapts to the user's mood. Positive participant
feedback, with users reporting feelings of being understood and engaged,
confirms that when technology is designed with empathy and precision, it
can enhance emotional well-being. This project shows that machine
learning can be more than cold calculation; it can create warm, human
experiences.
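The emotion-to-playlist step described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the emotion labels, playlist names, and fallback rule are all assumptions made for the example.

```python
# Hypothetical mapping from detected emotion labels to playlist buckets.
# Labels and playlist names are illustrative, not the paper's actual data.
EMOTION_PLAYLISTS = {
    "happy": ["upbeat_pop", "dance_hits"],
    "sad": ["soft_acoustic", "comfort_classics"],
    "angry": ["calming_ambient"],
    "neutral": ["daily_mix"],
}

def select_playlist(emotion_probs: dict) -> str:
    """Pick a playlist for the most probable detected emotion."""
    dominant = max(emotion_probs, key=emotion_probs.get)
    # Fall back to the neutral bucket for any unmapped emotion label.
    candidates = EMOTION_PLAYLISTS.get(dominant, EMOTION_PLAYLISTS["neutral"])
    return candidates[0]

print(select_playlist({"happy": 0.7, "sad": 0.2, "neutral": 0.1}))
# prints: upbeat_pop
```

In a real deployment the per-frame emotion probabilities would come from the CNN classifier, and the selection would likely be smoothed over a window of frames rather than decided per frame.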
Excitingly, these applications are only the tip of the iceberg. From
therapeutic clinics to smart homes, emotion-aware systems could transform
our relationships with the technology that surrounds us every day.
Challenges such as inconsistent lighting and computational burden remain,
but this paper helps build a foundation for a future where technology
does not merely serve us but actually comprehends us. As the system
continues to improve, our vision remains simple: AI that plays the right
music at the right time, making each listener feel acknowledged, valued,
and heard.
Looking ahead, the goal is to make our real-time emotion detection system
more perceptive. We plan to integrate facial cues with voice tone for
higher accuracy, such as identifying sarcasm or suppressed emotions.
Optimizing the model for edge devices will make processing faster and
more privacy-aware, since no data is sent to the cloud. We are also
placing a strong emphasis on inclusivity by training on a rich set of
data from around the world to lessen cultural biases in emotion
interpretation. Interactive features, such as real-time feedback during
video calls, will help users improve their emotional expressiveness.
Finally, we are exploring AR overlays to gamify therapy sessions and
adaptive learning so the AI evolves with each user's unique expressions
over time.
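One common way to combine facial and vocal cues, as envisioned above, is late fusion: each modality produces its own probability distribution over emotions, and the two are merged with a weighted average. The sketch below assumes this approach and illustrative weights; the paper does not specify a fusion method, so treat every name and number here as a placeholder.

```python
from typing import Dict

def fuse_emotions(face: Dict[str, float], voice: Dict[str, float],
                  w_face: float = 0.6) -> str:
    """Return the emotion label with the highest weighted combined score.

    Hypothetical late fusion: weight the face-based probabilities by
    w_face and the voice-based probabilities by (1 - w_face).
    """
    labels = set(face) | set(voice)
    fused = {
        label: w_face * face.get(label, 0.0)
               + (1.0 - w_face) * voice.get(label, 0.0)
        for label in labels
    }
    return max(fused, key=fused.get)

# A confident voice cue can tip an ambiguous face reading:
# happy = 0.6*0.5 + 0.4*0.9 = 0.66 vs neutral = 0.6*0.5 + 0.4*0.1 = 0.34.
print(fuse_emotions({"neutral": 0.5, "happy": 0.5},
                    {"happy": 0.9, "neutral": 0.1}))
# prints: happy
```

This is exactly the kind of mechanism that could surface sarcasm or suppressed emotion: a neutral face paired with a strongly valenced voice shifts the fused prediction away from what either modality alone would report.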