
formance, achieving an accuracy of 0.902, precision of 0.958, recall of 0.821, an F1 score of 0.884, and an AUC of 0.956. These results demonstrate that eye-based metrics, including fixations, saccades, and pupil diameter, can be sufficient for reliable binary classification, reducing the need for multimodal inputs in this setting. Beyond predictive performance, the findings highlight the feasibility of deploying lightweight gaze-based models in real-time HCI systems. Unlike multimodal approaches, this method offers a focused and interpretable solution based solely on ocular behavior.
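As a quick sanity check, the reported F1 score can be reproduced from the stated precision and recall via the harmonic-mean formula (a minimal sketch; the numeric values are taken from the results reported above):

```python
# Reported metrics for the gaze-only classifier (from the text above).
precision = 0.958
recall = 0.821

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

print(round(f1, 3))  # 0.884, matching the reported F1 score
```

The agreement between the derived and reported F1 confirms the internal consistency of the stated metrics.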
As an exploratory study with nine participants, these findings provide initial evidence of the discriminative power of gaze-only features. Larger and more diverse datasets are needed to confirm generalizability and establish statistical reliability. Future work should also extend the evaluation to subject-independent scenarios, incorporate temporal modeling of gaze dynamics, and explore personalization and cross-task generalization to advance robust and adaptive cognitive monitoring systems.
REFERENCES
Abbad-Andaloussi, A., Sorg, T., and Weber, B. (2022). Estimating developers' cognitive load at a fine-grained level using eye-tracking measures. In Proceedings of the 30th IEEE/ACM International Conference on Program Comprehension, pages 111–121.
Aksu, Ş. H., Çakıt, E., and Dağdeviren, M. (2024). Mental workload assessment using machine learning techniques based on EEG and eye tracking data. Applied Sciences, 14(6):2282.
Chen, J., Zhao, F., Sun, Y., and Yin, Y. (2020a). Improved XGBoost model based on genetic algorithm. International Journal of Computer Applications in Technology, 62(3):240–245.
Chen, S., Webb, G. I., Liu, L., and Ma, X. (2020b). A novel selective naïve Bayes algorithm. Knowledge-Based Systems, 192:105361.
Cinar, A. C. (2020). Training feed-forward multi-layer per-
ceptron artificial neural networks with a tree-seed al-
gorithm. Arabian Journal for Science and Engineer-
ing, 45(12):10915–10938.
Dang, Q., Kucukosmanoglu, M., Anoruo, M., Kargosha, G.,
Conklin, S., and Brooks, J. (2024). Auto detecting
cognitive events using machine learning on pupillary
data. arXiv preprint arXiv:2410.14174.
Das, A. (2024). Logistic regression. In Encyclopedia of
quality of life and well-being research, pages 3985–
3986. Springer.
Ding, L., Terwilliger, J., Parab, A., Wang, M., Fridman, L., Mehler, B., and Reimer, B. (2023). CLERA: A unified model for joint cognitive load and eye region analysis in the wild. ACM Transactions on Computer-Human Interaction, 30(6):1–23.
Duchowski, A. T. (2007). Eye tracking methodology: The-
ory and practice. Springer Science & Business Media.
Ghosh, S., Dhall, A., Hayat, M., Knibbe, J., and Ji, Q.
(2023). Automatic gaze analysis: A survey of deep
learning based approaches. IEEE Transactions on Pat-
tern Analysis and Machine Intelligence, 46(1):61–84.
Gorin, H., Patel, J., Qiu, Q., Merians, A., Adamovich,
S., and Fluet, G. (2024). A review of the use of
gaze and pupil metrics to assess mental workload in
gamified and simulated sensorimotor tasks. Sensors,
24(6):1759.
Holmqvist, K., Nyström, M., et al. (2011). Eye tracking: A comprehensive guide to methods and measures. Oxford University Press.
Khan, M. A., Asadi, H., Qazani, M. R. C., Lim, C. P., and Nahavandi, S. (2024). Functional near-infrared spectroscopy (fNIRS) and eye tracking for cognitive load classification in a driving simulator using deep learning. arXiv preprint arXiv:2408.06349.
Kosch, T., Karolus, J., Zagermann, J., Reiterer, H., Schmidt, A., and Woźniak, P. W. (2023). A survey on measuring cognitive workload in human-computer interaction. ACM Computing Surveys, 55(13s):1–39.
Ktistakis, E., Skaramagkas, V., Manousos, D., Tachos, N. S., Tripoliti, E., Fotiadis, D. I., and Tsiknakis, M. (2022). COLET: A dataset for cognitive workload estimation based on eye-tracking. Computer Methods and Programs in Biomedicine, 224:106989.
Mannaru, P., Balasingam, B., Pattipati, K., Sibley, C., and
Coyne, J. T. (2017). Performance evaluation of the
gazepoint gp3 eye tracking device based on pupil di-
lation. In Augmented Cognition. Neurocognition and
Machine Learning: 11th International Conference,
AC 2017, Held as Part of HCI International 2017,
Vancouver, BC, Canada, July 9-14, 2017, Proceed-
ings, Part I 11, pages 166–175. Springer.
Nasri, M., Kosa, M., Chukoskie, L., Moghaddam, M., and
Harteveld, C. (2024). Exploring eye tracking to de-
tect cognitive load in complex virtual reality training.
In 2024 IEEE International Symposium on Mixed and
Augmented Reality Adjunct (ISMAR-Adjunct), pages
51–54. IEEE.
Pisner, D. A. and Schnyer, D. M. (2020). Support vector
machine. In Machine learning, pages 101–121. Else-
vier.
Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3):372–422.
Skaramagkas, V., Ktistakis, E., Manousos, D., Kazantzaki, E., Tachos, N. S., Tripoliti, E., Fotiadis, D. I., and Tsiknakis, M. (2023). eSEE-d: Emotional state estimation based on eye-tracking dataset. Brain Sciences, 13(4):589.
Skaramagkas, V., Ktistakis, E., Manousos, D., Tachos,
N. S., Kazantzaki, E., Tripoliti, E. E., Fotiadis, D. I.,
and Tsiknakis, M. (2021). Cognitive workload level
estimation based on eye tracking: A machine learning
approach. In 2021 IEEE 21st International Confer-
ence on Bioinformatics and Bioengineering (BIBE),
pages 1–5. IEEE.
Szczepaniak, D., Harvey, M., and Deligianni, F. (2024).
Predictive modelling of cognitive workload in vr: An
eye-tracking approach. In Proceedings of the 2024
Symposium on Eye Tracking Research and Applica-
tions, pages 1–3.
WEBIST 2025 - 21st International Conference on Web Information Systems and Technologies