Authors:
Nikzad Chizari 1; Keywan Tajfar 2 and María N. Moreno-García 1
Affiliations:
1 Department of Computer Science and Automation, University of Salamanca, Plaza de los Caídos s/n, 37008 Salamanca, Spain; 2 College of Science, School of Mathematics, Statistics, and Computer Science, Department of Statistics, University of Tehran, Tehran, Iran
Keyword(s):
Recommender Systems, Bias, Fairness, Graph Neural Networks, Metrics.
Abstract:
Recommender Systems (RS) have become a central tool for providing personalized suggestions, yet the growing complexity of modern methods, such as Graph Neural Networks (GNNs), has introduced new challenges related to bias and fairness. While these methods excel at capturing intricate relationships between users and items, they often amplify biases present in the data, leading to discriminatory outcomes, especially against groups defined by protected demographic attributes such as gender and age. This study evaluates and measures fairness in GNN-based RS by investigating the extent of unfairness towards various groups and subgroups within these systems. By employing performance metrics such as NDCG, this research highlights disparities in recommendation quality across demographic groups, emphasizing the importance of accurate, group-level measurement. This analysis not only sheds light on how these biases manifest but also lays the groundwork for developing more equitable recommendation systems that ensure fair treatment across all user groups.
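The group-level NDCG measurement mentioned in the abstract can be illustrated with a minimal sketch. The Python code below is not the authors' implementation; it assumes binary relevance judgments, a fixed cutoff k, and a single protected attribute (the "gender" labels and user IDs are purely hypothetical), and simply averages per-user NDCG@k within each demographic group to expose quality gaps between groups.

```python
# Minimal sketch (not the paper's code) of group-level NDCG@k measurement.
# All data below is synthetic; attribute values such as "female"/"male" are illustrative.
import numpy as np

def dcg_at_k(relevances, k):
    """Discounted cumulative gain over the top-k ranked relevances."""
    rel = np.asarray(relevances, dtype=float)[:k]
    if rel.size == 0:
        return 0.0
    discounts = np.log2(np.arange(2, rel.size + 2))
    return float(np.sum(rel / discounts))

def ndcg_at_k(relevances, k):
    """NDCG@k: DCG of the model's ranking normalized by the ideal DCG."""
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

def group_ndcg(per_user_relevances, user_groups, k=10):
    """Average NDCG@k per demographic group (e.g. gender or age bucket)."""
    scores = {}
    for user, rels in per_user_relevances.items():
        scores.setdefault(user_groups[user], []).append(ndcg_at_k(rels, k))
    return {group: float(np.mean(vals)) for group, vals in scores.items()}

if __name__ == "__main__":
    # Relevance of each recommended item, in the order the recommender ranked them.
    recs = {"u1": [1, 0, 1, 0], "u2": [0, 0, 1, 1], "u3": [1, 1, 0, 0]}
    groups = {"u1": "female", "u2": "male", "u3": "female"}  # protected attribute
    per_group = group_ndcg(recs, groups, k=4)
    print(per_group)
    # Gap between the best- and worst-served group as a simple unfairness signal.
    print("NDCG gap:", max(per_group.values()) - min(per_group.values()))
```

A large gap between per-group averages in such a sketch is one way to surface the disparities in recommendation quality that the study examines.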