Authors:
Nikzad Chizari ¹; Keywan Tajfar ²; Niloufar Shoeibi ¹ and María N. Moreno-García ¹
Affiliations:
¹ Department of Computer Science and Automation, Science Faculty, University of Salamanca, Plaza de los Caídos sn, 37008 Salamanca, Spain
² College of Science, School of Mathematics, Statistics, and Computer Science, Department of Statistics, University of Tehran, Tehran, Iran
Keyword(s):
Recommender Systems, Bias, Fairness, Graph-Based Neural Networks.
Abstract:
The wide acceptance of Recommender Systems (RS) among users for product and service suggestions has led to the proposal of multiple recommendation methods that have contributed to solving the problems presented by these systems. However, attention to bias problems has been much more limited. Some of the most successful and recent methods, such as Graph Neural Networks (GNNs), present problems of bias amplification and unfairness that need to be detected, measured, and addressed. In this study, an analysis of RS fairness is conducted, focusing on measuring unfairness toward protected groups, including gender and age. We quantify fairness disparities within these groups and evaluate recommendation quality for item lists using a metric based on Normalized Discounted Cumulative Gain (NDCG). Most bias assessment metrics in the literature are only valid for the rating prediction approach, whereas RS usually provide recommendations in the form of item lists. The list-based metric enhances the understanding of fairness dynamics in GNN-based RS, providing a more comprehensive perspective on the quality and equity of recommendations among different user groups.
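To illustrate the kind of list-based evaluation the abstract describes, the sketch below computes NDCG@k per user and then compares mean scores across protected groups (e.g., gender or age buckets). This is a minimal illustration, not the authors' exact metric: the function names, the dictionary-based inputs, and the use of the max-min gap as a disparity summary are assumptions for the example.

```python
import math

def dcg(relevances):
    # Discounted cumulative gain: items at later ranks contribute less.
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg_at_k(recommended, relevant, k=10):
    # recommended: ranked list of item ids; relevant: dict item id -> graded relevance.
    gains = [relevant.get(item, 0.0) for item in recommended[:k]]
    ideal = sorted(relevant.values(), reverse=True)[:k]
    idcg = dcg(ideal)
    return dcg(gains) / idcg if idcg > 0 else 0.0

def group_ndcg_gap(user_recs, user_truth, user_group, k=10):
    # Mean NDCG@k per protected group, plus the absolute gap between
    # the best- and worst-served groups as a simple unfairness summary.
    totals, counts = {}, {}
    for user, recs in user_recs.items():
        g = user_group[user]
        score = ndcg_at_k(recs, user_truth[user], k)
        totals[g] = totals.get(g, 0.0) + score
        counts[g] = counts.get(g, 0) + 1
    means = {g: totals[g] / counts[g] for g in totals}
    gap = max(means.values()) - min(means.values())
    return means, gap
```

A per-group gap near zero would indicate that the GNN-based recommender serves the protected groups with comparable ranking quality; a large gap flags the kind of disparity the study aims to measure.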