Evaluating and Addressing Fairness Across User Groups in Negative Sampling for Recommender Systems

πŸ“… 2023-04-15
πŸ›οΈ Proceedings of the 34th ACM International Conference on Information and Knowledge Management
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
In implicit-feedback recommendation systems, negative sampling is prone to bias toward active users, resulting in insufficiently informative negative samples for inactive users and exacerbating inter-group disparities in recommendation quality. This work identifies a systematic fairness bias in existing negative samplers with respect to user activity levels and proposes the first adaptive negative sampling framework explicitly designed for user-activity grouping. Grounded in empirical statistical modeling, the framework dynamically adjusts the negative sample allocation ratio across activity-defined user groups. Evaluated across eight state-of-the-art negative samplers, our method significantly improves Recall@K for inactive users (average +12.7%) while preserving or slightly enhancing performance for active users. To the best of our knowledge, this is the first approach that jointly optimizes user-side fairness and overall recommendation accuracy.
πŸ“ Abstract
Recommender systems trained on implicit feedback data rely on negative sampling to distinguish positive items from negative items for each user. Since the majority of positive interactions come from a small group of active users, negative samplers are often impacted by data imbalance, leading them to choose more informative negatives for prominent users while providing less useful ones for inactive users. This leads to inactive users being further marginalised in the training process, and thus receiving inferior recommendations. In this paper, we conduct a comprehensive empirical study demonstrating that state-of-the-art negative sampling strategies provide more accurate recommendations for active users than for inactive users. We also find that increasing the number of negative samples for each positive item improves the average performance, but the benefit is distributed unequally across user groups: active users experience a performance gain while inactive users suffer a performance degradation. To address this, we propose a group-specific negative sampling strategy that assigns smaller negative ratios to inactive user groups and larger ratios to active groups. Experiments on eight negative samplers show that our approach improves user-side fairness and performance when compared to a uniform global ratio.
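The abstract's baseline setting can be illustrated with a minimal sketch of uniform random negative sampling for implicit feedback, where a single global ratio of negatives per positive is applied to every user regardless of activity level. All names here (`sample_negatives`, the dict layout) are illustrative assumptions, not the paper's implementation.

```python
import random

def sample_negatives(user_positives, num_items, ratio):
    """Uniformly sample `ratio` negatives per positive interaction.

    user_positives: dict mapping user id -> set of interacted item ids.
    num_items: catalogue size; items are ids 0..num_items-1.
    ratio: negatives drawn per positive (global, user-independent).
    """
    samples = {}
    for user, positives in user_positives.items():
        negatives = []
        for _ in positives:            # one draw round per positive item
            for _ in range(ratio):
                item = random.randrange(num_items)
                while item in positives:  # resample if we hit a positive
                    item = random.randrange(num_items)
                negatives.append(item)
        samples[user] = negatives
    return samples
```

Because the ratio is global, an inactive user with two interactions and an active user with two hundred receive negatives drawn under the same policy, which is exactly the setup the paper finds benefits the two groups unequally.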
Problem

Research questions and friction points this paper is trying to address.

Recommender systems face fairness issues in negative sampling due to data imbalance
Inactive users receive inferior recommendations compared to active users
Current negative sampling strategies distribute performance benefits unequally across user groups
Innovation

Methods, ideas, or system contributions that make the work stand out.

Group-specific negative sampling strategy for fairness
Assigns smaller negative ratios to inactive users
Larger negative ratios for active user groups
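The group-specific strategy above can be sketched as a mapping from a user's activity level to a per-group negative ratio: inactive groups get the smallest ratio, the most active group the largest. The thresholds and ratio values below are placeholder assumptions for illustration, not values reported in the paper.

```python
def group_ratio(interaction_count, thresholds=(10, 100), ratios=(1, 4, 8)):
    """Map a user's activity level to a group-specific negative ratio.

    Users below the first threshold are treated as 'inactive' and get the
    smallest ratio; users above every threshold get the largest. The
    concrete thresholds/ratios here are hypothetical.
    """
    for threshold, ratio in zip(thresholds, ratios):
        if interaction_count < threshold:
            return ratio
    return ratios[-1]  # most active group
```

Plugging `group_ratio` into a sampler in place of a single global ratio realises the paper's core idea: the number of negatives per positive becomes a function of the user's activity group rather than a dataset-wide constant.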
Yueqing Xuan
RMIT University
Fairness, Recommender Systems
Kacper Sokol
ETH ZΓΌrich
Interpretability, Explainability, Transparency, Machine Learning, Artificial Intelligence
M. Sanderson
RMIT University, Melbourne, Victoria, Australia
Jeffrey Chan
RMIT University, Melbourne, Victoria, Australia