Fair Learning for Bias Mitigation and Quality Optimization in Paper Recommendation

📅 2026-03-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses systemic biases in peer review arising from authors’ demographic attributes—such as race and nationality—that disproportionately disadvantage scholars from underrepresented groups. To mitigate this, the authors propose Fair-PaperRec, a novel model that jointly optimizes academic quality and intersectional fairness through an end-to-end framework, eschewing heuristic approaches. The model employs a multilayer perceptron architecture coupled with a custom-designed fairness-aware loss function to enforce equitable treatment across intersecting identity dimensions. Experimental evaluations on submission data from SIGCHI, DIS, and IUI conferences demonstrate that Fair-PaperRec simultaneously improves overall utility by 3.16% and increases participation from disadvantaged groups by 42.03%, thereby substantiating that enhancing diversity need not compromise scholarly rigor.

📝 Abstract
Despite the widespread use of double-blind review, authors' demographic attributes still disadvantage underrepresented groups. We present Fair-PaperRec, a MultiLayer Perceptron (MLP)-based model that addresses demographic disparities in post-review paper acceptance decisions while maintaining high-quality requirements. In contrast to heuristic approaches, our methodology penalizes demographic disparities while preserving quality, using intersectional criteria (e.g., race, country) and a customized fairness loss. Evaluations on conference data from the ACM Special Interest Group on Computer-Human Interaction (SIGCHI), Designing Interactive Systems (DIS), and Intelligent User Interfaces (IUI) show a 42.03% increase in underrepresented-group participation and a 3.16% improvement in overall utility, indicating that promoting diversity need not compromise academic rigor and supporting equity-focused peer-review solutions.
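The abstract describes a quality objective combined with a customized fairness loss over intersectional groups. The paper's exact formulation is not given here, so the following is only a minimal sketch of that general pattern: a binary cross-entropy quality term plus a demographic-parity gap penalty across intersectional groups, weighted by a trade-off coefficient `lam` (the function name, the parity-gap penalty, and `lam` are illustrative assumptions, not the authors' definitions).

```python
import numpy as np

def fairness_aware_loss(scores, labels, groups, lam=0.5):
    """Sketch of a fairness-aware loss: quality (binary cross-entropy)
    plus a demographic-parity penalty across intersectional groups.

    scores : predicted acceptance probabilities in (0, 1)
    labels : ground-truth accept/reject decisions (0 or 1)
    groups : intersectional group id per paper (e.g., race x country)
    lam    : fairness/quality trade-off weight (assumed, not from the paper)
    """
    eps = 1e-9
    p = np.clip(scores, eps, 1 - eps)
    # Quality term: standard binary cross-entropy against review outcomes.
    quality = -np.mean(labels * np.log(p) + (1 - labels) * np.log(1 - p))
    # Fairness term: gap between the best- and worst-treated groups'
    # mean predicted acceptance rates (a demographic-parity-style penalty).
    group_means = [p[groups == g].mean() for g in np.unique(groups)]
    disparity = max(group_means) - min(group_means)
    return quality + lam * disparity
```

In an end-to-end framework like the one the abstract describes, a differentiable version of this combined objective would be minimized directly by the MLP, rather than applying fairness as a post-hoc heuristic re-ranking step.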
Problem

Research questions and friction points this paper is trying to address.

demographic bias
fairness
paper recommendation
peer review
underrepresented groups
Innovation

Methods, ideas, or system contributions that make the work stand out.

fairness-aware recommendation
demographic bias mitigation
intersectional fairness
customized fairness loss
paper acceptance optimization
Uttamasha Anjally Oyshi
Department of Electrical Engineering & Computer Science, University of Arkansas, Fayetteville, USA
Susan Gauch
Professor, Computer Science and Engineering, University of Arkansas
Information Retrieval · Ontologies · Personalization