Leveraging Auto-Distillation and Generative Self-Supervised Learning in Residual Graph Transformers for Enhanced Recommender Systems

📅 2025-04-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address insufficient modeling of collaborative patterns, unstable representations, and weak interpretability in recommender systems, this paper proposes a novel framework integrating generative self-supervised learning with a residual graph transformer. Our key contributions are: (1) a rationale-aware generative self-supervised pretraining paradigm that explicitly models users’ decision rationales; (2) a residual graph transformer that jointly captures global topological structure and local interaction stability; and (3) an automatic knowledge distillation mechanism that extracts cross-domain-consistent collaborative logic. Extensive experiments on multiple public benchmarks demonstrate that our method achieves 3.2–5.7% AUC improvements over state-of-the-art baselines. The distilled signals exhibit strong transferability and intrinsic interpretability, offering a new paradigm for trustworthy recommendation.

Technology Category

Application Category

📝 Abstract
This paper introduces a cutting-edge method for enhancing recommender systems through the integration of generative self-supervised learning (SSL) with a Residual Graph Transformer. Our approach emphasizes the importance of superior data enhancement through the use of pertinent pretext tasks, automated through rationale-aware SSL to distill clear ways of how users and items interact. The Residual Graph Transformer incorporates a topology-aware transformer for global context and employs residual connections to improve graph representation learning. Additionally, an auto-distillation process refines self-supervised signals to uncover consistent collaborative rationales. Experimental evaluations on multiple datasets demonstrate that our approach consistently outperforms baseline methods.
Problem

Research questions and friction points this paper is trying to address.

Enhancing recommender systems with generative SSL
Improving graph representation via Residual Graph Transformer
Auto-distillation for consistent collaborative rationales
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generative SSL with Residual Graph Transformer
Auto-distillation for refining self-supervised signals
Topology-aware transformer with residual connections
🔎 Similar Papers
No similar papers found.
E
Eya Mhedhbi
Declic AI Research
Youssef Mourchid
Youssef Mourchid
Research & Associate Professor - CESI LINEACT UR7527
Computer VisionMachine/Deep LearningComplex Networks
A
Alice Othmani
Université Paris-Est, LISSI, UPEC, 94400 Vitry sur Seine, France