LLM4SBR: A Lightweight and Effective Framework for Integrating Large Language Models in Session-based Recommendation

📅 2024-02-21
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
To address data sparsity, semantic ambiguity, and poor interpretability in session-based recommendation (SBR), this paper proposes LLM4SBR—a lightweight two-stage framework. In the first stage, anonymous short sessions are modeled as dual-modal representations: textual item descriptions and sequential behavioral interactions. In the second stage, large language model (LLM)-driven multi-perspective reasoning and cross-modal alignment jointly enhance semantic understanding and decision interpretability. Key contributions include: (1) the first LLM-integrated SBR architecture designed specifically for industrial deployment; and (2) a novel dual-modal session modeling paradigm coupled with a multi-perspective semantic alignment mechanism. Extensive experiments on two real-world datasets demonstrate that LLM4SBR achieves an average 18.7% improvement in Recall@20, reduces inference latency by 62%, and cuts memory footprint by 53%, thereby delivering superior efficiency, strong generalization capability, and enhanced interpretability.

Technology Category

Application Category

📝 Abstract
Traditional session-based recommendation (SBR) utilizes session behavior sequences from anonymous users for recommendation. Although this strategy is highly efficient, it sacrifices the inherent semantic information of the items, making it difficult for the model to understand the true intent of the session and resulting in a lack of interpretability in the recommended results. Recently, large language models (LLMs) have flourished across various domains, offering a glimpse of hope in addressing the aforementioned challenges. Inspired by the impact of LLMs, research exploring the integration of LLMs with the Recommender system (RS) has surged like mushrooms after rain. However, constrained by high time and space costs, as well as the brief and anonymous nature of session data, the first LLM recommendation framework suitable for industrial deployment has yet to emerge in the field of SBR. To address the aforementioned challenges, we have proposed the LLM Integration Framework for SBR (LLM4SBR). Serving as a lightweight and plug-and-play framework, LLM4SBR adopts a two-step strategy. Firstly, we transform session data into a bimodal form of text and behavior. In the first step, leveraging the inferential capabilities of LLMs, we conduct inference on session text data from different perspectives and design the component for auxiliary enhancement. In the second step, the SBR model is trained on behavior data, aligning and averaging two modal session representations from different perspectives. Finally, we fuse session representations from different perspectives and modalities as the ultimate session representation for recommendation. We conducted experiments on two real-world datasets, and the results demonstrate that LLM4SBR significantly improves the performance of traditional SBR models and is highly lightweight and efficient, making it suitable for industrial deployment.
Problem

Research questions and friction points this paper is trying to address.

Addresses sparse session data in recommendation systems
Integrates semantic and behavioral signals for intent capture
Reduces LLM training costs for lightweight SBR frameworks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-view prompts infer latent user intentions
Intent localization module reduces LLM hallucinations
Aligns semantic inferences with behavioral representations
🔎 Similar Papers
No similar papers found.
S
Shutong Qiao
Chongqing University
C
Chen Gao
Tsinghua University
J
Junhao Wen
Chongqing University
W
Wei Zhou
Chongqing University
Q
Qun Luo
Tencent
P
Peixuan Chen
Tencent
Y
Yong Li
Tsinghua University