An Automatic Graph Construction Framework based on Large Language Models for Recommendation

📅 2024-12-24
🤖 AI Summary
Existing graph construction in recommender systems relies either on manual design, oversimplified GNN-based heuristics, or LLM-driven automation that neglects global structural information and suffers from low efficiency. To address these limitations, this paper proposes AutoGraph—a novel framework that pioneers the integration of LLM-powered semantic reasoning with vector-quantized (VQ) latent factor modeling for automated graph construction. AutoGraph employs a meta-path-guided message aggregation mechanism to jointly incorporate semantic and collaborative signals, and supports model-agnostic, GNN-compatible deployment. Extensive experiments on three real-world datasets demonstrate significant improvements over state-of-the-art baselines. Deployed on Huawei’s advertising platform, AutoGraph achieves +2.69% RPM and +7.31% eCPM in online A/B tests, serving hundreds of millions of users.

📝 Abstract
Graph neural networks (GNNs) have emerged as state-of-the-art methods to learn from graph-structured data for recommendation. However, most existing GNN-based recommendation methods focus on the optimization of model structures and learning strategies based on pre-defined graphs, neglecting the importance of the graph construction stage. Earlier works for graph construction usually rely on specific rules or crowdsourcing, which are either too simplistic or too labor-intensive. Recent works start to utilize large language models (LLMs) to automate the graph construction, in view of their abundant open-world knowledge and remarkable reasoning capabilities. Nevertheless, they generally suffer from two limitations: (1) invisibility of global view (e.g., overlooking contextual information) and (2) construction inefficiency. To this end, we introduce AutoGraph, an automatic graph construction framework based on LLMs for recommendation. Specifically, we first use LLMs to infer the user preference and item knowledge, which is encoded as semantic vectors. Next, we employ vector quantization to extract the latent factors from the semantic vectors. The latent factors are then incorporated as extra nodes to link the user/item nodes, resulting in a graph with in-depth global-view semantics. We further design metapath-based message aggregation to effectively aggregate the semantic and collaborative information. The framework is model-agnostic and compatible with different backbone models. Extensive experiments on three real-world datasets demonstrate the efficacy and efficiency of AutoGraph compared to existing baseline methods. We have deployed AutoGraph in Huawei advertising platform, and gain a 2.69% improvement on RPM and a 7.31% improvement on eCPM in the online A/B test. Currently AutoGraph has been used as the main traffic model, serving hundreds of millions of people.
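The core graph-construction step described in the abstract — quantizing LLM-derived semantic vectors into latent factors and adding those factors as extra nodes linking items — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the codebook here is hand-picked, whereas AutoGraph learns it via vector quantization, and the toy 2-D vectors stand in for LLM semantic encodings.

```python
from collections import defaultdict

def quantize(vectors, codebook):
    """Assign each semantic vector to its nearest codebook entry (latent factor)."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(range(len(codebook)), key=lambda k: sqdist(v, codebook[k]))
            for v in vectors]

def build_graph(item_codes):
    """Link each item node to its latent-factor node, added as an extra graph node."""
    factor_to_items = defaultdict(list)
    for item, code in enumerate(item_codes):
        factor_to_items[code].append(item)
    return factor_to_items

# Toy semantic vectors for 4 items (in the paper these come from LLM encodings).
items = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
codebook = [[1.0, 0.0], [0.0, 1.0]]  # 2 hypothetical latent factors

codes = quantize(items, codebook)    # -> [0, 0, 1, 1]
graph = build_graph(codes)
# The metapath item -> factor -> item now connects items 0<->1 and 2<->3,
# giving GNN message passing a global-view semantic shortcut.
print(dict(graph))                   # {0: [0, 1], 1: [2, 3]}
```

Items sharing a latent-factor node become two-hop neighbors even without co-occurrence in interaction data, which is the global-view signal the abstract contrasts with purely local LLM-driven construction.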
Problem

Research questions and friction points this paper is trying to address.

Graph Neural Networks
Recommendation Systems
Large Language Models
Innovation

Methods, ideas, or system contributions that make the work stand out.

AutoGraph
Large Language Model (LLM)
Graph Neural Network (GNN)
Rong Shan
Shanghai Jiao Tong University, Shanghai, China
Jianghao Lin
Shanghai Jiao Tong University
Large Language Models · AI Agents · Recommender Systems
Chenxu Zhu
Huawei Noah’s Ark Lab, Shenzhen, China
Bo Chen
Huawei Noah’s Ark Lab, Shenzhen, China
Menghui Zhu
Huawei Noah’s Ark Lab, Shenzhen, China
Kangning Zhang
Shanghai Jiao Tong University
Data Mining · Robotics Learning
Jieming Zhu
Huawei Noah’s Ark Lab, Shenzhen, China
Ruiming Tang
Huawei Noah’s Ark Lab, Shenzhen, China
Yong Yu
Weinan Zhang
Shanghai Jiao Tong University, Shanghai, China