Injecting Knowledge Graphs into Large Language Models

πŸ“… 2025-05-12
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address the weak symbolic reasoning capability of large language models (LLMs) stemming from insufficient structured knowledge support, this paper proposes KG-Injectβ€”a lightweight integration method that directly injects knowledge graph embeddings (KGEs) as learnable tokens into the LLM’s input layer. KG-Inject achieves, for the first time, deep fusion between KGEs and LLM input encoding without modifying model parameters or requiring fine-tuning, thereby preserving knowledge structure fidelity while maintaining computational efficiency. Its model-agnostic design enables plug-and-play compatibility with arbitrary open- or closed-source LLMs. Extensive experiments on both synthetic and real-world datasets demonstrate that KG-Inject significantly improves logical reasoning accuracy, outperforming state-of-the-art knowledge-augmented methods in both overall precision and inference efficiency.

Technology Category

Application Category

πŸ“ Abstract
Integrating structured knowledge from Knowledge Graphs (KGs) into Large Language Models (LLMs) remains a key challenge for symbolic reasoning. Existing methods mainly rely on prompt engineering or fine-tuning, which lose structural fidelity or incur high computational costs. Building on recent encoding techniques which integrate graph embeddings within the LLM input as tokens, we extend this paradigm to the KG domain by leveraging Knowledge Graph Embedding (KGE) models, thus enabling graph-aware reasoning. Our approach is model-agnostic, resource-efficient, and compatible with any LLMs. Extensive experimentation on synthetic and real-world datasets shows that our method improves reasoning performance over established baselines, further achieving the best trade-off in terms of accuracy and efficiency against state-of-the-art LLMs.
Problem

Research questions and friction points this paper is trying to address.

Integrating Knowledge Graphs into LLMs for symbolic reasoning
Overcoming structural fidelity loss and high computational costs
Enhancing reasoning performance with graph-aware, resource-efficient methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates Knowledge Graph Embeddings as LLM tokens
Model-agnostic and resource-efficient KG-LLM fusion
Enhances reasoning accuracy with graph-aware inputs
πŸ”Ž Similar Papers
No similar papers found.