Bridging External and Parametric Knowledge: Mitigating Hallucination of LLMs with Shared-Private Semantic Synergy in Dual-Stream Knowledge

📅 2025-06-06
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the problem of exacerbated hallucinations and degraded performance in retrieval-augmented generation (RAG) caused by conflicts between externally retrieved knowledge and the LLM’s parametric knowledge, this paper proposes DSSP-RAG—a dual-stream semantic synergy framework. It introduces a novel hybrid attention mechanism that decouples shared and private semantics via a shared-private decomposition; incorporates unsupervised hallucination detection through cognitive uncertainty modeling; and designs an Energy Quotient (EQ) metric—computed from attention difference matrices—to quantify and suppress noise in external knowledge. The method is fully end-to-end trainable and requires no human annotations. Evaluated on multiple benchmarks, DSSP-RAG significantly outperforms strong baselines: average hallucination rate decreases by 32.7%, factual consistency improves by 28.4%, and knowledge conflict is effectively mitigated, enhancing both generation accuracy and stability.

Technology Category

Application Category

📝 Abstract
Retrieval-augmented generation (RAG) is a cost-effective approach to mitigate the hallucination of Large Language Models (LLMs) by incorporating the retrieved external knowledge into the generation process. However, external knowledge may conflict with the parametric knowledge of LLMs. Furthermore, current LLMs lack inherent mechanisms for resolving such knowledge conflicts, making traditional RAG methods suffer from degraded performance and stability. Thus, we propose a Dual-Stream Knowledge-Augmented Framework for Shared-Private Semantic Synergy (DSSP-RAG). Central to the framework is a novel approach that refines self-attention into a mixed-attention, distinguishing shared and private semantics for a controlled internal-external knowledge integration. To effectively facilitate DSSP in RAG, we further introduce an unsupervised hallucination detection method based on cognitive uncertainty, ensuring the necessity of introducing knowledge, and an Energy Quotient (EQ) based on attention difference matrices to reduce noise in the retrieved external knowledge. Extensive experiments on benchmark datasets show that DSSP-RAG can effectively resolve conflicts and enhance the complementarity of dual-stream knowledge, leading to superior performance over strong baselines.
Problem

Research questions and friction points this paper is trying to address.

Mitigating LLM hallucination via external-parametric knowledge synergy
Resolving conflicts between external and parametric LLM knowledge
Enhancing RAG performance with dual-stream semantic integration
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dual-Stream Knowledge-Augmented Framework for semantic synergy
Mixed-attention mechanism distinguishing shared-private semantics
Unsupervised hallucination detection using cognitive uncertainty
🔎 Similar Papers
No similar papers found.