🤖 AI Summary
To address weak domain generalization and sparse annotations in biomedical named entity recognition (NER), this paper proposes an end-to-end framework integrating knowledge graph (KG) distillation. Our method introduces a novel two-stage mechanism: “KG distillation + entity-aware enhancement.” First, a two-stage knowledge distillation process injects structured KG embeddings into a lightweight graph neural network (GNN), enabling knowledge-guided fine-grained relational modeling. Second, we design a context-enhanced entity encoder to improve entity classification and ambiguity resolution. Evaluated on multiple biomedical NER benchmarks, our approach achieves state-of-the-art performance, outperforming advanced fine-tuned models and large language models by 3.2–5.8 F1 points. It demonstrates significantly improved robustness and generalization under low-resource and cross-domain settings.
📝 Abstract
Named Entity Recognition (NER) is a fundamental task in Natural Language Processing (NLP) that plays a crucial role in information extraction, question answering, and knowledge-based systems. Traditional deep learning-based NER models often struggle with domain-specific generalization and suffer from data sparsity issues. In this work, we introduce Knowledge Graph distilled for Named Entity Recognition (KoGNER), a novel approach that integrates Knowledge Graph (KG) distillation into NER models to enhance entity recognition performance. Our framework leverages structured knowledge representations from KGs to enrich contextual embeddings, thereby improving entity classification and reducing ambiguity in entity detection. KoGNER employs a two-step process: (1) Knowledge Distillation, where external knowledge sources are distilled into a lightweight representation for seamless integration with NER models, and (2) Entity-Aware Augmentation, which integrates contextual embeddings that have been enriched with knowledge graph information directly into GNN, thereby improving the model's ability to understand and represent entity relationships. Experimental results on benchmark datasets demonstrate that KoGNER achieves state-of-the-art performance, outperforming finetuned NER models and LLMs by a significant margin. These findings suggest that leveraging knowledge graphs as auxiliary information can significantly improve NER accuracy, making KoGNER a promising direction for future research in knowledge-aware NLP.