🤖 AI Summary
To address the challenges of dynamic knowledge graph evolution and cross-domain reasoning, this paper introduces GraphOracle, a relation-centric foundation model for knowledge graphs. Methodologically, it (1) constructs a Relation-Dependency Graph (RDG) to uniformly model cross-domain relational dependencies; (2) designs a query-aware joint relation-entity attention mechanism; and (3) adopts multi-graph pre-training with lightweight fine-tuning, enabling minute-level adaptation. GraphOracle exhibits strong inductive transfer, generalizing to unseen graph structures, relations, and entities. Evaluated on 31 benchmarks spanning transductive, inductive, and cross-domain settings, it achieves state-of-the-art performance throughout, improving prediction accuracy by up to 35% over the strongest prior methods.
📝 Abstract
Foundation models have demonstrated remarkable capabilities across various domains, but developing analogous models for knowledge graphs presents unique challenges due to their dynamic nature and the need for cross-domain reasoning. To address these issues, we introduce **GraphOracle**, a relation-centric foundation model that unifies reasoning across knowledge graphs by converting them into Relation-Dependency Graphs (RDGs), which explicitly encode compositional patterns with fewer edges than prior methods. A query-dependent attention mechanism is further developed to learn inductive representations for both relations and entities. Pre-training on diverse knowledge graphs, followed by minute-level fine-tuning, enables effective generalization to unseen entities, relations, and entire graphs. Through comprehensive experiments on 31 diverse benchmarks spanning transductive, inductive, and cross-domain settings, we demonstrate consistent state-of-the-art performance with minimal adaptation, improving prediction performance by up to 35% over the strongest baselines.
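To make the abstract's central construction concrete, here is a minimal sketch of how a knowledge graph might be lifted into a relation-level dependency graph. The paper's exact RDG definition is not reproduced here; the composition rule used below (an edge r1 → r2 whenever a triple with relation r1 ends at an entity where a triple with relation r2 begins), the function name, and the toy triples are all illustrative assumptions.

```python
from collections import defaultdict

def build_relation_dependency_graph(triples):
    """Sketch: derive relation-level dependency edges from KG triples.

    Assumed rule (may differ from the paper's RDG): add an edge
    (r1, r2) when some triple with relation r1 ends at an entity
    where a triple with relation r2 begins, i.e. the two relations
    can compose along a path. The result has one node per relation,
    typically far fewer than the entity-level graph.
    """
    out_rels = defaultdict(set)  # entity -> relations leaving it
    in_rels = defaultdict(set)   # entity -> relations arriving at it
    for head, rel, tail in triples:
        out_rels[head].add(rel)
        in_rels[tail].add(rel)

    edges = set()
    for entity, incoming in in_rels.items():
        for r1 in incoming:
            for r2 in out_rels.get(entity, ()):
                edges.add((r1, r2))  # r1 can be followed by r2
    return edges

# Toy example (hypothetical data): born_in composes with both
# capital_of and located_in through intermediate entities.
triples = [
    ("alice", "born_in", "paris"),
    ("paris", "capital_of", "france"),
    ("bob", "born_in", "lyon"),
    ("lyon", "located_in", "france"),
]
print(build_relation_dependency_graph(triples))
# → {("born_in", "capital_of"), ("born_in", "located_in")}
```

Note the edge count: four entity-level triples collapse to two relation-level edges, which is the kind of compression the abstract alludes to when it says the RDG encodes compositional patterns with fewer edges.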