One Model, Any Conjunctive Query: Graph Neural Networks for Answering Complex Queries over Knowledge Graphs

📅 2024-09-21
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing knowledge graph query answering methods struggle to answer complex conjunctive queries over incomplete graphs. Method: This paper proposes AnyCQ, the first unified single-model framework capable of handling conjunctive queries of arbitrary structure, jointly supporting answer classification and answer retrieval. It integrates graph neural networks, query embedding, and link prediction, and introduces a reinforcement learning–based training objective for Boolean queries along with an answer existence detection mechanism. Contributions/Results: (1) the model achieves strong zero-shot generalization to unseen query structures, the first such result for a single model; (2) it enables zero-shot transfer across knowledge graphs when equipped with a relevant link predictor; and (3) it substantially outperforms state-of-the-art methods on newly constructed benchmarks, demonstrating both high accuracy and robustness.

📝 Abstract
Traditional query answering over knowledge graphs -- or broadly over relational data -- is one of the most fundamental problems in data management. Motivated by the incompleteness of modern knowledge graphs, a new setup for query answering has emerged, where the goal is to predict answers that do not necessarily appear in the knowledge graph, but are present in its completion. In this work, we propose AnyCQ, a graph neural network model that can classify answers to any conjunctive query on any knowledge graph, following training. At the core of our framework lies a graph neural network model trained using a reinforcement learning objective to answer Boolean queries. Our approach and problem setup differ from existing query answering studies in multiple dimensions. First, we focus on the problem of query answer classification: given a query and a set of possible answers, classify these proposals as true or false relative to the complete knowledge graph. Second, we study the problem of query answer retrieval: given a query, retrieve an answer to the query relative to the complete knowledge graph or decide that no correct solutions exist. Trained on simple, small instances, AnyCQ can generalize to large queries of arbitrary structure, reliably classifying and retrieving answers to samples where existing approaches fail, which is empirically validated on new and challenging benchmarks. Furthermore, we demonstrate that our AnyCQ models effectively transfer to out-of-distribution knowledge graphs, when equipped with a relevant link predictor, highlighting their potential to serve as a general engine for query answering.
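The two tasks the abstract defines can be made concrete with a toy example. The sketch below is purely illustrative (hypothetical entities, relation names, and helper functions; it assumes access to the complete graph, which is exactly what AnyCQ's learned GNN avoids needing):

```python
# Toy knowledge graph as a set of (head, relation, tail) triples.
KG = {
    ("alice", "works_at", "oxford"),
    ("bob", "works_at", "oxford"),
    ("oxford", "located_in", "uk"),
}

def satisfies(query, binding, kg):
    """Query answer classification: does a candidate variable binding
    make every atom of a conjunctive query true in the graph?"""
    ground = lambda t: binding.get(t, t)  # terms starting with '?' are variables
    return all((ground(h), r, ground(t)) in kg for h, r, t in query)

# Query: "who (?x) works at an organisation (?y) located in the UK?"
query = [("?x", "works_at", "?y"), ("?y", "located_in", "uk")]

print(satisfies(query, {"?x": "alice", "?y": "oxford"}, KG))  # True
print(satisfies(query, {"?x": "uk", "?y": "oxford"}, KG))     # False
```

In the paper's setting the membership check against the complete graph is unavailable; a trained model must classify candidate bindings against the *observed*, incomplete graph instead.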
Problem

Research questions and friction points this paper is trying to address.

Predict answers missing in incomplete knowledge graphs
Classify and retrieve answers to any conjunctive query
Generalize to large queries with arbitrary structures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph Neural Network for query answering
Reinforcement learning for Boolean queries
Transfer learning to novel knowledge graphs
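The retrieval side of the problem can be sketched as a search over variable bindings. The brute-force enumeration below (hypothetical data and names) is only a reference semantics: it is exponential in the number of variables, which is precisely the search AnyCQ replaces with a GNN policy trained by reinforcement learning, including the "no correct solution exists" outcome:

```python
from itertools import product

KG = {("alice", "works_at", "oxford"), ("oxford", "located_in", "uk")}
ENTITIES = {e for h, _, t in KG for e in (h, t)}

def retrieve(query, kg):
    """Query answer retrieval: return one satisfying binding for the
    query's variables, or None if no answer exists."""
    variables = sorted({t for atom in query for t in atom if t.startswith("?")})
    for values in product(ENTITIES, repeat=len(variables)):
        binding = dict(zip(variables, values))
        ground = lambda t: binding.get(t, t)
        if all((ground(h), r, ground(t)) in kg for h, r, t in query):
            return binding
    return None

print(retrieve([("?x", "works_at", "?y"), ("?y", "located_in", "uk")], KG))
print(retrieve([("?x", "works_at", "uk")], KG))  # None: no answer exists
```

An RL formulation would reward a policy for producing a binding that satisfies all atoms, turning this enumeration into a learned, guided search.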
Krzysztof Olejniczak
Department of Computer Science, University of Oxford
Xingyue Huang
University of Oxford
Graph neural network, Knowledge graphs, Machine learning, Deep learning
Ismail Ilkan Ceylan
Department of Computer Science, University of Oxford
Mikhail Galkin
Research Scientist, Google
Graph Machine Learning, Knowledge Graphs, Deep Learning, Geometric Deep Learning