Predicate-Conditional Conformalized Answer Sets for Knowledge Graph Embeddings

📅 2025-05-22
🤖 AI Summary
Existing knowledge graph embedding (KGE) methods provide only marginal coverage guarantees—i.e., average probability guarantees over query-answer pairs—rendering them inadequate for high-stakes domains like healthcare, where per-query, reliable uncertainty quantification is essential. To address this, we propose CondKGCP, the first predicate-conditioned conformal prediction framework for KGE. CondKGCP achieves approximate conditional coverage by constructing semantic-aware calibration sets via predicate vector clustering and introducing a rank-sensitive calibration strategy that simultaneously ensures theoretical coverage validity and reduces prediction set size. We formally prove its conditional coverage guarantee under mild assumptions. Empirical evaluation on benchmarks including FB15k-237 demonstrates that CondKGCP significantly improves conditional coverage over state-of-the-art baselines while reducing average prediction set size by 12%–28%, all while maintaining ≥95% answer inclusion rate.

📝 Abstract
Uncertainty quantification in Knowledge Graph Embedding (KGE) methods is crucial for ensuring the reliability of downstream applications. Recent work has applied conformal prediction to KGE methods, providing uncertainty estimates by generating a set of answers that is guaranteed to include the true answer with a predefined confidence level. However, existing methods provide probabilistic guarantees averaged over a reference set of queries and answers (marginal coverage guarantee). In high-stakes applications such as medical diagnosis, a stronger guarantee is often required: the predicted sets must provide consistent coverage per query (conditional coverage guarantee). We propose CondKGCP, a novel method that approximates predicate-conditional coverage guarantees while maintaining compact prediction sets. CondKGCP merges predicates with similar vector representations and augments calibration with rank information. We prove the theoretical guarantees and demonstrate the empirical effectiveness of CondKGCP through comprehensive evaluations.
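The marginal coverage guarantee the abstract contrasts against can be sketched with standard split conformal prediction: calibrate a nonconformity threshold on held-out (query, true-answer) pairs, then include every candidate entity scoring below it. This is a generic sketch of the baseline setting, not the paper's implementation; score conventions and shapes are illustrative assumptions.

```python
import numpy as np

def marginal_conformal_sets(cal_scores, test_scores, alpha=0.1):
    """Split conformal prediction with a marginal coverage guarantee.

    cal_scores:  nonconformity score of the TRUE answer for each
                 calibration query, shape (n,). Lower = more conforming.
    test_scores: nonconformity scores of ALL candidate entities for
                 each test query, shape (m, num_entities).
    Returns a boolean mask of prediction-set membership per test query.
    """
    n = len(cal_scores)
    # Finite-sample corrected quantile level: ceil((n+1)(1-alpha)) / n
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(cal_scores, q_level, method="higher")
    # Include every entity whose nonconformity does not exceed q_hat;
    # the true answer then lands in the set with probability >= 1 - alpha
    # on average over queries (marginal, not per-query, coverage).
    return test_scores <= q_hat
```

Because the quantile is computed over all calibration queries pooled together, easy predicates can be over-covered and hard ones under-covered, which is exactly the gap the predicate-conditional approach targets.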
Problem

Research questions and friction points this paper is trying to address.

Ensuring reliable uncertainty quantification in Knowledge Graph Embeddings
Achieving conditional (per-query) coverage guarantees in KGE
Maintaining compact prediction sets while improving coverage consistency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Predicate-conditional conformal prediction for KGE
Merges predicates with similar vector representations
Augments calibration with rank information
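The predicate-merging idea above can be illustrated with a small sketch: predicates whose embedding vectors are similar share one calibration pool, so rare predicates still receive enough calibration data for a stable per-group threshold. The greedy cosine-similarity clustering and the `sim_thresh` parameter here are hypothetical illustrations, not the paper's exact procedure, and the rank-augmentation step is omitted.

```python
import numpy as np

def predicate_conditional_quantiles(pred_vecs, cal_preds, cal_scores,
                                    alpha=0.1, sim_thresh=0.9):
    """Per-group conformal thresholds after merging similar predicates.

    pred_vecs:  predicate embedding vectors, shape (n_pred, d)
    cal_preds:  predicate id of each calibration query, shape (n,)
    cal_scores: nonconformity score of the true answer, shape (n,)
    Returns (group assignment per predicate, threshold per group).
    """
    # Normalise predicate vectors so dot products are cosine similarities
    vecs = pred_vecs / np.linalg.norm(pred_vecs, axis=1, keepdims=True)
    group = -np.ones(len(vecs), dtype=int)
    centroids = []
    for p in range(len(vecs)):  # greedy single-pass merging (illustrative)
        sims = [vecs[p] @ c for c in centroids]
        if sims and max(sims) >= sim_thresh:
            group[p] = int(np.argmax(sims))
        else:
            group[p] = len(centroids)
            centroids.append(vecs[p])
    # One conformal quantile per merged predicate group
    q_hat = {}
    for g in range(len(centroids)):
        scores = cal_scores[np.isin(cal_preds, np.where(group == g)[0])]
        n = len(scores)
        level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
        q_hat[g] = np.quantile(scores, level, method="higher")
    return group, q_hat
```

At test time a query's prediction set would use the threshold of its predicate's group, giving approximately predicate-conditional rather than marginal coverage.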