🤖 AI Summary
Quantifier instantiation in SMT solving remains challenging because quantified formulas are undecidable in general, making systematic instance generation difficult. Existing techniques, including e-matching, grammar-guided instantiation, and model-based approaches, are complementary, but they lack a mechanism for dynamic coordination. This paper introduces the first framework that formalizes quantifier instantiation as an implicit language generation problem. The term space is modeled structurally with probabilistic context-free grammars (PCFGs), and a novel combination of grammar induction and backward sampling balances exploitation of known patterns against exploration of diverse instantiations. The method supports online learning and dynamic scheduling of multiple instantiation strategies. Experimental evaluation on standard benchmarks demonstrates substantial improvements in both solving efficiency and problem coverage, outperforming state-of-the-art solvers including Z3 and CVC5.
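To make the grammar-induction idea concrete, here is a minimal sketch of how rule probabilities of a PCFG could be estimated by relative frequency from observed instantiation terms. The representation (derivations as lists of `(nonterminal, rule)` steps) and all names are illustrative assumptions, not the paper's actual data structures.

```python
from collections import defaultdict

def induce_pcfg(derivations):
    """Estimate PCFG rule probabilities by relative frequency of use.

    `derivations` is a list of derivations, each a list of
    (nonterminal, rule) steps observed while parsing successful
    instantiation terms. (Hypothetical format, for illustration.)
    """
    counts = defaultdict(lambda: defaultdict(int))
    for steps in derivations:
        for nonterminal, rule in steps:
            counts[nonterminal][rule] += 1
    pcfg = {}
    for nonterminal, rules in counts.items():
        total = sum(rules.values())
        # Normalize counts so probabilities per nonterminal sum to 1.
        pcfg[nonterminal] = {rule: n / total for rule, n in rules.items()}
    return pcfg

# Two observed instantiations of a toy grammar over one nonterminal T:
# f(x) and f(f(x)), recorded as their derivation steps.
obs = [
    [("T", "f(T)"), ("T", "x")],
    [("T", "f(T)"), ("T", "f(T)"), ("T", "x")],
]
print(induce_pcfg(obs))  # {'T': {'f(T)': 0.6, 'x': 0.4}}
```

A maximum-likelihood estimate like this is the simplest choice; in an online setting the counts can be updated incrementally as the solver observes new instantiations.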
📝 Abstract
Quantified formulas pose a significant challenge for Satisfiability Modulo Theories (SMT) solvers due to their inherent undecidability. Existing instantiation techniques, such as e-matching, syntax-guided, model-based, conflict-based, and enumerative methods, often complement each other. This paper introduces a novel instantiation approach that dynamically learns from these techniques during solving. By treating observed instantiations as samples from a latent language, we use probabilistic context-free grammars to generate new, similar terms. Our method not only mimics successful past instantiations but also explores diversity by optionally inverting learned term probabilities, aiming to balance exploitation and exploration in quantifier reasoning.
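The sampling-with-inversion idea described above can be sketched as follows. This toy generator draws terms from a learned PCFG and, in exploration mode, inverts the rule probabilities so that rarely observed productions become likely. The grammar encoding (rules as strings over a single nonterminal `T`) and all function names are assumptions made for illustration only.

```python
import random

def invert(probs):
    """Map high-probability rules to low probability and renormalize."""
    inv = {rule: 1.0 / p for rule, p in probs.items()}
    z = sum(inv.values())
    return {rule: w / z for rule, w in inv.items()}

def sample_term(pcfg, symbol="T", depth=0, max_depth=5, explore=False):
    """Sample one term top-down from the PCFG.

    With explore=True the learned probabilities are inverted,
    biasing generation toward diverse, rarely seen terms.
    """
    probs = pcfg[symbol]
    if depth >= max_depth:
        # Near the depth cap, prefer rules without the nonterminal.
        probs = {r: p for r, p in probs.items() if "T" not in r} or probs
    if explore:
        probs = invert(probs)
    rule = random.choices(list(probs), weights=list(probs.values()))[0]
    # Expand each occurrence of the nonterminal in the chosen rule
    # (character-level expansion is enough for this toy grammar).
    return "".join(
        sample_term(pcfg, tok, depth + 1, max_depth, explore)
        if tok == "T" else tok
        for tok in rule
    )

# Probabilities as they might be induced from observed instantiations.
pcfg = {"T": {"f(T)": 0.6, "x": 0.4}}
random.seed(0)
print(sample_term(pcfg))                # exploit: mimic frequent patterns
print(sample_term(pcfg, explore=True))  # explore: favor rare productions
```

Exploitation reproduces the shapes of past successful instantiations, while the inverted distribution drives the generator toward structurally different candidates, matching the exploitation/exploration balance the abstract describes.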