ConformalNL2LTL: Translating Natural Language Instructions into Temporal Logic Formulas with Conformal Correctness Guarantees

📅 2025-04-22
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of automatically and reliably translating natural language (NL) instructions into Linear Temporal Logic (LTL) formulas, lowering the expertise barrier that manual LTL encoding imposes on robotic task planning. The authors propose the first iterative NL2LTL framework integrating conformal prediction (CP) with large language models (LLMs), letting users specify a target translation success rate (e.g., 90%) and dynamically balance full automation against human-in-the-loop verification. By combining open-vocabulary question answering with distribution-free uncertainty quantification, the method provides theoretically grounded, confidence-controllable correctness guarantees. Empirical evaluation shows significantly lower human-assistance request rates than baselines, while maintaining strong generalization to unseen domains and instruction patterns. The core contribution is the novel application of conformal prediction to NL2LTL, enabling verifiable, confidence-calibrated semantic translation with adjustable reliability.

πŸ“ Abstract
Linear Temporal Logic (LTL) has become a prevalent specification language for robotic tasks. To mitigate the significant manual effort and expertise required to define LTL-encoded tasks, several methods have been proposed for translating Natural Language (NL) instructions into LTL formulas, which, however, lack correctness guarantees. To address this, we introduce a new NL-to-LTL translation method, called ConformalNL2LTL, that can achieve user-defined translation success rates over unseen NL commands. Our method constructs LTL formulas iteratively by addressing a sequence of open-vocabulary Question-Answering (QA) problems with LLMs. To enable uncertainty-aware translation, we leverage conformal prediction (CP), a distribution-free uncertainty quantification tool for black-box models. CP enables our method to assess the uncertainty in LLM-generated answers, allowing it to proceed with translation when sufficiently confident and request help otherwise. We provide both theoretical and empirical results demonstrating that ConformalNL2LTL achieves user-specified translation accuracy while minimizing help rates.
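The abstract's proceed-or-ask-for-help logic follows the standard split conformal prediction recipe: calibrate a nonconformity threshold on held-out QA examples, then at translation time keep every candidate answer under that threshold and request human help unless exactly one survives. The sketch below is an illustrative reconstruction under that assumption, not the paper's implementation; the candidate formulas and probabilities are invented for the example.

```python
import math

def conformal_threshold(cal_scores, alpha):
    """Split-conformal quantile: given n calibration nonconformity scores,
    return the ceil((n+1)(1-alpha))/n empirical quantile, which covers an
    unseen example's true answer with probability >= 1 - alpha."""
    n = len(cal_scores)
    k = math.ceil((n + 1) * (1 - alpha))
    if k > n:
        return float("inf")  # too few calibration points for this alpha
    return sorted(cal_scores)[k - 1]

def prediction_set(candidate_probs, threshold):
    """Keep every candidate answer whose nonconformity score
    (1 - model probability) falls at or below the calibrated threshold."""
    return {a for a, p in candidate_probs.items() if 1.0 - p <= threshold}

# Toy calibration scores: 1 - probability the LLM assigned to the true answer.
cal_scores = [0.05, 0.10, 0.20, 0.15, 0.30, 0.08, 0.12, 0.25, 0.18, 0.22]
tau = conformal_threshold(cal_scores, alpha=0.10)  # target 90% coverage

# One QA step at translation time: hypothetical candidate LTL fragments
# with the LLM's probabilities for each.
candidates = {"eventually(reach_goal)": 0.85, "always(reach_goal)": 0.10}
pset = prediction_set(candidates, tau)

if len(pset) == 1:
    answer = next(iter(pset))  # sufficiently confident: proceed automatically
else:
    answer = None              # ambiguous or empty set: request human help
```

With these toy numbers the prediction set is a singleton, so translation proceeds without help; a flatter probability distribution over candidates would yield a larger set and trigger the help request instead. Lowering `alpha` (demanding higher coverage) raises the threshold, enlarging prediction sets and hence the help rate, which is the accuracy/help trade-off the paper quantifies.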
Problem

Research questions and friction points this paper is trying to address.

Translating natural language to LTL with correctness guarantees
Reducing manual effort in defining robotic tasks
Using conformal prediction for uncertainty-aware translation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses LLMs for open-vocabulary QA in LTL translation
Leverages conformal prediction for uncertainty quantification
Achieves user-specified translation accuracy with minimal help
Jun Wang
Department of Electrical and Systems Engineering, Washington University in St. Louis, MO, 63108, USA
David Smith Sundarsingh
PhD Student, Washington University in St. Louis
Formal Methods, Control Theory
Jyotirmoy V. Deshmukh
Associate Professor, University of Southern California
Cyber-physical Systems, Formal/Statistical Verification, Temporal Logic, AI Safety, Reinforcement Learning
Y. Kantaros
Department of Electrical and Systems Engineering, Washington University in St. Louis, MO, 63108, USA