Circuit Transformer: A Transformer That Preserves Logical Equivalence

📅 2024-03-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing neural approaches for generating equivalent logic circuits from Boolean functions struggle to guarantee strict logical equivalence. Method: We propose the first generative neural model with formal equivalence guarantees. Our approach introduces an equivalence-aware symbolic decoding mechanism endowed with a "truncation property," formulates circuit generation as a logic-constrained Markov decision process, and integrates a Transformer architecture, symbolic constraint-guided decoding, equivalence-verification–driven token generation, and reinforcement learning optimization—jointly optimizing for both functional equivalence and structural compactness. Results: Evaluated on synthetic and real-world benchmarks, our 88M-parameter model achieves 100% equivalence (zero violations) while producing significantly more compact circuits, consistently outperforming all prior neural methods in both correctness and efficiency.
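The summary's joint objective — equivalence enforced during decoding, compactness optimized via reinforcement learning — can be pictured as a reward function over terminal MDP states. A minimal sketch, where `circuit_reward` and its arguments are illustrative placeholders rather than the paper's actual formulation:

```python
def circuit_reward(num_gates: int, is_equivalent: bool) -> float:
    """Hypothetical terminal reward for the logic-constrained MDP.

    Equivalence is enforced by the constrained decoder, so in practice
    the reward mainly needs to favor compactness (fewer gates). A
    non-equivalent terminal state, were it reachable, would be ruled
    out with an infinitely bad reward.
    """
    if not is_equivalent:
        return float("-inf")
    # Fewer gates -> higher (less negative) reward.
    return -float(num_gates)
```

Under this shaping, an RL optimizer comparing two equivalent circuits simply prefers the one with fewer gates.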

📝 Abstract
Implementing Boolean functions with circuits consisting of logic gates is fundamental in digital computer design. However, the implemented circuit must be exactly equivalent, which hinders generative neural approaches on this task due to their occasionally wrong predictions. In this study, we introduce a generative neural model, the "Circuit Transformer", which eliminates such wrong predictions and produces logic circuits strictly equivalent to given Boolean functions. The main idea is a carefully designed decoding mechanism that builds a circuit step-by-step by generating tokens, which has beneficial "cutoff properties" that block a candidate token once it invalidates equivalence. In such a way, the proposed model works similarly to typical LLMs while logical equivalence is strictly preserved. A Markov decision process formulation is also proposed for optimizing certain objectives of circuits. Experimentally, we trained an 88-million-parameter Circuit Transformer to generate equivalent yet more compact forms of input circuits, outperforming existing neural approaches on both synthetic and real-world benchmarks, without any violation of equivalence constraints.
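The "cutoff" idea in the abstract amounts to masking, at each decoding step, every candidate token that would make equivalence unreachable, then sampling only from what remains. A minimal sketch of that masking step, assuming a hypothetical `is_valid` oracle standing in for the paper's equivalence check:

```python
import math

def masked_decode_step(logits, candidates, is_valid):
    """One step of equivalence-constrained decoding (illustrative).

    logits     : raw model scores, one per candidate token
    candidates : the token alternatives at this step
    is_valid   : stand-in oracle returning False for any token that
                 would invalidate equivalence (the "cutoff")
    Returns the greedy choice and the renormalized distribution over
    the surviving tokens.
    """
    # Keep only equivalence-preserving tokens, exponentiating scores.
    scores = {tok: math.exp(l)
              for tok, l in zip(candidates, logits) if is_valid(tok)}
    if not scores:
        raise ValueError("no equivalence-preserving token available")
    # Renormalize the softmax over the surviving candidates.
    total = sum(scores.values())
    probs = {tok: s / total for tok, s in scores.items()}
    return max(probs, key=probs.get), probs
```

The point of the construction is that wrong predictions are blocked before they are emitted, so any fully decoded circuit is equivalent by construction; the names and signature here are assumptions for illustration only.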
Problem

Research questions and friction points this paper is trying to address.

Boolean functions
Equivalent circuit generation
Neural network limitations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Circuit Transformer
Logical Equivalence
Compact Circuit Generation
Xihan Li
University College London, London, UK
Xing Li
Huawei Noah’s Ark Lab, Hong Kong, China
Lei Chen
Huawei Noah’s Ark Lab, Hong Kong, China
Xing Zhang
Huawei Noah’s Ark Lab, Hong Kong, China
Mingxuan Yuan
Huawei Noah’s Ark Lab, Hong Kong, China
Jun Wang
University College London, London, UK