DynamicRTL: RTL Representation Learning for Dynamic Circuit Behavior

📅 2025-11-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing GNN approaches model only the static structural topology of RTL circuits, failing to capture multi-cycle dynamic execution behaviors—thereby limiting circuit verification and optimization performance. To address this, we propose DR-GNN: the first graph neural network that jointly encodes static control-data flow graphs (CDFGs) and dynamic simulation traces to explicitly model operation-level temporal dependencies. We further construct the first large-scale dynamic RTL circuit dataset and enable cross-task transfer learning. Extensive experiments demonstrate that DR-GNN significantly outperforms state-of-the-art methods in branch hit and toggle rate prediction. Moreover, it exhibits strong generalization capability in power estimation and assertion prediction. By unifying static structure and dynamic behavior in representation learning, DR-GNN establishes a novel paradigm for behavior-aware circuit representation learning.

📝 Abstract
There is a growing body of work on using Graph Neural Networks (GNNs) to learn representations of circuits, focusing primarily on their static characteristics. However, these models fail to capture circuit runtime behavior, which is crucial for tasks like circuit verification and optimization. To address this limitation, we introduce DR-GNN (DynamicRTL-GNN), a novel approach that learns RTL circuit representations by incorporating both static structures and multi-cycle execution behaviors. DR-GNN leverages an operator-level Control Data Flow Graph (CDFG) to represent Register Transfer Level (RTL) circuits, enabling the model to capture dynamic dependencies and runtime execution. To train and evaluate DR-GNN, we build the first comprehensive dynamic circuit dataset, comprising over 6,300 Verilog designs and 63,000 simulation traces. Our results demonstrate that DR-GNN outperforms existing models in branch hit prediction and toggle rate prediction. Furthermore, its learned representations transfer effectively to related dynamic circuit tasks, achieving strong performance in power estimation and assertion prediction.
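The abstract's core idea—fusing static CDFG structure with per-cycle simulation-trace features before propagating information along the graph—can be sketched in a few lines. This is a toy illustration only: the example graph, feature encodings, and function names are hypothetical, not the paper's actual model.

```python
# Toy sketch: fuse static CDFG structure with dynamic trace features.
# The tiny graph, features, and names below are illustrative, not from the paper.

def mean(vs):
    return sum(vs) / len(vs)

# Operator-level CDFG: node -> predecessor operators (data/control edges).
cdfg = {
    "add1": [],
    "mux1": ["add1"],
    "reg1": ["mux1"],
}

# Static features: a scalar code per operator type (stand-in for one-hot).
static_feat = {"add1": 1.0, "mux1": 2.0, "reg1": 3.0}

# Dynamic features: per-cycle toggle activity from simulation traces.
traces = {
    "add1": [1, 0, 1, 1],   # toggled in 3 of 4 cycles
    "mux1": [0, 0, 1, 0],
    "reg1": [1, 1, 0, 0],
}
dyn_feat = {n: mean(t) for n, t in traces.items()}

def message_pass(cdfg, feat, rounds=1):
    """Mean-aggregation message passing over the CDFG (scalar features)."""
    h = dict(feat)
    for _ in range(rounds):
        # New state mixes a node's own feature with the mean of its predecessors'.
        h = {
            n: 0.5 * h[n] + 0.5 * mean([h[p] for p in preds]) if preds else h[n]
            for n, preds in cdfg.items()
        }
    return h

# Fuse static and dynamic features, then propagate along CDFG edges.
fused = {n: static_feat[n] + dyn_feat[n] for n in cdfg}
embeddings = message_pass(cdfg, fused)
```

A real model would use learned vector embeddings and multiple GNN layers; the point here is only the ordering: static and dynamic features are combined at the operator level before graph propagation.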
Problem

Research questions and friction points this paper is trying to address.

Capturing dynamic circuit behavior beyond static characteristics
Learning RTL representations with multi-cycle execution patterns
Addressing circuit verification and optimization through dynamic dependencies
Innovation

Methods, ideas, or system contributions that make the work stand out.

DR-GNN integrates static structures with multi-cycle execution behaviors
Uses operator-level CDFG to capture dynamic dependencies and runtime execution
Leverages comprehensive dataset with Verilog designs and simulation traces
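For intuition on the two prediction targets named in the abstract, here is a minimal sketch (hypothetical signal values and helper names, not the paper's dataset pipeline) of how toggle-rate and branch-hit labels could be derived from a simulation trace:

```python
# Illustrative label construction for toggle rate and branch hit prediction;
# the trace values and helper names are made up, not from the paper.

def toggle_rate(values):
    """Fraction of adjacent cycle pairs where a signal changes value."""
    if len(values) < 2:
        return 0.0
    toggles = sum(1 for a, b in zip(values, values[1:]) if a != b)
    return toggles / (len(values) - 1)

def branch_hit(taken_per_cycle):
    """Whether a branch is exercised at least once in the trace."""
    return any(taken_per_cycle)

trace = [0, 1, 1, 0, 1]          # sampled register values per cycle
print(toggle_rate(trace))         # 0.75 (3 toggles over 4 transitions)
print(branch_hit([0, 0, 1, 0]))  # True
```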
Ruiyang Ma
School of Computer Science, Peking University
EDA, Hardware Verification
Yunhao Zhou
Shanghai Jiao Tong University
EDA, GNN, LLM
Yipeng Wang
College of Computer Science, Beijing University of Technology
Computer Networks, Artificial Intelligence
Yi Liu
The Chinese University of Hong Kong
Zheng-Hao Shi
The Chinese University of Hong Kong
Ziyang Zheng
Shanghai Jiao Tong University
Signal Processing, Inverse Problem, Photonic Computing
Kexin Chen
CUHK
LLM/VLMs, AI Agent, Multi-modality Learning, AI for Science
Zhiqiang He
Nanjing University of Aeronautics and Astronautics
Lingwei Yan
Nanjing University of Aeronautics and Astronautics
Gang Chen
Nanjing University of Aeronautics and Astronautics
Qiang Xu
The Chinese University of Hong Kong
Guojie Luo
Peking University
Electronic Design Automation, Reconfigurable Architecture