InF-ATPG: Intelligent FFR-Driven ATPG with Advanced Circuit Representation Guided Reinforcement Learning

📅 2025-11-25
🤖 AI Summary
Traditional ATPG methods suffer from excessive runtime and limited fault coverage, while existing machine-learning approaches have critical limitations of their own: reinforcement learning (RL) suffers from severe reward delay, and graph neural networks (GNNs) lack adequate circuit representation capability. To address these challenges, this paper proposes an intelligent test vector generation framework that integrates fanout-free region (FFR) partitioning, a dedicated graph neural network (QGNN), and RL. An FFR-driven circuit decomposition strategy mitigates reward sparsity in RL training, and a QGNN architecture tailored for ATPG strengthens gate-level structural modeling. Experimental results show that the method reduces backtracking counts by 55.06% on average compared to conventional ATPG and by 38.31% relative to a state-of-the-art ML-based ATPG approach, while achieving significantly higher fault coverage.

📝 Abstract
Automatic test pattern generation (ATPG) is a crucial process in integrated circuit (IC) design and testing, responsible for efficiently generating test patterns. As semiconductor technology progresses, traditional ATPG struggles with long execution times to achieve the expected fault coverage, which impacts the time-to-market of chips. Recent machine learning techniques, like reinforcement learning (RL) and graph neural networks (GNNs), show promise but face issues such as reward delay in RL models and inadequate circuit representation in GNN-based methods. In this paper, we propose InF-ATPG, an intelligent FFR-driven ATPG framework that overcomes these challenges by using advanced circuit representation to guide RL. By partitioning circuits into fanout-free regions (FFRs) and incorporating ATPG-specific features into a novel QGNN architecture, InF-ATPG enhances test pattern generation efficiency. Experimental results show InF-ATPG reduces backtracks by 55.06% on average compared to traditional methods and 38.31% compared to the machine learning approach, while also improving fault coverage.
Problem

Research questions and friction points this paper is trying to address.

Slow test pattern generation limits IC design and test throughput
Traditional ATPG incurs long execution times and excessive backtracking
Existing ML-based ATPG achieves limited fault coverage due to weak circuit representation
Innovation

Methods, ideas, or system contributions that make the work stand out.

FFR partitioning for circuit simplification
QGNN with ATPG-specific feature integration
Advanced circuit representation guiding reinforcement learning
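The FFR decomposition the bullets refer to is not spelled out on this page, but the textbook definition (a fanout-free region is rooted at a fanout stem or primary-output driver and contains every gate that reaches the root along single-fanout paths) can be sketched roughly as follows. The netlist format and all names are hypothetical, not taken from the paper:

```python
from collections import defaultdict

def partition_ffrs(fanouts, primary_outputs):
    """Group gates of a combinational netlist into fanout-free regions.

    fanouts: dict mapping each gate name to the list of gates it drives.
    primary_outputs: set of gates whose outputs leave the circuit.
    A gate is an FFR root if it drives a primary output or more than one
    successor; a gate with exactly one successor belongs to the FFR of
    that successor's root.
    """
    gates = set(fanouts) | {s for succ in fanouts.values() for s in succ}
    roots = {}

    def root_of(g):
        if g not in roots:
            succ = fanouts.get(g, [])
            if g in primary_outputs or len(succ) != 1:
                roots[g] = g                  # fanout stem or PO driver: FFR root
            else:
                roots[g] = root_of(succ[0])   # follow the fanout-free path
        return roots[g]

    regions = defaultdict(list)
    for g in sorted(gates):
        regions[root_of(g)].append(g)
    return dict(regions)

# Hypothetical 5-gate netlist: g1 and g2 feed g3, whose output fans out
# to g4 and g5, each of which drives a primary output.
netlist = {"g1": ["g3"], "g2": ["g3"], "g3": ["g4", "g5"], "g4": [], "g5": []}
regions = partition_ffrs(netlist, primary_outputs={"g4", "g5"})
print(regions)  # g1, g2, g3 collapse into one FFR rooted at the stem g3
```

Because every gate lands in exactly one region, decisions and rewards during RL training can be attributed per region rather than per whole circuit, which is the mechanism the summary credits for mitigating reward sparsity.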
Bin Sun
State Key Lab of Processors, Institute of Computing Technology, Chinese Academy of Sciences; University of Chinese Academy of Sciences
Rengang Zhang
State Key Lab of Processors, Institute of Computing Technology, Chinese Academy of Sciences; University of Chinese Academy of Sciences
Zhiteng Chao
State Key Laboratory of Processors (SKLP), Institute of Computing Technology (ICT)
Zizhen Liu
State Key Lab of Processors, Institute of Computing Technology, Chinese Academy of Sciences; University of Chinese Academy of Sciences
Jianan Mu
Institute of Computing Technology, State Key Laboratory of Processors (SKLP), CAS
Jing Ye
State Key Lab of Processors, Institute of Computing Technology, Chinese Academy of Sciences; University of Chinese Academy of Sciences; CASTEST Co., Ltd.
Huawei Li
Institute of Computing Technology, Chinese Academy of Sciences