ARFT-Transformer: Modeling Metric Dependencies for Cross-Project Aging-Related Bug Prediction

📅 2026-01-01
🏛️ Journal of Systems and Software
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses cross-project aging-related defect prediction, where distribution discrepancies between source and target domains and severe class imbalance hinder performance, and where existing approaches often overlook inter-metric dependencies. To tackle these issues, the work proposes the ARFT-Transformer framework, which introduces a metric-level multi-head attention mechanism, the first of its kind for this task, to explicitly model dependencies among software metrics. Focal Loss is also integrated to strengthen learning from hard-to-classify samples. Extensive experiments on three large-scale open-source projects show that ARFT-Transformer outperforms state-of-the-art methods, improving Balance scores by 29.54% in the single-source scenario and 19.92% in the multi-source scenario.
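The summary's two technical ingredients, metric-level attention and Focal Loss, can be sketched in plain NumPy. This is an illustrative reconstruction, not the authors' implementation; the function names, embedding dimensions, and default hyperparameters (gamma=2.0, alpha=0.25, the common Focal Loss defaults) are assumptions.

```python
import numpy as np

def metric_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product attention over metric tokens.

    x: (n_metrics, d) array; each row embeds one software metric,
    so the attention matrix exposes inter-metric dependencies."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[1])
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # each row sums to 1
    return attn @ v, attn

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: down-weights well-classified samples so
    training focuses on hard (often minority-class) bug instances."""
    p_t = np.where(y == 1, p, 1.0 - p)            # prob. of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)))
```

Treating each metric as a token (rather than each file or module) is what makes the attention weights directly interpretable as pairwise metric dependencies; the `(1 - p_t)**gamma` factor shrinks the loss of confident predictions, which is why Focal Loss helps under severe class imbalance.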

Problem

Research questions and friction points this paper is trying to address.

cross-project
Aging-Related Bugs
class imbalance
domain adaptation
metric dependencies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformer
metric dependencies
cross-project prediction
Focal Loss
software aging
Shuning Ge
Ph.D. candidate, Massachusetts Institute of Technology
Political Science and Statistics
Fangyun Qin
College of Information Engineering, Capital Normal University, Beijing, China
Xiaohui Wan
Suzhou Aerospace Information Research Institute, Suzhou, China
Yang Liu
School of Mechanical, Electronic and Control Engineering, Beijing Jiaotong University, Beijing, China
Qian Dai
Beijing Institute of Computer Technology and Applications, Beijing, China
Zheng Zheng
Professor, Beihang University
Software reliability, artificial intelligence