Escalating LLM-based Code Translation Benchmarking into the Class-level Era

📅 2024-11-09
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing LLM-based code translation evaluation focuses on method- or statement-level short snippets, neglecting realistic, class-level, dependency-intensive migration scenarios in industrial development. Method: We introduce ClassEval-T, the first industrially grounded, class-level code translation benchmark for Java/C++ migration, featuring complete test suites and a multi-strategy evaluation framework. We formally define three translation strategies—holistic, minimal-dependency, and standalone—and conduct 360 person-hours of manual migration with systematic failure analysis. Contribution/Results: Our study reveals fundamental LLM limitations in dependency awareness and contextual modeling: class-level translation accuracy drops by over 30% on average compared to method-level tasks, with substantial inter-model performance variance. We construct a fine-grained, 1,243-case failure taxonomy, providing an empirical foundation for advancing code translation research and tooling.

📝 Abstract
In recent years, Large Language Models (LLMs) have dramatically advanced automated code translation, pushing computational accuracy above 80% on many previous benchmarks. However, most code samples in these benchmarks are short, standalone, statement- or method-level, and algorithmic, which does not align with practical coding tasks. As a result, the actual capability of LLMs in translating code written for daily development remains unknown. To fill this gap, we construct a class-level code translation benchmark, ClassEval-T, and make the first attempt to extensively assess recent LLMs' performance on class-level code translation. ClassEval-T extends ClassEval, a well-known class-level Python code generation benchmark covering multiple practical coding topics, such as database operation and game design, and diverse contextual dependencies (e.g., fields, methods, and libraries). Manually migrating the samples to Java and C++, with complete code and associated test suites, cost 360 person-hours. We then design three translation strategies (i.e., holistic, min-dependency, and standalone) for class-level code translation and evaluate eight recent LLMs (commercial, general, and code-specialized) across diverse families and sizes on ClassEval-T. Experimental results demonstrate a remarkable performance drop compared with the most widely studied method-level code translation benchmark, and clear discrepancies among LLMs emerge, showing the effectiveness of ClassEval-T in measuring recent LLMs. We further discuss the usage scenarios for the different translation strategies and LLMs' dependency awareness when translating class samples. Finally, we analyze and categorize 1,243 failure cases made by the best-performing LLM under test, offering practical guidance and directions for future work.
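To make "dependency-intensive, class-level" concrete, here is a minimal Python class in the style of the ClassEval samples the benchmark extends; the class name and methods are hypothetical illustrations, not taken from the benchmark itself. Every method relies on shared fields and on other methods of the class, which is exactly the context a translator must carry across the whole class.

```python
# Hypothetical ClassEval-style sample: methods depend on shared
# fields (self.balance, self.history) and on each other, so a
# faithful Java/C++ translation must preserve class-level context.
class BankAccount:
    def __init__(self, owner):
        self.owner = owner
        self.balance = 0.0
        self.history = []  # field shared by every method below

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount
        self._log("deposit", amount)  # intra-class method dependency

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        self._log("withdraw", amount)

    def _log(self, kind, amount):
        self.history.append((kind, amount, self.balance))
```

Translated in isolation (the standalone strategy), `withdraw` sees no definition of `self.balance` or `_log`; this is the dependency-awareness gap that the class-level setting exposes and method-level benchmarks hide.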
Problem

Research questions and friction points this paper is trying to address.

Assessing LLMs' performance on class-level code translation
Evaluating translation strategies for practical coding tasks
Analyzing failure cases to guide future LLM improvements
Innovation

Methods, ideas, or system contributions that make the work stand out.

Developed ClassEval-T for class-level code translation benchmarking.
Implemented three translation strategies for class-level code.
Evaluated eight LLMs on ClassEval-T for performance analysis.
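The abstract's "computational accuracy" is a pass-rate metric: a translated sample counts as correct only if its entire test suite passes. A minimal sketch of that computation (the function name and data shape are my assumptions, not the paper's code):

```python
# Computational accuracy: fraction of translated samples whose
# full test suite passes. `results` maps a sample id to a list
# of per-test pass/fail booleans (hypothetical data shape).
def computational_accuracy(results):
    if not results:
        return 0.0
    passed = sum(1 for tests in results.values() if all(tests))
    return passed / len(results)

results = {
    "sample_1": [True, True, True],   # all tests pass -> correct
    "sample_2": [True, False, True],  # one failure -> incorrect
}
print(computational_accuracy(results))  # 0.5
```

Counting a sample only when `all(tests)` holds is what makes class-level benchmarks so much harder to score well on: a single mistranslated field or helper method fails the whole class.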
👥 Authors
Pengyu Xue — Shandong University, China
Linhao Wu — Shandong University, China
Chengyi Wang — Bytedance Inc
Xiang Li — Shandong University, China
Zhen Yang — Shandong University, China
Ruikai Jin — Shandong University, China
Yuxiang Zhang — Shandong University, China
Jia Li — School of Computer Science, Peking University, China
Yifei Pei — Shandong University, China
Zhaoyan Shen — Shandong University, China
Xiran Lyu — Shandong University, China