🤖 AI Summary
Code generation methods relying on teacher-model distillation and static fine-tuning suffer from poor generalization and delayed feedback. Method: This paper proposes the Adaptive Critique Refinement (ACR) framework—a teacher-free, closed-loop self-evolution paradigm. ACR integrates LLM-as-a-Judge quality assessment with LLM-as-a-Critic selective critique, employing a composite scoring system to identify error patterns and drive iterative supervised fine-tuning. Crucially, it eliminates reliance on external annotations, instead leveraging only model-generated samples and external execution/semantic feedback for continuous refinement. Contribution/Results: On multiple benchmarks, the RefineCoder series—built upon ACR—significantly outperforms same-scale baselines using substantially less training data. These results empirically validate the effectiveness and scalability of self-iterative optimization for code generation.
📝 Abstract
Code generation has attracted increasing attention with the rise of Large Language Models (LLMs). Many studies have developed powerful code LLMs by synthesizing code-related instruction data and applying supervised fine-tuning. However, these methods are limited by teacher-model distillation and overlook the potential of iterative refinement driven by self-generated code. In this paper, we propose Adaptive Critique Refinement (ACR), which enables the model to refine itself using its self-generated code and external critique, rather than directly imitating the code responses of a teacher model. Concretely, ACR includes a composite scoring system with LLM-as-a-Judge to evaluate the quality of code responses, and a selective critique strategy with LLM-as-a-Critic to critique low-quality self-generated code responses. We develop the RefineCoder series by iteratively applying ACR, achieving continuous performance improvement on multiple code generation benchmarks. Compared to baselines of the same size, the RefineCoder series achieves comparable or even superior performance while using less training data.
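The judge-then-critique loop described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: `judge_score`, `critique`, the score threshold, and the training-record format are all hypothetical stand-ins for the actual LLM-as-a-Judge and LLM-as-a-Critic calls and the paper's composite scoring system.

```python
# Illustrative sketch of an ACR-style data-refinement round.
# judge_score and critique are stubs standing in for LLM-as-a-Judge /
# LLM-as-a-Critic queries; the threshold and record fields are assumptions.

def judge_score(code: str) -> float:
    """Stub composite quality score in [0, 1].
    A real system would prompt an LLM judge (e.g. correctness + style)."""
    return 1.0 if "return" in code else 0.3

def critique(code: str) -> str:
    """Stub critic. A real system would ask an LLM to explain the flaw
    in a low-quality response so the model can revise it."""
    return "The function body computes a value but never returns it."

def refine_round(samples, threshold=0.5):
    """One self-evolution round: keep high-scoring self-generated code as
    SFT data directly; attach a critique to low-scoring code so the next
    fine-tuning iteration learns to refine it."""
    sft_data = []
    for prompt, code in samples:
        record = {"prompt": prompt, "response": code}
        if judge_score(code) < threshold:
            record["critique"] = critique(code)  # selective critique
        sft_data.append(record)
    return sft_data

samples = [
    ("add two numbers", "def add(a, b):\n    return a + b"),
    ("add two numbers", "def add(a, b):\n    a + b"),  # missing return
]
data = refine_round(samples)
```

Iterating this round (generate, judge, critique, fine-tune) is what makes the loop teacher-free: all training responses come from the model itself, with quality control supplied by the judge and critic rather than by a stronger teacher model.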