🤖 AI Summary
In edge computing, low-bit zero-shot quantization (ZSQ) models suffer from weak knowledge transfer and unstable training—e.g., gradient explosion—due to the absence of real calibration data. To address this, we propose a novel data-free refined feature distillation paradigm. For the first time in ZSQ, we jointly incorporate spatial and channel dual-attention mechanisms to precisely extract and align highly discriminative features from the teacher model, thereby fundamentally mitigating gradient anomalies in low-bit networks. Our method significantly improves the generalization performance of 3- and 5-bit quantized models on CIFAR-10/100, achieving state-of-the-art accuracy. Compared with mainstream generative ZSQ approaches, it delivers substantial accuracy gains while ensuring greater training stability and lower deployment overhead.
📝 Abstract
We introduce AKT (Advanced Knowledge Transfer), a novel method to enhance the training ability of low-bit quantized (Q) models in the field of zero-shot quantization (ZSQ). Existing research in ZSQ has focused on generating high-quality data from full-precision (FP) models. However, these approaches struggle with reduced learning ability in low-bit quantization due to its limited information capacity. To overcome this limitation, we propose an effective training strategy that focuses on knowledge transfer rather than data generation. In particular, our analysis shows that refining feature maps during feature distillation is an effective way to transfer knowledge to the Q model. Based on this analysis, AKT efficiently transfers core information from the FP model to the Q model. AKT is the first approach in ZSQ to utilize both spatial and channel attention information in feature distillation. Our method addresses the fundamental gradient explosion problem in low-bit Q models. Experiments on the CIFAR-10 and CIFAR-100 datasets demonstrate the effectiveness of AKT. Applied to existing generative ZSQ models, our method yields significant performance gains. Notably, AKT achieves substantial accuracy improvements in low-bit Q models, reaching state-of-the-art results in the 3- and 5-bit scenarios on CIFAR-10. The code is available at https://github.com/Inpyo-Hong/AKT-Advanced-knowledge-Transfer.
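The abstract does not spell out the distillation loss; as a rough illustration only, a joint spatial- and channel-attention feature-distillation objective (in the spirit of attention transfer) could be sketched as follows. The function names, the squared-activation attention maps, and the `alpha`/`beta` weights are all illustrative assumptions, not AKT's actual formulation.

```python
import numpy as np

def spatial_attention(fmap):
    # fmap: (C, H, W) -> (H, W) map: sum of squared activations over
    # channels, L2-normalized (one common way to build a spatial attention map)
    att = (fmap ** 2).sum(axis=0)
    return att / (np.linalg.norm(att) + 1e-8)

def channel_attention(fmap):
    # fmap: (C, H, W) -> (C,) vector: spatial mean of squared activations,
    # L2-normalized (an illustrative channel attention descriptor)
    att = (fmap ** 2).mean(axis=(1, 2))
    return att / (np.linalg.norm(att) + 1e-8)

def attention_distill_loss(f_teacher, f_student, alpha=1.0, beta=1.0):
    # Penalize the squared distance between teacher and student attention,
    # combining the spatial and channel views with illustrative weights.
    l_spatial = np.sum((spatial_attention(f_teacher) - spatial_attention(f_student)) ** 2)
    l_channel = np.sum((channel_attention(f_teacher) - channel_attention(f_student)) ** 2)
    return alpha * l_spatial + beta * l_channel
```

Because both attention maps are normalized before comparison, the student is pushed to match where and in which channels the FP teacher concentrates activation energy, rather than the raw feature magnitudes; this kind of normalization is one way such losses keep gradients bounded for low-bit students.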