PAND: Prompt-Aware Neighborhood Distillation for Lightweight Fine-Grained Visual Classification

📅 2026-02-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the performance loss that arises in fine-grained visual classification when large vision-language models are distilled into lightweight networks, a problem exacerbated by fixed prompts and global alignment strategies. To overcome this, the authors propose PAND, a two-stage knowledge distillation framework. First, prompt-aware semantic calibration generates adaptive semantic anchors; then, neighborhood-aware structural distillation enforces local decision-structure consistency in the student model. The approach decouples semantic calibration from structural transfer by combining a prompt-adaptive mechanism with neighborhood structural constraints. Evaluated on four fine-grained visual categorization (FGVC) benchmarks, PAND significantly outperforms existing methods, achieving 76.09% accuracy on CUB-200 with a ResNet-18 student model, a 3.4% improvement over the VL2Lite baseline.
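The summary does not spell out how the first stage produces "adaptive semantic anchors." As an illustration only, one plausible realization is to keep several prompt-template text embeddings per class and weight them by their similarity to the current image feature, yielding an image-conditioned anchor per class. Every name here (`adaptive_anchors`, `prompt_embs`) is hypothetical and not taken from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_anchors(img_feat, prompt_embs):
    """Hypothetical prompt-aware calibration sketch.

    img_feat:    (D,)      image feature from the student/teacher encoder.
    prompt_embs: (C, P, D) text embeddings of P prompt templates per class.

    Each class's P prompt variants are weighted by their dot-product
    similarity to the image feature, producing one image-conditioned
    semantic anchor per class, shape (C, D).
    """
    d = prompt_embs.shape[-1]
    # (C, P): similarity of every prompt variant to this image, scaled.
    w = softmax(prompt_embs @ img_feat / np.sqrt(d), axis=1)
    # Convex combination of prompt embeddings per class.
    return np.einsum('cp,cpd->cd', w, prompt_embs)
```

If all P templates of a class embed identically, the weights are irrelevant and the anchor collapses to that single embedding, so the mechanism only deviates from a fixed prompt when the templates actually disagree.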

📝 Abstract
Distilling knowledge from large Vision-Language Models (VLMs) into lightweight networks is crucial yet challenging in Fine-Grained Visual Classification (FGVC), due to the reliance on fixed prompts and global alignment. To address this, we propose PAND (Prompt-Aware Neighborhood Distillation), a two-stage framework that decouples semantic calibration from structural transfer. First, we incorporate Prompt-Aware Semantic Calibration to generate adaptive semantic anchors. Second, we introduce a neighborhood-aware structural distillation strategy to constrain the student's local decision structure. PAND consistently outperforms state-of-the-art methods on four FGVC benchmarks. Notably, our ResNet-18 student achieves 76.09% accuracy on CUB-200, surpassing the strong baseline VL2Lite by 3.4%. Code is available at https://github.com/LLLVTA/PAND.
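The abstract does not give the second stage's loss in closed form. As a minimal sketch under stated assumptions, one way to "constrain the student's local decision structure" is to match the student's similarity distribution over each sample's k nearest neighbors, with neighborhoods defined in teacher embedding space. The function name and the KL form are assumptions for illustration, not the authors' actual objective:

```python
import numpy as np

def _softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def neighborhood_distill_loss(t_emb, s_emb, k=3, tau=0.1):
    """Hypothetical neighborhood-aware structural distillation sketch.

    t_emb, s_emb: (N, D) teacher and student embeddings for one batch.
    For each sample, take its k nearest neighbors in teacher space and
    penalize the KL divergence between teacher and student similarity
    distributions restricted to that neighborhood.
    """
    t = t_emb / np.linalg.norm(t_emb, axis=1, keepdims=True)
    s = s_emb / np.linalg.norm(s_emb, axis=1, keepdims=True)
    sim_t = t @ t.T  # (N, N) cosine similarities
    sim_s = s @ s.T
    n = len(t)
    loss = 0.0
    for i in range(n):
        # k nearest neighbors of sample i in teacher space, excluding self.
        order = np.argsort(-sim_t[i])
        nbrs = order[order != i][:k]
        p = _softmax(sim_t[i, nbrs] / tau)  # teacher's local structure
        q = _softmax(sim_s[i, nbrs] / tau)  # student's local structure
        loss += np.sum(p * (np.log(p) - np.log(q)))
    return loss / n
```

The loss is zero when the student reproduces the teacher's local neighborhood structure exactly, and positive otherwise, which is the sense in which it enforces local decision-structure consistency rather than global alignment.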
Problem

Research questions and friction points this paper is trying to address.

Fine-Grained Visual Classification
Knowledge Distillation
Vision-Language Models
Lightweight Networks
Prompt Dependency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Prompt-Aware Semantic Calibration
Neighborhood-Aware Distillation
Fine-Grained Visual Classification
Vision-Language Model Distillation
Lightweight Network