Data-Efficient Stream-Based Active Distillation for Scalable Edge Model Deployment

📅 2025-09-24
🤖 AI Summary
To address the high transmission overhead and low data-utilization efficiency of continual model updating in edge camera systems, this paper proposes an efficient active-learning framework tailored to streaming video data. Methodologically, it combines high-confidence sample filtering with diversity-aware sampling, coupled with server-side, large-model-guided lightweight knowledge distillation, enabling high-quality model updates at the edge under low query rates. Its key innovation is a “selection–distillation” co-optimization paradigm that substantially reduces both image transmission and manual annotation costs. Experiments show that, at comparable training iterations, the method matches baseline model performance using only ~10% of the queried data, improves accuracy by 2.3–4.7%, and increases deployment scalability by 3.2×.
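The “selection–distillation” loop described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `conf_fn`, `embed_fn`, and all thresholds are hypothetical stand-ins for the edge model's confidence score and a feature extractor.

```python
import numpy as np

def select_frames(stream, conf_fn, embed_fn, conf_thresh=0.8,
                  div_thresh=0.5, budget=100):
    """Stream-based selection: keep frames the edge model is confident
    about (so teacher pseudo-labels are likely reliable) that are also
    diverse in embedding space, up to a query budget."""
    selected, embeddings = [], []
    for frame in stream:
        if len(selected) >= budget:
            break  # transmission budget exhausted
        if conf_fn(frame) < conf_thresh:
            continue  # low-confidence frames tend to yield noisy pseudo-labels
        e = embed_fn(frame)
        # diversity check: skip frames too close to anything already kept
        if embeddings and min(np.linalg.norm(e - x) for x in embeddings) < div_thresh:
            continue
        selected.append(frame)
        embeddings.append(e)
    return selected
```

Only the frames returned by `select_frames` would be transmitted to the server for teacher annotation, which is how the query rate stays low.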

📝 Abstract
Edge camera-based systems are continuously expanding, facing ever-evolving environments that require regular model updates. In practice, complex teacher models are run on a central server to annotate data, which is then used to train smaller models tailored to the edge devices with limited computational power. This work explores how to select the most useful images for training to maximize model quality while keeping transmission costs low. Our work shows that, for a similar training load (i.e., iterations), a high-confidence stream-based strategy coupled with a diversity-based approach produces a high-quality model with minimal dataset queries.
Problem

Research questions and friction points this paper is trying to address.

Selecting the most useful images for training edge models efficiently
Maximizing model quality while minimizing transmission costs
Developing data-efficient strategies for scalable edge deployment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Stream-based active distillation for edge model deployment
High-confidence strategy with diversity-based image selection
Minimizes dataset queries while maximizing model quality
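On the server side, the teacher's predictions on the queried frames are typically softened into pseudo-label distributions for training the lightweight student. Below is a generic knowledge-distillation step sketched under the assumption of a classification-style head; the function name and temperature value are illustrative, not from the paper.

```python
import numpy as np

def distillation_targets(teacher_logits, temperature=2.0):
    """Soften teacher logits into pseudo-label distributions via a
    temperature-scaled softmax (standard knowledge distillation)."""
    z = np.asarray(teacher_logits, dtype=float) / temperature
    z -= z.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```

A higher temperature flattens the distribution, exposing the teacher's relative preferences among non-top classes to the student.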
Dani Manjah
ICTEAM, UCLouvain, 1348 Louvain-la-Neuve, Belgium
Tim Bary
ICTEAM, UCLouvain, 1348 Louvain-la-Neuve, Belgium
Benoît Gérin
PhD candidate, Université Catholique de Louvain
Benoît Macq
ICTEAM, UCLouvain, 1348 Louvain-la-Neuve, Belgium
Christophe de Vleeschouwer
ICTEAM, UCLouvain, 1348 Louvain-la-Neuve, Belgium