EchoDistill: Bidirectional Concept Distillation for One-Step Diffusion Personalization

📅 2025-10-23
🤖 AI Summary
One-step diffusion models struggle to effectively learn novel concepts, hindering personalized text-to-image generation. To address this, we propose EchoDistill, a bidirectional concept distillation framework. It employs a multi-step diffusion model as the teacher and a one-step model as the student, introducing a novel “bidirectional concept echoing” mechanism: knowledge is distilled from teacher to student, while the student’s gradients are back-propagated to refine the teacher’s concept modeling capability. A shared text encoder ensures semantic consistency, and an adversarial alignment loss enforces cross-model representation alignment. Experiments demonstrate that EchoDistill preserves the inference speed advantage of one-step sampling while significantly improving fidelity and generalization for novel concepts—outperforming state-of-the-art personalized generation methods. This establishes an efficient, robust paradigm for single-step personalized image synthesis.

📝 Abstract
Recent advances in accelerating text-to-image (T2I) diffusion models have enabled the synthesis of high-fidelity images even in a single step. However, personalizing these models to incorporate novel concepts remains a challenge due to the limited capacity of one-step models to capture new concept distributions effectively. We propose a bidirectional concept distillation framework, EchoDistill, to enable one-step diffusion personalization (1-SDP). Our approach involves an end-to-end training process where a multi-step diffusion model (teacher) and a one-step diffusion model (student) are trained simultaneously. The concept is first distilled from the teacher model to the student, and then echoed back from the student to the teacher. During EchoDistill training, we share the text encoder between the two models to ensure consistent semantic understanding. The student model is then optimized with adversarial losses to align with the real image distribution and with alignment losses to maintain consistency with the teacher's output. Furthermore, we introduce a bidirectional echoing refinement strategy, wherein the student model leverages its faster generation capability to provide feedback to the teacher model. This bidirectional concept distillation mechanism not only enhances the student's ability to personalize novel concepts but also improves the generative quality of the teacher model. Our experiments demonstrate that this collaborative framework significantly outperforms existing personalization methods under the 1-SDP setup, establishing a novel paradigm for rapid and effective personalization in T2I diffusion models.
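The training loop described above (shared text encoder, teacher-to-student distillation, adversarial alignment, and the student-to-teacher "echo") can be sketched as follows. This is a minimal illustrative sketch with toy stand-in networks, not the paper's implementation: the module names, loss weights, and linear-layer stand-ins for the diffusion models are all assumptions for illustration.

```python
# Hedged sketch of one EchoDistill-style training step.
# Stand-in linear networks replace the real multi-step teacher and
# one-step student diffusion models; names and weights are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

class TextEncoder(nn.Module):
    """Shared text encoder, so teacher and student see identical conditioning."""
    def __init__(self, d=8):
        super().__init__()
        self.proj = nn.Linear(4, d)
    def forward(self, tokens):
        return self.proj(tokens)

class Generator(nn.Module):
    """Toy stand-in for a (multi-step or one-step) diffusion generator."""
    def __init__(self, d=8):
        super().__init__()
        self.net = nn.Linear(d, 16)
    def forward(self, cond):
        return self.net(cond)

class Discriminator(nn.Module):
    """Toy critic for the adversarial term (its own update loop is omitted)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(16, 1)
    def forward(self, img):
        return self.net(img)

text_enc = TextEncoder()
teacher, student = Generator(), Generator()
disc = Discriminator()
opt = torch.optim.Adam(
    list(text_enc.parameters()) + list(teacher.parameters())
    + list(student.parameters()), lr=1e-3)

tokens = torch.randn(2, 4)   # toy prompt embeddings for the novel concept
cond = text_enc(tokens)      # shared semantic conditioning

t_out = teacher(cond)        # teacher output (multi-step in the real method)
s_out = student(cond)        # one-step student output

# Teacher -> student distillation: student matches a frozen teacher target.
align_loss = nn.functional.mse_loss(s_out, t_out.detach())
# Adversarial term pulling the student toward the real image distribution.
adv_loss = -disc(s_out).mean()
# Student -> teacher "echo": teacher refined against a frozen student target.
echo_loss = nn.functional.mse_loss(t_out, s_out.detach())

loss = align_loss + 0.1 * adv_loss + echo_loss  # 0.1 weight is illustrative
opt.zero_grad()
loss.backward()  # gradients reach both generators and the shared encoder
opt.step()
```

The two `detach()` calls make the bidirectional flow explicit: the alignment loss only updates the student, while the echo loss only updates the teacher, yet both terms back-propagate through the shared text encoder, which is one plausible reading of how the echo refines the teacher's concept modeling.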
Problem

Research questions and friction points this paper is trying to address.

Personalizing one-step diffusion models for novel concepts
Bidirectional distillation between teacher and student models
Enhancing concept learning and generation quality simultaneously
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bidirectional concept distillation between teacher and student models
Shared text encoder ensures consistent semantic understanding
Adversarial and alignment losses optimize student model performance
Authors

Yixiong Yang, Harbin Institute of Technology (Shenzhen), China
Tao Wu, Computer Vision Center, Universitat Autònoma de Barcelona, Spain
Senmao Li, Ph.D. student, Nankai University
Shiqi Yang, VCIP, CS, Nankai University, China
Yaxing Wang, Associate professor, Nankai University
Joost van de Weijer, Computer Vision Center, Universitat Autònoma de Barcelona
Kai Wang, Program of Computer Science, City University of Hong Kong (Dongguan), China; City University of Hong Kong, HK SAR, China