FastFace: Tuning Identity Preservation in Distilled Diffusion via Guidance and Attention

📅 2025-05-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
Pretrained ID-adapters are typically trained jointly with base diffusion models, so they transfer poorly to distilled few-step models, degrading identity similarity and fidelity. Method: this paper proposes FastFace, a training-free framework for adapting pretrained ID-adapters to distillation-accelerated diffusion models, built on (1) a redesigned classifier-free guidance strategy suited to few-step stylistic generation and (2) attention manipulation mechanisms in the adapter's decoupled blocks that improve identity similarity and image fidelity. Contributions/Results: improved identity preservation and fidelity under fast 1–4-step sampling without any fine-tuning of the adapter or the base model, plus a disentangled public evaluation protocol for ID-preserving adapters.
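The guidance component builds on standard classifier-free guidance, which extrapolates from an unconditional noise prediction toward a conditional one. The paper redesigns how this is applied in few-step distilled sampling; the sketch below shows only the generic CFG combination step (the function name and toy inputs are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def classifier_free_guidance(eps_uncond, eps_cond, scale):
    """Standard CFG: extrapolate from the unconditional noise
    prediction toward the conditional one by a guidance scale."""
    return eps_uncond + scale * (eps_cond - eps_uncond)

# Toy noise predictions for a single flattened latent.
eps_uncond = np.zeros(4)
eps_cond = np.ones(4)

# scale = 1.0 recovers the purely conditional prediction; distilled
# few-step models typically need the guidance rule rethought, which
# is the part FastFace redesigns.
guided = classifier_free_guidance(eps_uncond, eps_cond, scale=1.0)
```

With `scale > 1` the prediction is pushed beyond the conditional branch, which is where naive CFG tends to break down in 1–4-step sampling.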

📝 Abstract
In recent years, a plethora of identity-preserving adapters for personalized generation with diffusion models has been released. Their main disadvantage is that they are predominantly trained jointly with base diffusion models, which suffer from slow multi-step inference. This work tackles the training-free adaptation of pretrained ID-adapters to diffusion models accelerated via distillation: through a careful redesign of classifier-free guidance for few-step stylistic generation, and attention manipulation mechanisms in decoupled blocks that improve identity similarity and fidelity, we propose the universal FastFace framework. Additionally, we develop a disentangled public evaluation protocol for ID-preserving adapters.
Problem

Research questions and friction points this paper is trying to address.

Adapting pretrained ID-adapters to distilled diffusion models without training
Redesigning guidance and attention for few-step identity-preserving generation
Improving identity similarity and fidelity in decoupled diffusion blocks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Training-free adaptation of ID-adapters
Redesigned classifier-free guidance
Attention manipulation in decoupled blocks
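"Decoupled blocks" here refers to the IP-Adapter-style design in which identity features get their own cross-attention branch alongside text cross-attention, so the identity contribution can be manipulated per block. A minimal numpy sketch of that pattern, assuming a single head and a simple additive blend with a hypothetical `id_scale` knob (the paper's actual manipulation is more involved):

```python
import numpy as np

def attention(q, k, v):
    """Plain single-head scaled dot-product attention."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def decoupled_cross_attention(q, text_kv, id_kv, id_scale):
    """Decoupled cross-attention: text tokens and identity tokens are
    attended to in separate branches, and the identity branch is
    blended in with a tunable scale."""
    text_out = attention(q, *text_kv)
    id_out = attention(q, *id_kv)
    return text_out + id_scale * id_out

rng = np.random.default_rng(0)
q = rng.normal(size=(2, 8))                      # image-latent queries
text_kv = (rng.normal(size=(5, 8)), rng.normal(size=(5, 8)))
id_kv = (rng.normal(size=(4, 8)), rng.normal(size=(4, 8)))

# With id_scale = 0 the identity branch is disabled and the output
# reduces to ordinary text cross-attention.
out = decoupled_cross_attention(q, text_kv, id_kv, id_scale=0.0)
```

Because the two branches are separate, `id_scale` (or a richer per-block schedule, as in this paper) can be tuned at inference time without retraining either the adapter or the base model.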