On the Collapse Errors Induced by the Deterministic Sampler for Diffusion Models

📅 2025-08-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work identifies a previously unrecognized systematic error, termed "collapse errors," in deterministic ODE-based sampling of diffusion models, in which sampling trajectories abnormally concentrate in local regions of data space, degrading sample diversity. To characterize this phenomenon, the authors formally define collapse errors and introduce a novel metric for quantifying local concentration. They further observe a noise-level-dependent trade-off (a "see-saw effect") in score estimation accuracy between high- and low-noise regimes: score learning in low-noise regimes degrades score accuracy in high-noise regimes, revealing a strong coupling between deterministic sampling dynamics and score function training. Guided by these insights, they apply existing techniques from sampling, training, and architecture to empirically support this explanation. Extensive experiments across multiple datasets and model variants demonstrate that collapse errors are pervasive. The work offers both a conceptual framework and empirical evidence for understanding the interplay between deterministic sampling and score learning in diffusion models.

📝 Abstract
Despite the widespread adoption of deterministic samplers in diffusion models (DMs), their potential limitations remain largely unexplored. In this paper, we identify collapse errors, a previously unrecognized phenomenon in ODE-based diffusion sampling, where the sampled data is overly concentrated in local data space. To quantify this effect, we introduce a novel metric and demonstrate that collapse errors occur across a variety of settings. When investigating its underlying causes, we observe a see-saw effect, where score learning in low-noise regimes adversely impacts score learning in high-noise regimes. This misfitting in high-noise regimes, coupled with the dynamics of deterministic samplers, ultimately causes collapse errors. Guided by these insights, we apply existing techniques from sampling, training, and architecture to empirically support our explanation of collapse errors. This work provides intensive empirical evidence of collapse errors in ODE-based diffusion sampling, emphasizing the need for further research into the interplay between score learning and deterministic sampling, an overlooked yet fundamental aspect of diffusion models.
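The abstract mentions a novel metric for quantifying how concentrated sampled data is in local data space, but does not spell out its definition. As a hedged illustration only (this is not the paper's metric), a simple nearest-neighbor proxy for local concentration could look like:

```python
import math
import random

def local_concentration(samples, k=5):
    """Illustrative proxy for 'collapse': the mean distance from each
    sample to its k-th nearest neighbor. Smaller values indicate that
    samples are more tightly concentrated in local regions.
    NOTE: a sketch only, not the metric defined in the paper."""
    kth_dists = []
    for i, x in enumerate(samples):
        dists = sorted(math.dist(x, y) for j, y in enumerate(samples) if j != i)
        kth_dists.append(dists[k - 1])
    return sum(kth_dists) / len(kth_dists)

random.seed(0)
# Well-spread 2D samples vs. samples collapsed into a small region.
spread = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200)]
collapsed = [(random.gauss(0, 0.05), random.gauss(0, 0.05)) for _ in range(200)]
assert local_concentration(collapsed) < local_concentration(spread)
```

Under this proxy, a collapsed sample set scores markedly lower than a well-spread one, which is the qualitative behavior any such concentration metric would need to capture.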
Problem

Research questions and friction points this paper is trying to address.

Identifies collapse errors in deterministic diffusion model sampling
Quantifies data concentration issues in ODE-based diffusion methods
Analyzes the score-learning misfit between noise regimes that causes these errors
Innovation

Methods, ideas, or system contributions that make the work stand out.

Identifies collapse errors in deterministic diffusion samplers
Introduces novel metric to quantify data concentration issues
Proposes the see-saw effect as an explanation for the score-learning misfit
Yi Zhang
Institute of Data Science, The University of Hong Kong
Zhenyu Liao
School of Electronic Information and Communications, Huazhong University of Science and Technology
Jingfeng Wu
University of California, Berkeley
deep learning theory, machine learning, optimization, statistical learning theory
Difan Zou
The University of Hong Kong
Machine Learning, Deep Learning, Optimization, Stochastic Algorithms, Signal Processing