BézierFlow: Bézier Stochastic Interpolant Schedulers for Few-Step Generation

📅 2025-12-15
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the challenge of few-step generation with pretrained diffusion and flow models, this paper proposes BézierFlow, a lightweight training method that, for the first time, models the sampling scheduler as a Bézier function satisfying boundary constraints, monotonic signal-to-noise ratio (SNR), and differentiability, thereby enabling optimization over continuous sampling trajectories rather than discrete timesteps. By combining Bézier parameterization with stochastic interpolant scheduling, BézierFlow learns trajectories efficiently with only 15 minutes of fine-tuning. With ≤10 neural function evaluations (NFEs), it delivers a 2–3× sampling speedup across diverse pretrained models, significantly outperforming existing timestep-learning approaches while preserving generation quality. The core contribution is extending scheduler optimization from discrete temporal grids to a continuous, differentiable trajectory space, enabling principled, geometry-aware control of the sampling path.

📝 Abstract
We introduce BézierFlow, a lightweight training approach for few-step generation with pretrained diffusion and flow models. BézierFlow achieves a 2-3x performance improvement for sampling with $\leq$ 10 NFEs while requiring only 15 minutes of training. Recent lightweight training approaches have shown promise by learning optimal timesteps, but their scope remains restricted to ODE discretizations. To broaden this scope, we propose learning the optimal transformation of the sampling trajectory by parameterizing stochastic interpolant (SI) schedulers. The main challenge lies in designing a parameterization that satisfies critical desiderata, including boundary conditions, differentiability, and monotonicity of the SNR. To effectively meet these requirements, we represent scheduler functions as Bézier functions, where control points naturally enforce these properties. This reduces the problem to learning an ordered set of points in the time range, while the interpretation of the points changes from ODE timesteps to Bézier control points. Across a range of pretrained diffusion and flow models, BézierFlow consistently outperforms prior timestep-learning methods, demonstrating the effectiveness of expanding the search space from discrete timesteps to Bézier-based trajectory transformations.
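The abstract's key idea, representing the scheduler as a Bézier function whose control points enforce the desiderata, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the control-point values are hypothetical, and it shows only the standard Bernstein-polynomial evaluation plus the two structural facts the abstract relies on (endpoints pin the boundary conditions, and a non-decreasing control sequence makes the curve monotone).

```python
import math

def bezier(t, ctrl):
    """Evaluate a 1-D Bezier function via the Bernstein basis:
    B(t) = sum_i ctrl[i] * C(n, i) * t^i * (1 - t)^(n - i)."""
    n = len(ctrl) - 1
    return sum(c * math.comb(n, i) * (t ** i) * ((1 - t) ** (n - i))
               for i, c in enumerate(ctrl))

# Illustrative control points (not from the paper). Pinning the first and
# last points to 0 and 1 enforces the boundary conditions B(0)=0, B(1)=1;
# a non-decreasing interior sequence is a sufficient condition for B to be
# monotone, which is what keeps the SNR monotone along the warped trajectory.
ctrl = [0.0, 0.1, 0.6, 1.0]

# Warp a uniform time grid through the Bezier scheduler.
warped = [bezier(k / 10, ctrl) for k in range(11)]
```

Because the Bernstein basis is polynomial, `bezier` is differentiable in both `t` and `ctrl`, so the control points can be learned by gradient descent, which is the sense in which the search space becomes continuous.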
Problem

Research questions and friction points this paper is trying to address.

Optimizes sampling trajectories for few-step generation
Learns optimal transformation of stochastic interpolant schedulers
Improves performance with minimal training time
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Bézier functions to parameterize stochastic interpolant schedulers
Learns optimal transformation of sampling trajectory via control points
Achieves few-step generation with lightweight training in 15 minutes
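The abstract notes that the problem reduces to learning an ordered set of points in the time range. One common way to keep such points ordered under unconstrained gradient updates is a softmax-over-increments trick, sketched below; this is a generic technique, and the paper's exact parameterization may differ.

```python
import math

def ordered_points(raw):
    """Map unconstrained real parameters to a strictly increasing
    sequence in (0, 1). Softmax over the raw values yields positive
    increments; their cumulative sum is therefore strictly increasing.
    The extra unit of mass in the denominator keeps the last point
    strictly below 1."""
    exps = [math.exp(r) for r in raw]
    total = sum(exps) + 1.0
    points, acc = [], 0.0
    for e in exps:
        acc += e / total
        points.append(acc)
    return points

# Three unconstrained parameters, as an optimizer might hold them.
pts = ordered_points([0.0, 0.0, 0.0])
```

Under this scheme the same ordered parameters can serve either as ODE timesteps or, as in the paper's framing, as Bézier control points; only the interpretation changes.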