🤖 AI Summary
This work addresses the fundamental trade-off between sampling efficiency and training stability in generative modeling. We propose a simulation-free, one-step generative framework. Methodologically, we construct a provable one-step mapping by solving the probability flow ODE, thereby unifying the efficient sampling of GANs with the training stability of flow-based models; the velocity field is estimated via nonparametric regression, and the characteristic curves are approximated jointly by Euler discretization and a deep neural network. Theoretically, we establish the first non-asymptotic convergence rate bound in 2-Wasserstein distance for simulation-free one-step generative models, refining existing error analyses of flow-based models. Empirically, our method achieves high-fidelity generation on both synthetic and real-world datasets using only a single forward pass, demonstrating both theoretical rigor and practical efficacy.
📝 Abstract
We propose the characteristic generator, a novel one-step generative model that combines the sampling efficiency of Generative Adversarial Networks (GANs) with the stable performance of flow-based models. Our model is driven by characteristics, along which the probability density transport can be described by ordinary differential equations (ODEs). Specifically, we estimate the velocity field through nonparametric regression and use the Euler method to solve the probability flow ODE, generating a series of discrete approximations to the characteristics. We then use a deep neural network to fit these characteristics, yielding a one-step mapping that effectively pushes the prior distribution towards the target distribution. On the theoretical side, we analyze the errors in velocity matching, Euler discretization, and characteristic fitting to establish a non-asymptotic convergence rate for the characteristic generator in 2-Wasserstein distance. To the best of our knowledge, this is the first thorough analysis of simulation-free one-step generative models. Additionally, our analysis refines the error analysis of flow-based generative models in prior works. We apply our method to both synthetic and real datasets, and the results demonstrate that the characteristic generator achieves high generation quality with just a single evaluation of the neural network.
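The pipeline described above (Euler-discretize the probability flow ODE along characteristics, then fit a one-step map to the resulting trajectories) can be sketched in a toy 1-D setting. This is a minimal illustration, not the paper's implementation: the velocity field is written in closed form for Gaussian marginals (an assumption made here so the example is self-contained), and a least-squares polynomial stands in for both the nonparametric velocity regression and the deep network that fits the characteristics.

```python
import numpy as np

# Toy setting: prior N(0, 1) at t = 0, target N(2, 0.5^2) at t = 1,
# with Gaussian marginals mean m(t), std s(t) along a linear interpolation.
# For Gaussian marginals the probability flow ODE velocity is
#   v(x, t) = m'(t) + (s'(t) / s(t)) * (x - m(t)).
def m(t):                                  # mean path
    return 2.0 * t

def s(t):                                  # std path of x_t = (1-t) z + t y
    return np.sqrt((1.0 - t) ** 2 + (0.5 * t) ** 2)

def velocity(x, t):
    dm = 2.0                               # m'(t)
    ds = (-(1.0 - t) + 0.25 * t) / s(t)    # s'(t) = d(s^2)/dt / (2 s)
    return dm + (ds / s(t)) * (x - m(t))

# Step 1: trace discrete characteristics with Euler steps on [0, 1].
rng = np.random.default_rng(0)
z = rng.standard_normal(5000)              # prior samples at t = 0
x, K = z.copy(), 100                       # K Euler steps
for k in range(K):
    x = x + (1.0 / K) * velocity(x, k / K)

# Step 2: fit a one-step map z -> x(1); a degree-3 polynomial plays the
# role of the deep neural network fitted to the characteristics.
one_step = np.poly1d(np.polyfit(z, x, deg=3))

# Generation is now a single evaluation of the fitted map.
samples = one_step(rng.standard_normal(5000))
print(samples.mean(), samples.std())       # near the target's 2.0 and 0.5
```

The exact characteristic here is the linear map z ↦ 2 + 0.5 z, so the fitted one-step generator recovers the target distribution up to Euler-discretization and fitting error, mirroring the three error sources analyzed in the paper.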