On the minimax optimality of Flow Matching through the connection to kernel density estimation

📅 2025-04-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper investigates the statistical optimality of Flow Matching (FM) generative models under the Wasserstein distance. Methodologically, it establishes, for the first time, a theoretical connection between FM and kernel density estimation (KDE). The contributions are threefold: (1) it improves on existing convergence-rate bounds for the Gaussian KDE; (2) it shows that, with sufficiently expressive neural network approximations, FM achieves the minimax-optimal rate (up to logarithmic factors) under the Wasserstein-1 distance; (3) it establishes faster convergence rates for FM when the target distribution is supported on a lower-dimensional linear subspace embedded in a high-dimensional ambient space, giving a first rigorous explanation for FM's effectiveness in high dimensions. Collectively, these results unify the theoretical analyses of generative modeling and nonparametric density estimation, thereby establishing a solid statistical foundation for Flow Matching.
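The FM↔KDE connection rests on a simple fact: drawing a sample from a Gaussian KDE is the same as picking a training point uniformly at random and adding Gaussian noise with the kernel bandwidth. A minimal sketch of that sampling view (a generic illustration, not code from the paper):

```python
import numpy as np

def sample_gaussian_kde(data, bandwidth, n_samples, rng=None):
    """Draw samples from a Gaussian KDE fitted to `data` of shape (n, d).

    Sampling from the mixture-of-Gaussians density that the KDE defines
    is equivalent to choosing a data point uniformly at random and
    perturbing it with N(0, bandwidth^2 * I) noise.
    """
    rng = np.random.default_rng(rng)
    idx = rng.integers(0, len(data), size=n_samples)          # uniform mixture component
    noise = rng.normal(scale=bandwidth, size=(n_samples, data.shape[1]))
    return data[idx] + noise
```

As the bandwidth shrinks to zero this degenerates to resampling the data itself; the bandwidth choice is what drives the convergence rates the paper analyzes.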

📝 Abstract
Flow Matching has recently gained attention in generative modeling as a simple and flexible alternative to diffusion models, the current state of the art. While existing statistical guarantees adapt tools from the analysis of diffusion models, we take a different perspective by connecting Flow Matching to kernel density estimation. We first verify that the kernel density estimator matches the optimal rate of convergence in Wasserstein distance up to logarithmic factors, improving existing bounds for the Gaussian kernel. Based on this result, we prove that for sufficiently large networks, Flow Matching also achieves the optimal rate up to logarithmic factors, providing a theoretical foundation for the empirical success of this method. Finally, we provide a first justification of Flow Matching's effectiveness in high-dimensional settings by showing that rates improve when the target distribution lies on a lower-dimensional linear subspace.
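For context on the object being analyzed, Flow Matching trains a time-dependent vector field by regressing onto the conditional velocity of a simple probability path between noise and data. A minimal Monte Carlo sketch of that loss for the common linear path (the callable `vector_field` is a hypothetical stand-in for a neural network; this illustrates the standard conditional FM objective, not the paper's specific construction):

```python
import numpy as np

def conditional_fm_loss(vector_field, x1, rng=None):
    """Monte Carlo estimate of the conditional Flow Matching loss for
    the linear path x_t = (1 - t) * x0 + t * x1, whose conditional
    target velocity is x1 - x0.

    `x1` holds data samples of shape (n, d); x0 is drawn from a
    standard Gaussian source distribution.
    """
    rng = np.random.default_rng(rng)
    x0 = rng.standard_normal(x1.shape)       # noise samples from the source
    t = rng.uniform(size=(x1.shape[0], 1))   # one random time per sample
    x_t = (1.0 - t) * x0 + t * x1            # point on the interpolating path
    target = x1 - x0                         # conditional velocity to regress onto
    pred = vector_field(x_t, t)
    return np.mean(np.sum((pred - target) ** 2, axis=1))
```

The paper's guarantees concern the distribution generated by integrating a vector field that (approximately) minimizes this objective.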
Problem

Research questions and friction points this paper is trying to address.

Analyzing minimax optimality of Flow Matching in generative modeling
Connecting Flow Matching to kernel density estimation for theoretical guarantees
Demonstrating improved rates when the target distribution lies on a low-dimensional linear subspace of a high-dimensional space
Innovation

Methods, ideas, or system contributions that make the work stand out.

Connects Flow Matching to kernel density estimation
Proves a minimax-optimal convergence rate (up to logarithmic factors) for Flow Matching
Justifies effectiveness in high-dimensional settings