🤖 AI Summary
This work addresses scalability and stability challenges in computing optimal transport maps and Wasserstein barycenters in high-dimensional spaces. We propose an end-to-end framework based on Conditional Normalizing Flows (CNFs) that jointly maps multiple source distributions into a shared latent space via invertible transformations, directly modeling the transport maps and enabling closed-form barycenter computation while avoiding conventional dual optimization and adversarial training. To our knowledge, this is the first method to employ CNFs for joint learning of multi-source transport maps, supporting efficient barycenter estimation for up to hundreds of input distributions. Experiments demonstrate that our approach significantly outperforms existing state-of-the-art methods on high-dimensional tasks, with substantial gains in accuracy, computational efficiency, and training stability.
📝 Abstract
We present a novel method for efficiently computing optimal transport maps and Wasserstein barycenters in high-dimensional spaces. Our approach uses conditional normalizing flows to approximate the input distributions as invertible pushforward transformations from a common latent space. This makes it possible to directly solve the primal problem using gradient-based minimization of the transport cost, unlike previous methods that rely on dual formulations and complex adversarial optimization. We show how this approach can be extended to compute Wasserstein barycenters by solving a conditional variance minimization problem. A key advantage of our conditional architecture is that it enables the computation of barycenters for hundreds of input distributions, which was computationally infeasible with previous methods. Our numerical experiments illustrate that our approach yields accurate results across various high-dimensional tasks and compares favorably with previous state-of-the-art methods.
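To make the closed-form barycenter idea concrete, here is a minimal toy sketch. It stands in for the learned conditional flow with one invertible affine map per source distribution (all names and the affine parameterization are illustrative assumptions, not the paper's architecture): every source shares a common standard-normal latent space, and for each latent sample the barycenter point is the weighted mean of the pushforward images, which minimizes the conditional variance across sources.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a conditional normalizing flow: one invertible affine
# map per source distribution (scale s_i, shift m_i), all sharing a
# common standard-normal latent space. Parameters are illustrative.
scales = np.array([1.0, 2.0, 0.5])   # s_i for three 1-D sources
shifts = np.array([-2.0, 0.0, 4.0])  # m_i

def push_forward(z, i):
    """Map latent samples z to source i: T_i(z) = s_i * z + m_i."""
    return scales[i] * z + shifts[i]

# Shared latent samples (the "common latent space").
z = rng.standard_normal(10_000)

# Barycenter map: for each latent z, the point b(z) minimizing
# sum_i w_i * ||T_i(z) - b(z)||^2 is the weighted mean of the T_i(z),
# so the barycenter is available in closed form once the maps are known.
weights = np.full(3, 1.0 / 3.0)
samples = np.stack([push_forward(z, i) for i in range(3)])  # shape (3, n)
barycenter = weights @ samples                               # b(z) per sample

# Sanity check: for 1-D Gaussians, the Wasserstein-2 barycenter is
# Gaussian with mean = average of means and std = average of stds.
print(barycenter.mean(), barycenter.std())
```

In this Gaussian special case the empirical mean and standard deviation of `barycenter` match the known analytic barycenter (mean of the shifts, mean of the scales), which is exactly the consistency the conditional-variance formulation exploits; the paper's method learns the invertible maps with a CNF rather than fixing them by hand.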