AI Summary
To address the high computational complexity and memory bottlenecks of computing Wasserstein distances in large-scale optimal transport, this paper proposes a projection gradient descent framework grounded in orthogonal coupling dynamics. Methodologically, it introduces, for the first time, the conditional expectation as a microscopic evolution mechanism, integrating insights from opinion dynamics to construct a lightweight differential coupling system that enables scalable reconstruction of stochastic transport maps. Unlike conventional infinite-dimensional linear programming approaches, the method avoids their polynomial-time and memory-storage barriers, achieving substantial gains in computational efficiency and memory scalability. Experiments demonstrate high-fidelity recovery of transport maps, enabling real-time learning and deployment of both Wasserstein distances and optimal transport plans. This work establishes a novel paradigm for large-scale optimal transport.
Abstract
Many numerical algorithms and learning tasks rest on the solution of the Monge-Kantorovich problem and the corresponding Wasserstein distances. While the natural approach is to treat the problem as an infinite-dimensional linear program, such a methodology severely limits computational performance due to polynomial scaling with respect to the sample size, along with intensive memory requirements. We propose a novel alternative framework for the Monge-Kantorovich problem based on a projection-type gradient descent scheme. The micro-dynamics is built on the notion of the conditional expectation, where the connection with opinion dynamics is explored and leveraged to build compact numerical schemes. We demonstrate that the devised dynamics recovers random maps with favourable computational performance. Along with theoretical insight, the proposed dynamics paves the way for innovative approaches to constructing numerical schemes for computing optimal transport maps as well as Wasserstein distances.
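To make the conditional-expectation idea concrete, the toy sketch below recovers a one-dimensional transport map as the conditional expectation (barycentric projection) of a coupling between two empirical samples. The 1-D setup, the sorted-pairing coupling, and all variable names are illustrative assumptions for this note, not the paper's dynamics or scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D samples: source X ~ N(0, 1), target Y ~ N(1, 4),
# drawn independently (hypothetical setup for illustration).
n = 500
x = np.sort(rng.normal(0.0, 1.0, n))
y = np.sort(2.0 * rng.normal(0.0, 1.0, n) + 1.0)

# In one dimension the optimal coupling is monotone, so pairing the
# sorted samples already yields an optimal plan: a permutation
# coupling with uniform mass 1/n on the diagonal.
coupling = np.eye(n) / n

# Conditional expectation E[Y | X = x_i] under this coupling, i.e. the
# barycentric projection of the plan onto a deterministic map.
marginal = coupling.sum(axis=1)      # source marginal weights, all 1/n
t_map = (coupling @ y) / marginal    # recovered map evaluated at x_i

# For a diagonal coupling the projected map reproduces the matched
# target points exactly.
print(np.allclose(t_map, y))
```

With a non-deterministic coupling (mass spread across several targets per source point), the same projection averages the target locations weighted by the plan, which is the sense in which a conditional expectation turns a stochastic plan into a map.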