🤖 AI Summary
This work addresses three key challenges of Gromov–Wasserstein (GW) and its partial variant (PGW) in structural matching across metric measure spaces: non-convexity, rigid equal-mass constraints, and high computational complexity. The authors propose the Linearized Partial Gromov–Wasserstein (LPGW) embedding, a scalable linearization technique for the PGW problem. LPGW reduces the number of PGW solves needed for pairwise comparison of K metric measure spaces from O(K²) to O(K), and the paper proves that the resulting construction defines a valid metric on metric measure spaces. The approach combines relaxed partial matching, an explicit embedding construction, and a rigorous metric-theoretic analysis, preserving both the structural expressivity and the robustness of PGW. Empirically, LPGW achieves significant speedups on shape retrieval and transport-based representation learning tasks while matching or surpassing PGW's accuracy. The implementation is publicly available.
📝 Abstract
The Gromov-Wasserstein (GW) problem, a variant of the classical optimal transport (OT) problem, has attracted growing interest in the machine learning and data science communities due to its ability to quantify similarity between measures in different metric spaces. However, like the classical OT problem, GW imposes an equal mass constraint between measures, which restricts its application in many machine learning tasks. To address this limitation, the partial Gromov-Wasserstein (PGW) problem has been introduced. It relaxes the equal mass constraint, allowing the comparison of general positive Radon measures. Despite this, both GW and PGW face significant computational challenges due to their non-convex nature. To overcome these challenges, we propose the linear partial Gromov-Wasserstein (LPGW) embedding, a linearized embedding technique for the PGW problem. For $K$ different metric measure spaces, the pairwise computation of the PGW distance requires solving the PGW problem $O(K^2)$ times. In contrast, the proposed linearization technique reduces this to $O(K)$ times. Similar to the linearization technique for the classical OT problem, we prove that LPGW defines a valid metric for metric measure spaces. Finally, we demonstrate the effectiveness of LPGW in practical applications such as shape retrieval and learning with transport-based embeddings, showing that LPGW preserves the advantages of PGW in partial matching while significantly enhancing computational efficiency. The code is available at https://github.com/mint-vu/Linearized_Partial_Gromov_Wasserstein.
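The $O(K^2)$ vs. $O(K)$ claim can be made concrete with a call-counting sketch. Below, `pgw_solve` is a hypothetical stand-in for the (expensive, non-convex) PGW solver, not the paper's implementation: the pairwise approach solves PGW once per pair of spaces, whereas the linearized approach solves it only once per space against a fixed reference, after which embeddings can be compared cheaply.

```python
import numpy as np

def pgw_solve(X, Y):
    """Hypothetical stand-in for a PGW solver; only counts invocations.

    A real solver would return an optimal (partial) transport plan; here we
    return a placeholder so the call-count comparison stays self-contained.
    """
    pgw_solve.calls += 1
    return 0.0

K = 10
spaces = [np.random.rand(5, 2) for _ in range(K)]

# Pairwise approach: one PGW solve per unordered pair -> K*(K-1)/2 solves.
pgw_solve.calls = 0
for i in range(K):
    for j in range(i + 1, K):
        pgw_solve(spaces[i], spaces[j])
pairwise_calls = pgw_solve.calls

# Linearized approach: one PGW solve per space against a fixed reference
# -> K solves. Pairwise comparisons are then read off from the embeddings
# (in LPGW, via the transport plans to the reference) at negligible cost.
pgw_solve.calls = 0
reference = np.random.rand(5, 2)
embeddings = [pgw_solve(reference, X) for X in spaces]
linear_calls = pgw_solve.calls

print(pairwise_calls, linear_calls)
```

For $K = 10$ this prints `45 10`; the gap grows quadratically with $K$, which is the source of the reported speedups.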