🤖 AI Summary
This work addresses the challenge of unifying discrete and continuous conditioning variables within Conditional Optimal Transport (COT). To this end, we propose a neural COT mapping framework. Methodologically, we design a hypernetwork-driven conditional parameter generation mechanism to jointly model mixed-type conditioning variables; integrate differentiable Wasserstein optimization with neural OT learning into an end-to-end trainable architecture; and incorporate a global sensitivity analysis module. Our key contribution is the first incorporation of hypernetworks into COT, departing from conventional conditional encoding paradigms and thereby significantly enhancing the expressivity and adaptability of the learned transport maps. Experiments demonstrate that our approach outperforms existing COT methods across multiple benchmarks. Furthermore, it is successfully applied to black-box model interpretability analysis, efficiently generating OT-based sensitivity indices.
📝 Abstract
We present a neural framework for learning conditional optimal transport (OT) maps between probability distributions. Our approach introduces a conditioning mechanism capable of processing both categorical and continuous conditioning variables simultaneously. At the core of our method lies a hypernetwork that generates transport layer parameters based on these inputs, creating adaptive mappings that outperform simpler conditioning methods. Comprehensive ablation studies demonstrate the superior performance of our method over baseline configurations. Furthermore, we showcase an application to global sensitivity analysis, offering high performance in computing OT-based sensitivity indices. This work advances the state-of-the-art in conditional optimal transport, enabling broader application of optimal transport principles to complex, high-dimensional domains such as generative modeling and black-box model explainability.
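To make the conditioning mechanism concrete, the sketch below shows the core idea in minimal form: a hypernetwork maps a mixed condition (a one-hot-encoded categorical part concatenated with a continuous part) to the parameters of a transport layer, so the map itself changes with the condition. The helper names (`encode_condition`, `HyperLinearMap`), the single linear transport layer, and the residual-near-identity parameterization are illustrative assumptions for this sketch, not the paper's actual architecture or training procedure (no Wasserstein loss is shown here).

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_condition(cat_index, n_categories, cont):
    """Encode a mixed condition: one-hot for the categorical part,
    concatenated with the continuous part."""
    one_hot = np.zeros(n_categories)
    one_hot[cat_index] = 1.0
    return np.concatenate([one_hot, np.atleast_1d(cont)])

class HyperLinearMap:
    """Hypernetwork emitting parameters of a condition-dependent
    transport layer T(x; c) = x + W(c) x + b(c).

    The residual form keeps the map near the identity at initialization,
    a common (assumed) choice for neural OT parameterizations.
    """
    def __init__(self, cond_dim, x_dim, hidden=16):
        self.x_dim = x_dim
        # Hypernetwork weights: condition -> hidden -> flattened (W, b).
        self.H1 = rng.normal(0.0, 0.1, (hidden, cond_dim))
        self.H2 = rng.normal(0.0, 0.1, (x_dim * x_dim + x_dim, hidden))

    def params(self, c):
        """Generate the transport layer's weights from the condition c."""
        h = np.tanh(self.H1 @ c)
        theta = self.H2 @ h
        W = theta[: self.x_dim ** 2].reshape(self.x_dim, self.x_dim)
        b = theta[self.x_dim ** 2:]
        return W, b

    def transport(self, x, c):
        """Apply the condition-specific map to a sample x."""
        W, b = self.params(c)
        return x + W @ x + b

# Usage: same sample, two different categorical conditions
# yield two different transport maps.
hyper = HyperLinearMap(cond_dim=5, x_dim=2)
c1 = encode_condition(0, n_categories=3, cont=[0.5, -0.2])
c2 = encode_condition(2, n_categories=3, cont=[0.5, -0.2])
x = np.array([1.0, 2.0])
y1 = hyper.transport(x, c1)
y2 = hyper.transport(x, c2)
```

In a full implementation the hypernetwork would generate parameters for a deeper transport network and be trained end-to-end against an OT objective; this fragment only illustrates how mixed-type conditions flow through a hypernetwork into per-condition map parameters.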