Wasserstein Transfer Learning

📅 2025-05-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing transfer learning frameworks are restricted to scalar or vector outputs in Euclidean space, failing to address regression tasks where outputs are probability distributions. Method: This paper introduces the first transfer learning framework for distributional regression within the Wasserstein space. It proposes a Wasserstein-distance-based regression model, accompanied by a provably convergent estimator and a data-driven source-domain adaptive weighting scheme that quantifies inter-domain similarity and mitigates negative transfer. Contribution/Results: Theoretically, the estimation convergence rate is governed by the Wasserstein distance between source and target domains. Empirically, the method achieves significant improvements in predictive accuracy on both synthetic and real-world datasets. Crucially, it remains robust in two practical settings: when the set of informative source domains is only partially known, and when it is entirely unknown.
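The two quantities the summary leans on, the Wasserstein distance between domains and a similarity-based source weighting, can be sketched for one-dimensional empirical distributions. This is an illustrative sketch only, not the paper's estimator: the softmax weighting rule and the `temperature` parameter are assumptions made for demonstration.

```python
import math

def wasserstein2_1d(xs, ys):
    """2-Wasserstein distance between two 1-D empirical distributions
    with equally many samples: the L2 distance between sorted samples,
    i.e., between empirical quantile functions."""
    assert len(xs) == len(ys)
    xs, ys = sorted(xs), sorted(ys)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(xs, ys)) / len(xs))

def source_weights(target, sources, temperature=1.0):
    """Hypothetical similarity-based weighting: source domains closer to
    the target in Wasserstein distance receive larger weight (softmax of
    negative distance). An illustrative stand-in for the paper's
    data-driven adaptive weighting scheme."""
    dists = [wasserstein2_1d(target, s) for s in sources]
    exps = [math.exp(-d / temperature) for d in dists]
    z = sum(exps)
    return [e / z for e in exps]
```

With a target sample and two sources, one near and one far, the near source dominates the weights, which is the behavior that mitigates negative transfer from dissimilar domains.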

📝 Abstract
Transfer learning is a powerful paradigm for leveraging knowledge from source domains to enhance learning in a target domain. However, traditional transfer learning approaches often focus on scalar or multivariate data within Euclidean spaces, limiting their applicability to complex data structures such as probability distributions. To address this, we introduce a novel framework for transfer learning in regression models, where outputs are probability distributions residing in the Wasserstein space. When the informative subset of transferable source domains is known, we propose an estimator with provable asymptotic convergence rates, quantifying the impact of domain similarity on transfer efficiency. For cases where the informative subset is unknown, we develop a data-driven transfer learning procedure designed to mitigate negative transfer. The proposed methods are supported by rigorous theoretical analysis and are validated through extensive simulations and real-world applications.
Problem

Research questions and friction points this paper is trying to address.

How to extend transfer learning beyond Euclidean outputs to regression with probability-distribution outputs in Wasserstein space
How to establish asymptotic convergence rates when the transferable source domains are known
How to prevent negative transfer when informative source domains must be selected from data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wasserstein space for distribution outputs
Estimator with asymptotic convergence rates
Data-driven procedure to prevent negative transfer
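In one dimension, distribution-valued outputs can be represented by their quantile functions, and a weighted combination of predictions in Wasserstein space corresponds to a Wasserstein barycenter: averaging sorted samples. A minimal sketch under the assumption of equally sized samples; the aggregation rule is illustrative, not the paper's actual regression model.

```python
def wasserstein_barycenter_1d(samples_list, weights):
    """1-D Wasserstein barycenter of empirical distributions given as
    equally sized sample lists: the weighted average of sorted samples,
    i.e., averaging in quantile-function space."""
    sorted_lists = [sorted(s) for s in samples_list]
    n = len(sorted_lists[0])
    assert all(len(s) == n for s in sorted_lists)
    assert abs(sum(weights) - 1.0) < 1e-9
    return [sum(w * s[i] for w, s in zip(weights, sorted_lists))
            for i in range(n)]
```

For example, combining samples `[0, 1, 2]` and `[2, 3, 4]` with equal weights yields `[1.0, 2.0, 3.0]`, the distribution "midway" between the two in Wasserstein geometry.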