Cross-Domain Few-Shot Learning with Coalescent Projections and Latent Space Reservation

📅 2025-07-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
In cross-domain few-shot learning (CD-FSL), fine-tuning large pre-trained Transformers with scarce labeled samples often leads to overfitting due to excessive parameter updates. To address this, we propose Coalescent Projection (CP), a lightweight fusion projection mechanism that replaces conventional soft prompts and enables feature-space alignment while keeping the backbone frozen. Additionally, we introduce a pseudo-class generation strategy leveraging only base-domain data, integrated with self-supervised transformations (SSTs), to preserve latent semantic structure and mitigate catastrophic forgetting. Our method adopts DINO as the backbone and a prototype classifier as the head, significantly enhancing generalization under extreme domain shifts. Evaluated on the BSCD-FSL benchmark, it consistently outperforms state-of-the-art methods across all settings, demonstrating superior effectiveness and robustness. The implementation is publicly available.
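The prototype-classifier head mentioned above is the standard prototypical-network recipe: average the support embeddings per class and assign each query to the nearest class mean. The following is a minimal numpy sketch of that recipe, not the paper's implementation — in the paper the embeddings would come from a frozen DINO backbone rather than the toy 2-D vectors used here.

```python
import numpy as np

def prototypes(features: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Mean embedding per class (rows ordered by class id)."""
    classes = np.unique(labels)
    return np.stack([features[labels == c].mean(axis=0) for c in classes])

def classify(query: np.ndarray, protos: np.ndarray) -> np.ndarray:
    """Nearest-prototype prediction by Euclidean distance."""
    dists = np.linalg.norm(query[:, None, :] - protos[None, :, :], axis=-1)
    return dists.argmin(axis=1)

# Toy 2-way 2-shot episode with hand-made 2-D "embeddings"
support = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
support_y = np.array([0, 0, 1, 1])
query = np.array([[0., 0.5], [5., 5.5]])

protos = prototypes(support, support_y)
print(classify(query, protos))  # → [0 1]
```

Because the prototypes are just averages, this head adds no trainable parameters — which is exactly why it pairs well with a frozen backbone in the low-shot regime the summary describes.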

📝 Abstract
Despite the progress in Cross-Domain Few-Shot Learning (CD-FSL), even a simple model pre-trained with DINO and combined with a prototypical classifier outperforms the latest SOTA methods. A crucial limitation that needs to be overcome is that updating too many parameters of the transformers leads to overfitting due to the scarcity of labeled samples. To address this challenge, we propose a new concept, Coalescent Projection (CP), as an effective successor to soft prompts. Additionally, we propose a novel pseudo-class generation method combined with Self-Supervised Transformations (SSTs) that relies solely on the base domain to prepare the network for encountering unseen samples from different domains. The proposed method demonstrates its effectiveness in comprehensive experiments on the extreme domain shift scenario of the BSCD-FSL benchmark. Our code is published at https://github.com/Naeem-Paeedeh/CPLSR.
Problem

Research questions and friction points this paper is trying to address.

Overcoming overfitting in Cross-Domain Few-Shot Learning
Reducing parameter updates in transformers for few-shot scenarios
Enhancing generalization to unseen domains with limited labeled data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Coalescent Projection replaces soft prompts
Pseudo-class generation uses Self-Supervised Transformations
Latent space reservation prevents transformer overfitting
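To make the second bullet concrete: a common self-supervised transformation is rotation, where each (base class, rotation) pair is treated as a distinct pseudo-class, multiplying the label space using only base-domain data. The sketch below illustrates that idea with numpy; the paper's exact pseudo-class construction may differ, so treat the rotation choice and the label arithmetic as assumptions for illustration.

```python
import numpy as np

def make_pseudo_classes(images: np.ndarray, labels: np.ndarray, n_classes: int):
    """Expand a base-domain batch with rotated copies.

    Each (class, rotation) pair gets its own pseudo-class label,
    so the label space grows 4x (0°, 90°, 180°, 270°).
    Images are NHWC; rotation is applied over the H and W axes.
    """
    imgs, lbls = [], []
    for k in range(4):
        imgs.append(np.rot90(images, k=k, axes=(1, 2)))
        lbls.append(labels + k * n_classes)
    return np.concatenate(imgs), np.concatenate(lbls)

batch = np.zeros((8, 32, 32, 3))  # dummy batch of 8 images
y = np.arange(8) % 4              # 4 base classes
aug, y_aug = make_pseudo_classes(batch, y, n_classes=4)
print(aug.shape, y_aug.max() + 1)  # → (32, 32, 32, 3) 16
```

Training against these extra pseudo-classes forces the representation to separate more directions in feature space than the base classes alone require, which is one way to read the "latent space reservation" idea in the title.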