MINGLE: Mixtures of Null-Space Gated Low-Rank Experts for Test-Time Continual Model Merging

📅 2025-05-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Data-free continual model merging suffers from severe parameter interference between tasks and weak adaptability to test distributions, leading to catastrophic forgetting and poor adaptation to new tasks. To address this, the paper proposes a test-time online dynamic merging framework with three key components: (1) a null-space constrained gating mechanism that mitigates interference with old-task behavior by restricting gating updates to subspaces orthogonal to prior task representations; (2) an adaptive relaxation strategy that dynamically tunes the constraint strength using a small set of unlabeled test samples; and (3) a mixture-of-experts architecture built from parameter-efficient low-rank experts, jointly improving forgetting suppression and robustness to distribution shifts. On standard continual merging benchmarks, the method outperforms state-of-the-art approaches by 7-9% on average, significantly alleviating forgetting while enhancing cross-task generalization.

📝 Abstract
Continual model merging integrates independently fine-tuned models sequentially without access to original training data, providing a scalable and efficient solution to continual learning. However, current methods still face critical challenges, notably parameter interference among tasks and limited adaptability to evolving test distributions. The former causes catastrophic forgetting of integrated tasks, while the latter hinders effective adaptation to new tasks. To address these, we propose MINGLE, a novel framework for test-time continual model merging, which leverages test-time adaptation using a small set of unlabeled test samples from the current task to dynamically guide the merging process. MINGLE employs a mixture-of-experts architecture composed of parameter-efficient, low-rank experts, enabling efficient adaptation and improving robustness to distribution shifts. To mitigate catastrophic forgetting, we propose Null-Space Constrained Gating, which restricts gating updates to subspaces orthogonal to prior task representations. This suppresses activations on old task inputs and preserves model behavior on past tasks. To further balance stability and adaptability, we design an Adaptive Relaxation Strategy, which dynamically adjusts the constraint strength based on interference signals captured during test-time adaptation. Extensive experiments on standard continual merging benchmarks demonstrate that MINGLE achieves robust generalization, reduces forgetting significantly, and consistently surpasses previous state-of-the-art methods by 7-9% on average across diverse task orders.
Problem

Research questions and friction points this paper is trying to address.

Addresses parameter interference and test distribution adaptability in continual model merging
Mitigates catastrophic forgetting via Null-Space Constrained Gating
Balances stability and adaptability with Adaptive Relaxation Strategy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixtures of low-rank experts for efficient adaptation
Null-Space Constrained Gating to prevent forgetting
Adaptive Relaxation Strategy for dynamic constraint adjustment
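A rough sketch of how the null-space gating constraint and its relaxation might look in code. This is an illustration, not the paper's exact formulation: the function names are invented, gradients are shown as vectors for simplicity, and the relaxation coefficient `lam` (which the paper derives from interference signals at test time) is left as a free parameter.

```python
import numpy as np

def null_space_project(grad, past_feats, eps=1e-6):
    """Remove the component of a gating-weight gradient that lies in the
    span of past-task feature representations, so the update barely
    changes gating outputs on old-task inputs."""
    # Orthonormal basis of the subspace spanned by past-task features.
    _, s, vt = np.linalg.svd(past_feats, full_matrices=False)
    basis = vt[s > eps]
    # Subtract the gradient's component inside that subspace.
    return grad - basis.T @ (basis @ grad)

def relaxed_update(grad, past_feats, lam):
    """Adaptive relaxation: interpolate between the strictly projected
    gradient (lam = 0) and the unconstrained gradient (lam = 1)."""
    g_ns = null_space_project(grad, past_feats)
    return (1.0 - lam) * g_ns + lam * grad

# Toy example: past-task features span the first two axes of R^4,
# so only the last two gradient components survive strict projection.
past = np.array([[1.0, 0.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0, 0.0]])
g = np.array([0.3, -0.2, 0.5, 0.1])
print(null_space_project(g, past))  # -> [0. 0. 0.5 0.1]
```

The projection is invariant to the particular orthonormal basis the SVD returns, so sign or ordering differences in `vt` do not change the result.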
Zihuan Qiu
PhD student, University of Electronic Science and Technology of China
Continual learning, Deep Learning, Computer Vision
Yi Xu
Dalian University of Technology, Dalian, China
Chiyuan He
Ph.D. student, University of Electronic Science and Technology of China
deep learning, continual learning, activity recognition, vision-language models
Fanman Meng
University of Electronic Science and Technology of China, Chengdu, China
Linfeng Xu
University of Electronic Science and Technology of China, Chengdu, China
Qingbo Wu
University of Electronic Science and Technology of China
video coding, image and video quality assessment
Hongliang Li
University of Electronic Science and Technology of China, Chengdu, China