Transporting Task Vectors across Different Architectures without Training

📅 2026-02-13
📈 Citations: 0
✨ Influential: 0
📄 PDF


๐Ÿ“ Abstract
Adapting large pre-trained models to downstream tasks often produces task-specific parameter updates that are expensive to relearn for every model variant. While recent work has shown that such updates can be transferred between models with identical architectures, transferring them across models of different widths remains largely unexplored. In this work, we introduce Theseus, a training-free method for transporting task-specific updates across heterogeneous models. Rather than matching parameters directly, we characterize a task update by the functional effect it induces on intermediate representations. We formalize task-vector transport as a functional matching problem on observed activations and show that, after aligning representation spaces via orthogonal Procrustes analysis, it admits a stable closed-form solution that preserves the geometry of the update. We evaluate Theseus on vision and language models across different widths, showing consistent improvements over strong baselines without additional training or backpropagation. Our results show that task updates can be meaningfully transferred across architectures when task identity is defined functionally rather than parametrically.
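The abstract describes aligning representation spaces with orthogonal Procrustes analysis before transporting a task update. The sketch below illustrates the underlying building block only, under simplifying assumptions: both models share the same hidden width, the alignment is computed from paired activation matrices, and the update is conjugated by the learned rotation. The function names and the conjugation rule are illustrative, not the paper's actual algorithm (which handles heterogeneous widths).

```python
import numpy as np

def procrustes_align(acts_src, acts_tgt):
    """Closed-form orthogonal Procrustes: the rotation Q minimizing
    ||acts_src @ Q - acts_tgt||_F, via SVD of the cross-covariance.
    Both inputs are (n_samples, d) activation matrices."""
    M = acts_src.T @ acts_tgt
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt  # orthogonal by construction

def transport_update(delta_W, acts_src, acts_tgt):
    """Hypothetical transport rule: rotate a (d, d) task update into the
    target model's representation space by conjugating with Q. This
    preserves the update's singular values (its geometry)."""
    Q = procrustes_align(acts_src, acts_tgt)
    return Q.T @ delta_W @ Q
```

As a sanity check, if the target activations are an exact rotation of the source activations, `procrustes_align` recovers that rotation, so the transported update is the original update expressed in the rotated basis.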
Problem

Research questions and friction points this paper is trying to address.

task vector transfer
cross-architecture adaptation
training-free transfer
functional matching
heterogeneous models
Innovation

Methods, ideas, or system contributions that make the work stand out.

task vector transport
training-free adaptation
functional matching
orthogonal Procrustes analysis
cross-architecture transfer