Analysis of Linear Mode Connectivity via Permutation-Based Weight Matching

📅 2024-02-06
🏛️ arXiv.org
📈 Citations: 4
Influential: 1
🤖 AI Summary
This paper investigates the underlying causes of linear mode connectivity (LMC) and its relationship with weight matching (WM). Addressing two key questions (whether WM achieves LMC solely by reducing the $L^2$ distance between models, and how WM ensures low-loss interpolation paths), the work provides the first analysis from the perspective of singular-vector alignment. It reveals that permutations preserve singular values while changing the directions of singular vectors, and that WM aligns the directions of the dominant left and right singular vectors across models, thereby achieving structural alignment between functionally equivalent models. The analysis suggests that activation matching (AM) likely operates on the same principle, and that WM can be more advantageous than the data-dependent straight-through estimator (STE) when pursuing LMC among three or more models. Empirically, the study demonstrates that WM induces LMC without substantially reducing the $L^2$ distance between models. Together, these results connect theoretical explanations of model merging and SGD generalization, establish a geometric foundation for WM-based model interpolation, and offer new insights into the loss-landscape geometry of deep neural networks.
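The summary's central structural claim, that permuting a layer's units changes the directions of the weight matrix's singular vectors but leaves its singular values untouched, can be checked numerically. The sketch below is an illustration only (not the paper's code); the matrix shape and random seed are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 6))        # a layer's weight matrix (toy size)

# Permutation matrices for output units (rows) and input units (columns).
P_out = np.eye(4)[rng.permutation(4)]
P_in = np.eye(6)[rng.permutation(6)]
W_perm = P_out @ W @ P_in.T            # permuted weights

# Permutation matrices are orthogonal, so the singular values coincide.
s = np.linalg.svd(W, compute_uv=False)
s_perm = np.linalg.svd(W_perm, compute_uv=False)
print(np.allclose(s, s_perm))          # singular values are identical
```

Because `np.linalg.svd` returns singular values in descending order, the two spectra can be compared directly; only the singular vectors (rotated by `P_out` and `P_in`) differ.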

📝 Abstract
Recently, Ainsworth et al. showed that using weight matching (WM) to minimize the $L^2$ distance in a permutation search of model parameters effectively identifies permutations that satisfy linear mode connectivity (LMC), where the loss along a linear path between two independently trained models with different seeds remains nearly constant. This paper analyzes LMC using WM, which is useful for understanding stochastic gradient descent's effectiveness and its application in areas like model merging. We first empirically show that permutations found by WM do not significantly reduce the $L^2$ distance between two models, and the occurrence of LMC is not merely due to distance reduction by WM itself. We then demonstrate that permutations can change the directions of the singular vectors, but not the singular values, of the weight matrices in each layer. This finding shows that permutations found by WM primarily align the directions of singular vectors associated with large singular values across models. This alignment brings the singular vectors with large singular values, which determine the model's functionality, closer between the original and merged models, allowing the merged model to retain functionality similar to the original models, thereby satisfying LMC. This paper also analyzes activation matching (AM) in terms of singular vectors and finds that the principle of AM is likely the same as that of WM. Finally, we analyze the difference between WM and the straight-through estimator (STE), a dataset-dependent permutation search method, and show that WM can be more advantageous than STE in achieving LMC among three or more models.
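As a rough sketch of the single-layer case of the permutation search described in the abstract: minimizing the $L^2$ distance $\|W_A - P W_B\|_F$ over output-unit permutations $P$ is equivalent to maximizing the total row similarity, a linear assignment problem. The planted-permutation setup, names, and sizes below are illustrative assumptions, not the paper's multi-layer implementation (which must permute layers consistently).

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)
W_a = rng.standard_normal((5, 32))
true_perm = rng.permutation(5)
# Model B: a row-permuted copy of A's weights plus small noise,
# mimicking two functionally equivalent solutions.
W_b = W_a[true_perm] + 0.01 * rng.standard_normal((5, 32))

# Minimizing ||W_A - P W_B||_F over permutations P is equivalent to
# maximizing sum_i <W_a[i], W_b[match(i)]>, a linear assignment problem.
cost = -W_a @ W_b.T                     # negative row-to-row similarity
row_ind, col_ind = linear_sum_assignment(cost)

# Reordering W_b by the found matching recovers W_a (up to the noise).
print(np.allclose(W_b[col_ind], W_a, atol=0.05))
```

This single-layer toy recovers the planted permutation exactly; the abstract's point is that in real networks such permutations achieve LMC not by shrinking the $L^2$ distance per se, but by aligning the dominant singular vectors across models.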
Problem

Research questions and friction points this paper is trying to address.

Analyzes linear mode connectivity via weight matching permutations
Explores singular vector alignment for model functionality retention
Compares weight matching with dataset-dependent permutation methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses weight matching for permutation search
Aligns singular vectors of weight matrices
Compares weight matching with activation matching
Akira Ito
Nippon Telegraph and Telephone Corporation, 3-9-11 Midori-cho, Musashino-shi, Tokyo, Japan
Masanori Yamada
NTT
Deep Learning
Atsutoshi Kumagai
NTT
machine learning, data mining, cyber security