Routing by Analogy: kNN-Augmented Expert Assignment for Mixture-of-Experts

📅 2026-01-05
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limited generalization of routing strategies in traditional Mixture-of-Experts (MoE) architectures under distribution shifts. The authors propose kNN-MoE, the first approach to integrate case-based analogical reasoning into MoE routing: it retrieves historical optimal expert assignments via k-nearest neighbors. The method fuses the frozen router's outputs with the retrieved assignments through similarity weighting, and employs a confidence-driven hybrid strategy for adaptive expert selection. Notably, kNN-MoE requires no fine-tuning, achieves substantial improvements over existing baselines in zero-shot settings, and matches the performance of costly supervised fine-tuning approaches.
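The confidence-driven hybrid strategy described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the cosine-similarity retrieval, the softmax neighbor weighting, and the clipped mean-similarity confidence are all assumptions about how such a scheme could look.

```python
import numpy as np

def knn_moe_route(token_repr, frozen_logits, memory_keys, memory_logits,
                  k=4, tau=1.0):
    """Hypothetical sketch of kNN-augmented routing: blend a frozen router's
    expert logits with logits retrieved from a memory of past assignments,
    weighted by how similar the retrieved cases are to the current token."""
    # Cosine similarity between the token and every stored key.
    sims = memory_keys @ token_repr / (
        np.linalg.norm(memory_keys, axis=1) * np.linalg.norm(token_repr) + 1e-8
    )
    top = np.argsort(sims)[-k:]              # indices of the k nearest cases
    weights = np.exp(sims[top] / tau)
    weights /= weights.sum()                 # similarity-weighted fusion
    retrieved = weights @ memory_logits[top]
    # Confidence = aggregate neighbor similarity, squashed to [0, 1]; when no
    # relevant cases exist (low similarity), the frozen router dominates.
    confidence = float(np.clip(sims[top].mean(), 0.0, 1.0))
    return confidence * retrieved + (1.0 - confidence) * frozen_logits
```

The key design point is the fallback: the retrieved logits only override the frozen router in proportion to how close the retrieved cases actually are, so the method degrades gracefully to standard routing on out-of-memory inputs.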

📝 Abstract
Mixture-of-Experts (MoE) architectures scale large language models efficiently by employing a parametric "router" to dispatch tokens to a sparse subset of experts. Typically, this router is trained once and then frozen, rendering routing decisions brittle under distribution shifts. We address this limitation by introducing kNN-MoE, a retrieval-augmented routing framework that reuses optimal expert assignments from a memory of similar past cases. This memory is constructed offline by directly optimizing token-wise routing logits to maximize the likelihood on a reference set. Crucially, we use the aggregate similarity of retrieved neighbors as a confidence-driven mixing coefficient, thus allowing the method to fall back to the frozen router when no relevant cases are found. Experiments show kNN-MoE outperforms zero-shot baselines and rivals computationally expensive supervised fine-tuning.
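The abstract's offline memory-construction step, directly optimizing token-wise routing logits to maximize likelihood on a reference set, can be illustrated with a toy gradient-descent loop. Everything here is an assumption for illustration: the per-expert output matrix stands in for the full model, and the function name and hyperparameters are invented.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def optimize_routing_logits(expert_outputs, target, steps=200, lr=0.5):
    """Toy sketch of offline memory construction: optimize one token's
    routing logits so the expert mixture maximizes the likelihood of the
    reference target. `expert_outputs` is a hypothetical (n_experts, vocab)
    matrix of per-expert prediction logits standing in for the full model."""
    n_experts = expert_outputs.shape[0]
    logits = np.zeros(n_experts)                 # routing logits to optimize
    for _ in range(steps):
        gates = softmax(logits)                  # expert mixing weights
        mixed = gates @ expert_outputs           # mixture of expert outputs
        probs = softmax(mixed)
        # Gradient of the negative log-likelihood w.r.t. the routing logits:
        # dL/dmixed = probs - onehot(target), chained through the mixture
        # and the softmax Jacobian over the gates.
        dmixed = probs.copy()
        dmixed[target] -= 1.0
        dgates = expert_outputs @ dmixed
        dlogits = gates * (dgates - gates @ dgates)
        logits -= lr * dlogits
    return logits
```

The optimized logits (paired with the token's representation as a retrieval key) would then be stored in the memory that the kNN lookup consults at inference time.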
Problem

Research questions and friction points this paper is trying to address.

Mixture-of-Experts
routing
distribution shift
expert assignment
frozen router
Innovation

Methods, ideas, or system contributions that make the work stand out.

kNN-MoE
retrieval-augmented routing
Mixture-of-Experts
distribution shift robustness
memory-based expert assignment