ICL-Router: In-Context Learned Model Representations for LLM Routing

📅 2025-10-10
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing LLM routing methods rely on fine-tuned model representations, necessitating full router retraining for each newly added model—severely limiting scalability. To address this, we propose ICL-Router: a zero-shot, fine-tuning-free routing framework that leverages in-context learning (ICL) to generate universal model capability vectors. It constructs query-model joint representations via two-stage semantic alignment—(1) projecting query embeddings into a shared space and (2) modeling ICL-induced capability vectors—and predicts model performance for dynamic routing. Our key contribution is the first use of ICL vectors as plug-and-play, general-purpose model capability representations, enabling zero-shot integration of unseen models without architectural or training modifications. Experiments demonstrate that ICL-Router achieves state-of-the-art routing accuracy on both in-distribution and out-of-distribution benchmarks, significantly improving generalization and scalability over prior approaches.

📝 Abstract
Large language models (LLMs) often exhibit complementary strengths. Model routing harnesses these strengths by dynamically directing each query to the most suitable model, given a candidate model pool. However, routing performance relies on accurate model representations, and adding new models typically requires retraining, limiting scalability. To address these challenges, we propose a novel routing method using in-context vectors to represent model capabilities. The method proceeds in two stages. First, queries are embedded and projected into vectors, with a projector and LLM-based router trained to reconstruct the original queries, aligning vector representations with the router's semantic space. Second, each candidate model is profiled on a query set, and the router learns -- based on in-context vectors of query and model performance -- to predict whether each model can correctly answer new queries. Extensive experiments demonstrate that our method achieves state-of-the-art routing performance in both in-distribution and out-of-distribution tasks. Moreover, our method allows for seamless integration of new models without retraining the router. The code is available at https://github.com/lalalamdbf/ICL-Router.
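The two-stage pipeline in the abstract can be sketched in a few lines. Everything below is an illustrative assumption, not the paper's implementation: `W_proj` stands in for the trained projector, the capability vectors are random placeholders for the in-context vectors obtained by profiling each model, and a simple dot-product score replaces the LLM-based router's learned correctness predictor.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB_DIM, ROUTER_DIM = 8, 4  # toy dimensions, chosen for illustration

# Stage 1 (stand-in): a trained projector maps query embeddings into the
# router's semantic space. Here it is just a random linear map.
W_proj = rng.normal(size=(EMB_DIM, ROUTER_DIM))

def project_query(query_emb):
    return query_emb @ W_proj

# Stage 2 (stand-in): one capability vector per candidate model, obtained
# in the paper by profiling each model on a query set. Random here.
model_capabilities = {
    "model_a": rng.normal(size=ROUTER_DIM),
    "model_b": rng.normal(size=ROUTER_DIM),
}

def route(query_emb):
    """Route a query to the model whose capability vector scores highest.

    A dot product stands in for the router's prediction of whether each
    model can answer the query correctly.
    """
    q = project_query(query_emb)
    scores = {name: float(q @ cap) for name, cap in model_capabilities.items()}
    return max(scores, key=scores.get), scores

query = rng.normal(size=EMB_DIM)
best, scores = route(query)
```

Because each model is represented only by its capability vector, adding a new model to the pool amounts to inserting one more entry into `model_capabilities`, with no retraining of the routing logic.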
Problem

Research questions and friction points this paper is trying to address.

Dynamic routing of queries to optimal LLMs
Learning model capabilities via in-context vectors
Adding new models without router retraining
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses in-context vectors to represent model capabilities
Learns to predict model performance on new queries
Integrates new models without retraining the router
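The last point, retraining-free integration, hinges on profiling: a new model is run on a probe query set, and its pass/fail record is turned into a capability vector. The sketch below is a hedged toy version of that step: the weighted-mean aggregation, the names, and the mock pass/fail data are all assumptions; the paper derives in-context vectors from the router's ICL states instead.

```python
import numpy as np

rng = np.random.default_rng(1)
ROUTER_DIM = 4  # toy dimension, matching the illustrative router space

def profile_model(answers_correct, probe_vectors):
    """Build a capability vector for a new model from its probe results.

    Projected probe-query vectors are weighted +1 where the model answered
    correctly and -1 where it failed, then averaged. A crude stand-in for
    the paper's in-context capability vectors.
    """
    weights = np.where(answers_correct, 1.0, -1.0)
    return (weights[:, None] * probe_vectors).mean(axis=0)

probe_vectors = rng.normal(size=(16, ROUTER_DIM))  # projected probe queries
answers = rng.random(16) > 0.4                     # mock pass/fail record
new_cap = profile_model(answers, probe_vectors)
# new_cap can now join the candidate pool without touching the router.
```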
👥 Authors
Chenxu Wang
Shanghai Artificial Intelligence Laboratory, Fudan University
Hao Li
Shanghai Artificial Intelligence Laboratory, Northwestern Polytechnical University
Yiqun Zhang
Shanghai Artificial Intelligence Laboratory, Northeastern University
Linyao Chen
Shanghai Artificial Intelligence Laboratory, Nanjing University
Jianhao Chen
Shanghai Artificial Intelligence Laboratory, The University of Tokyo
Ping Jian
Beijing Institute of Technology
natural language processing, machine learning
Peng Ye
Shanghai Artificial Intelligence Laboratory
Qiaosheng Zhang
Department of Anesthesiology, New York University School of Medicine
Neuroscience, Neural Engineering, Pain Circuitry, Brain Machine Interface, Neural Signal Processing
Shuyue Hu
Shanghai Artificial Intelligence Lab
multiagent system, large language model, game theory