Efficient Text Classification with Conformal In-Context Learning

📅 2025-12-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large language models (LLMs) face challenges in text classification, including strong prompt dependency, high computational overhead, and limited generalization. Conformal In-Context Learning (CICLe) addresses these by pairing a lightweight base classifier with conformal prediction to prune the candidate label set and adaptively shorten in-context learning prompts, balancing robustness and flexibility. This paper presents a comprehensive evaluation of CICLe across diverse text classification benchmarks. CICLe significantly reduces prompt length (by up to 25.16%) and the number of demonstration examples (by up to 34.45%), enabling efficient deployment with smaller, resource-constrained models, and it consistently outperforms standard few-shot prompting baselines, particularly under severe class imbalance. The results confirm its cross-domain applicability and computational efficiency, establishing CICLe as a scalable and reliable option for practical LLM-based classification.

📝 Abstract
Large Language Models (LLMs) demonstrate strong in-context learning abilities, yet their effectiveness in text classification depends heavily on prompt design and incurs substantial computational cost. Conformal In-Context Learning (CICLe) has been proposed as a resource-efficient framework that integrates a lightweight base classifier with Conformal Prediction to guide LLM prompting by adaptively reducing the set of candidate classes. However, its broader applicability and efficiency benefits beyond a single domain have not yet been systematically explored. In this paper, we present a comprehensive evaluation of CICLe across diverse NLP classification benchmarks. The results show that CICLe consistently improves over its base classifier and outperforms few-shot prompting baselines when the sample size is sufficient for training the base classifier, and performs comparably in low-data regimes. In terms of efficiency, CICLe reduces the number of shots and prompt length by up to 34.45% and 25.16%, respectively, and enables the use of smaller models with competitive performance. CICLe is furthermore particularly advantageous for text classification tasks with high class imbalance. These findings highlight CICLe as a practical and scalable approach for efficient text classification, combining the robustness of traditional classifiers with the adaptability of LLMs, and achieving substantial gains in data and computational efficiency.
Problem

Research questions and friction points this paper is trying to address.

Optimizes prompt design and reduces computational costs in LLM text classification
Enhances applicability and efficiency of conformal prediction across diverse NLP tasks
Improves performance in imbalanced classification while using smaller models efficiently
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates lightweight classifier with Conformal Prediction
Adaptively reduces candidate classes to guide LLM prompting
Uses smaller models with competitive performance efficiently
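The pruning step behind these bullets can be sketched with split conformal prediction: calibrate a threshold on the base classifier's nonconformity scores, keep only the classes that pass it, and build the few-shot prompt from the surviving candidates. This is an illustrative sketch, not the paper's exact implementation; the function names, the `1 - p̂` score, and the fallback to the base classifier's top class are all assumptions.

```python
import numpy as np

def conformal_label_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction over a base classifier's softmax outputs.

    cal_probs:  (n_cal, n_classes) probabilities on a held-out calibration set
    cal_labels: (n_cal,) true class indices for the calibration set
    test_probs: (n_test, n_classes) probabilities on test examples
    alpha:      target miscoverage (sets contain the true label w.p. >= 1-alpha)
    """
    n = len(cal_labels)
    # Nonconformity score: 1 minus the probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, q_level, method="higher")
    sets = []
    for p in test_probs:
        # Keep every class whose score clears the calibrated threshold.
        keep = np.where(1.0 - p <= q)[0]
        if keep.size == 0:  # assumed fallback: base classifier's top class
            keep = np.array([int(np.argmax(p))])
        sets.append(keep.tolist())
    return sets

def build_prompt(text, candidates, class_names, shots_by_class):
    """Hypothetical prompt builder: demonstrations are drawn only for the
    surviving candidate classes, which shortens the prompt."""
    demos = "\n".join(d for c in candidates for d in shots_by_class[class_names[c]])
    labels = ", ".join(class_names[c] for c in candidates)
    return f"{demos}\nClassify into one of: {labels}\nText: {text}\nLabel:"
```

When the base classifier is confident, the conformal set is small and the prompt carries few shots and labels; when it is uncertain, the set grows and the LLM sees more candidates, which is how the method trades prompt length against coverage.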