Few-Shot Query Intent Detection via Relation-Aware Prompt Learning

📅 2025-09-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing few-shot intent detection methods overlook dialogue structural information—such as query-query and query-response relationships—leading to suboptimal performance. To address this, we propose SAID (Structure-Aware Intent Detection), a novel pretraining framework for few-shot intent detection. Its core innovation lies in (1) modeling diverse dialogue relations as learnable relation tokens for the first time, and (2) introducing QueryAdapt, a query-adaptive attention network that enables fine-grained knowledge transfer at the relation-token level. SAID unifies semantic understanding and structural dependency modeling via relation-aware prompt learning integrated with pretrained language model fine-tuning. Extensive experiments on two real-world low-resource datasets demonstrate that SAID significantly outperforms state-of-the-art approaches, validating its effectiveness, generalizability, and robustness in few-shot settings.
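The summary's first idea, modeling dialogue relations as learnable relation tokens, can be sketched as ordinary prompt learning: a small set of trainable embeddings, one per relation type, is prepended to the query's token embeddings before the sequence enters the language model. The dimensions, vocabulary, and function names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, not taken from the paper.
d_model = 16       # embedding dimension
n_relations = 2    # e.g. query-query and query-answer relations
vocab = {"book": 0, "a": 1, "flight": 2}

# Learnable parameters: one relation token per dialogue relation type,
# plus ordinary word embeddings (both would be trained during pretraining).
relation_tokens = rng.normal(size=(n_relations, d_model))
word_embeddings = rng.normal(size=(len(vocab), d_model))

def build_prompted_input(query_ids):
    """Prepend the relation tokens to the query embeddings,
    in the spirit of relation-aware prompt learning (a sketch)."""
    query_emb = word_embeddings[query_ids]            # (len_query, d_model)
    return np.concatenate([relation_tokens, query_emb], axis=0)

ids = [vocab[w] for w in ["book", "a", "flight"]]
x = build_prompted_input(ids)
print(x.shape)  # (n_relations + query length, d_model) → (5, 16)
```

The resulting sequence would then be fed to the pretrained language model, so that gradients from the pretext tasks shape the relation tokens alongside the text.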

📝 Abstract
Intent detection is a crucial component of modern conversational systems, since accurately identifying user intent at the beginning of a conversation is essential for generating effective responses. Recent efforts have focused on studying this problem under a challenging few-shot scenario. These approaches primarily leverage large-scale unlabeled dialogue text corpora to pretrain language models through various pretext tasks, followed by fine-tuning for intent detection with very limited annotations. Despite the improvements achieved, existing methods have predominantly focused on textual data, neglecting to effectively capture the crucial structural information inherent in conversational systems, such as the query-query relation and query-answer relation. To address this gap, we propose SAID, a novel framework that integrates both textual and relational structure information in a unified manner for model pretraining for the first time. Building on this framework, we further propose a novel mechanism, the query-adaptive attention network (QueryAdapt), which operates at the relation token level by generating intent-specific relation tokens from well-learned query-query and query-answer relations explicitly, enabling more fine-grained knowledge transfer. Extensive experimental results on two real-world datasets demonstrate that SAID significantly outperforms state-of-the-art methods.
Problem

Research questions and friction points this paper is trying to address.

Detecting user intent in few-shot conversational systems
Capturing structural query-query and query-answer relations
Improving knowledge transfer with limited intent annotations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates textual and relational structure information
Generates intent-specific relation tokens from learned relations
Enables fine-grained knowledge transfer via relation-aware attention
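The QueryAdapt idea described above, attending from a query representation over the pretrained relation tokens to produce an intent-specific relation vector, can be illustrated with a single attention head. Everything here (dimensions, projection matrices, function names) is a hypothetical sketch of the mechanism, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 16  # hypothetical hidden size

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Pretrained relation tokens (query-query, query-answer); random stand-ins here.
relation_tokens = rng.normal(size=(2, d))

# Projection matrices of one attention head (assumed, scaled for stability).
W_q = rng.normal(size=(d, d)) / np.sqrt(d)
W_k = rng.normal(size=(d, d)) / np.sqrt(d)
W_v = rng.normal(size=(d, d)) / np.sqrt(d)

def query_adapt(query_vec):
    """Attend from the query over the relation tokens and return an
    intent-specific relation vector (sketch of query-adaptive attention)."""
    q = query_vec @ W_q                       # (d,)
    k = relation_tokens @ W_k                 # (2, d)
    v = relation_tokens @ W_v                 # (2, d)
    weights = softmax(q @ k.T / np.sqrt(d))   # (2,) mix over relation types
    return weights @ v                        # (d,) adapted relation token

out = query_adapt(rng.normal(size=d))
print(out.shape)  # (16,)
```

Because the attention weights depend on the query, each input mixes the query-query and query-answer relation knowledge differently, which is the "fine-grained knowledge transfer at the relation-token level" the summary refers to.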
Liang Zhang
Hong Kong University of Science and Technology (Guangzhou), China
Yuan Li
Shenzhen MSU-BIT University, China
Shijie Zhang
Shenzhen MSU-BIT University, China
Zheng Zhang
Harbin Institute of Technology, China
Xitong Li
HEC Paris
Economics of Data and Information · Human-AI Collaboration