Identifying Explanation Needs: Towards a Catalog of User-based Indicators

📅 2025-06-20
📈 Citations: 1
Influential: 0
🤖 AI Summary
A persistent challenge in explainable AI (XAI) and human-AI collaboration is determining *when* to provide explanations, i.e., recognizing genuine explanation needs at runtime, since eliciting such needs directly is prone to hypothetical and confirmation biases. Method: Through an exploratory online study collecting self-reported indicators, the authors compile a catalog of user-based indicators that can be captured at runtime: 17 concerning user behavior, 8 concerning system events, and 14 concerning emotional states or physical reactions (39 in total). They also analyze how these indicators relate to different types of explanation need. Contribution/Results: The resulting catalog can support requirements elicitation with prototypes, post-deployment requirements gathering from telemetry and usage data, and automated triggering of explanations at appropriate moments during runtime.

📝 Abstract
In today's digitalized world, where software systems are becoming increasingly ubiquitous and complex, the quality aspect of explainability is gaining relevance. A major challenge in achieving adequate explanations is the elicitation of individual explanation needs, as it may be subject to severe hypothetical or confirmation biases. To address these challenges, we aim to establish user-based indicators concerning user behavior or system events that can be captured at runtime to determine when a need for explanations arises. In this work, we conducted exploratory research in the form of an online study to collect self-reported indicators that could indicate a need for explanation. We compiled a catalog containing 17 relevant indicators concerning user behavior, 8 indicators concerning system events, and 14 indicators concerning emotional states or physical reactions. We also analyze the relationships between these indicators and different types of need for explanation. The established indicators can be used in the elicitation process through prototypes, as well as after publication to gather requirements from already deployed applications using telemetry and usage data. Moreover, these indicators can be used to trigger explanations at appropriate moments at runtime.
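The runtime-triggering idea described in the abstract can be pictured as a rule table that maps observed indicators to the type of explanation to surface. Below is a minimal Python sketch of that idea; the indicator names (`repeated_undo`, `error_dialog_shown`, `confusion_reported`) and explanation types are hypothetical placeholders for illustration, not entries from the paper's actual catalog:

```python
# Hypothetical mapping from runtime indicators to explanation types.
# Categories mirror the paper's three groups: user behavior,
# system events, and emotional states / physical reactions.
TRIGGER_RULES = {
    "repeated_undo": "how_to",         # user behavior -> how-to explanation
    "error_dialog_shown": "why",       # system event -> why explanation
    "confusion_reported": "concept",   # emotional state -> concept explanation
}

def explanations_to_trigger(observed_indicators: list[str]) -> list[str]:
    """Return the explanation types suggested by the observed indicators."""
    return [
        TRIGGER_RULES[indicator]
        for indicator in observed_indicators
        if indicator in TRIGGER_RULES
    ]
```

In practice, such a table would be populated from the catalog's indicators and fed by the telemetry and usage data the abstract mentions; indicators without a matching rule are simply ignored.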
Problem

Research questions and friction points this paper is trying to address.

Eliciting individual explanation needs in complex software systems
Identifying user behavior and system events indicating explanation needs
Developing a catalog of indicators to trigger timely explanations
Innovation

Methods, ideas, or system contributions that make the work stand out.

User behavior indicators for explanation needs
System event triggers for dynamic explanations
Emotional state analysis to prompt explanations