Do Users' Explainability Needs in Software Change with Mood?

📅 2025-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates how users' affective traits and demographic characteristics influence their demand for explanations of software user interfaces (UIs), specifically explanation frequency and type. Employing a survey methodology integrated with validated psychological instruments (e.g., the Emotion Reactivity Scale, ERS), alongside correlation analysis and statistical modeling, the study yields the first empirical evidence that emotion reactivity positively predicts UI explanation demand, whereas age negatively predicts it; other variables were non-significant. These findings indicate that explainability requirements are highly contextual and individual, and cannot be accurately modeled from objective demographic variables alone. Critically, the research establishes affective traits as a pivotal factor in explainable AI (XAI) design, challenging the implicit assumption that a single explanation strategy fits all users. It thereby provides both theoretical grounding and practical guidance for developing personalized, adaptive XAI systems, urging organizations to adopt dynamic, user-state-driven mechanisms for real-time elicitation and delivery of explanation needs.

📝 Abstract
Context and Motivation: The increasing complexity of modern software systems often challenges users' ability to interact with them. Taking established quality attributes such as usability and transparency into account can mitigate this problem but often does not suffice to solve it completely. Recently, explainability has emerged as an essential non-functional requirement to help overcome these difficulties. Question/problem: User preferences regarding the integration of explanations in software differ. Neither too few nor too many explanations are helpful. In this paper, we investigate the influence of a user's subjective mood and objective demographic aspects on explanation needs, in terms of the frequency and type of explanation. Principal ideas/results: Our results reveal a limited relationship between these factors and explanation needs. Two significant correlations were identified: emotional reactivity was positively correlated with the need for UI explanations, while age was negatively correlated with UI explanation needs. Contribution: As we find only very few significant aspects that influence the need for explanations, we conclude that this need is highly subjective and only partially depends on objective factors. These findings emphasize the necessity for software companies to actively gather user-specific explainability requirements to address diverse and context-dependent user demands. Nevertheless, future research should explore additional personal traits and cross-cultural factors to inform the development of adaptive, user-centered explanation systems.
Problem

Research questions and friction points this paper is trying to address.

Investigate mood's impact on software explanation needs
Analyze demographic factors influencing explanation preferences
Explore subjective vs objective factors in user explainability
Innovation

Methods, ideas, or system contributions that make the work stand out.

First empirical evidence linking affective traits to explanation needs
Emotional reactivity positively predicts UI explanation needs
Age negatively predicts UI explanation needs