Human-Centered Explainability in Interactive Information Systems: A Survey

📅 2025-07-03
🤖 AI Summary
This study addresses the human-centered explainability challenge in interactive information systems, aiming to enhance users’ comprehension, interpretation, and critical evaluation of AI outputs to support informed decision-making. Following the PRISMA guidelines, we systematically screened and structurally coded 100 peer-reviewed articles. Based on this analysis, we propose a novel five-dimensional conceptual model of explainability—comprising transparency, causality, relevance, controllability, and actionability—and introduce the first user-needs-driven taxonomy for explanation design. Furthermore, we identify six user-centered dimensions for explainability evaluation—the first such classification in the literature. These contributions advance explainability research from ad hoc, experience-based practice toward systematic, theory-grounded inquiry. The resulting integrative framework provides both theoretical rigor and practical guidance for designing transparent, trustworthy, and responsible interactive information systems.

📝 Abstract
Human-centered explainability has become a critical foundation for the responsible development of interactive information systems, where users must be able to understand, interpret, and scrutinize AI-driven outputs to make informed decisions. This systematic literature survey characterizes recent progress in user studies on explainability in interactive information systems by reviewing how explainability has been conceptualized, designed, and evaluated in practice. Following PRISMA guidelines, eight academic databases were searched and 100 relevant articles were identified. A structural encoding approach was then used to extract and synthesize insights from these articles. The main contributions include: 1) five dimensions that researchers have used to conceptualize explainability; 2) a classification scheme of explanation designs; and 3) a categorization of explainability measurements into six user-centered dimensions. The review concludes by reflecting on ongoing challenges and providing recommendations for future exploration of related issues. The findings shed light on the theoretical foundations of human-centered explainability, informing the design of interactive information systems that better align with diverse user needs and promoting the development of systems that are transparent, trustworthy, and accountable.
Problem

Research questions and friction points this paper is trying to address.

Characterizing progress in user studies on explainability in interactive systems
Reviewing conceptualization, design, and evaluation of explainability in practice
Providing theoretical foundations for transparent and trustworthy AI systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Systematic literature review using PRISMA guidelines
Structural encoding for insight extraction
User-centered explainability dimensions classification
Yuhao Zhang
Peking University, China and The University of Oklahoma, USA

Jiaxin An
University of Texas at Austin
information behavior, health informatics, HCI

Ben Wang
University of Oklahoma

Yan Zhang
The University of Texas at Austin, USA

Jiqun Liu
The University of Oklahoma, USA