What Do People Want to Know About Artificial Intelligence (AI)? The Importance of Answering End-User Questions to Explain Autonomous Vehicle (AV) Decisions

📅 2025-05-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing AI explanation mechanisms are predominantly developer-centric and fail to give end-users, particularly passengers, a genuine understanding of autonomous vehicle (AV) decision-making. Method: This work first identifies and validates 27 previously unaddressed explanation requirements specific to passengers, then proposes a user-question-driven, interactive textual explanation paradigm to bridge the gap between engineering-oriented explanations and public understanding. The paradigm's efficacy is evaluated through a two-stage user study (an open-ended survey followed by controlled experiments) that combines qualitative thematic coding with quantitative comprehension assessment. Contribution/Results: Results show statistically significant improvements in passengers' comprehension of AV decisions (p < 0.01), along with greater explanation effectiveness and increased user willingness to engage interactively. The study provides both theoretical foundations and practical design guidelines for human-centered, trustworthy AI explanation systems.

📝 Abstract
Improving end-users' understanding of decisions made by autonomous vehicles (AVs) driven by artificial intelligence (AI) can improve utilization and acceptance of AVs. However, current explanation mechanisms primarily help AI researchers and engineers in debugging and monitoring their AI systems, and may not address the specific questions of end-users, such as passengers, about AVs in various scenarios. In this paper, we conducted two user studies to investigate questions that potential AV passengers might pose while riding in an AV and evaluate how well answers to those questions improve their understanding of AI-driven AV decisions. Our initial formative study identified a range of questions about AI in autonomous driving that existing explanation mechanisms do not readily address. Our second study demonstrated that interactive text-based explanations effectively improved participants' comprehension of AV decisions compared to simply observing AV decisions. These findings inform the design of interactions that motivate end-users to engage with and inquire about the reasoning behind AI-driven AV decisions.
Problem

Research questions and friction points this paper is trying to address.

Understanding end-user questions about AI-driven AV decisions
Addressing gaps in current explanation mechanisms for AV passengers
Improving AV decision comprehension via interactive text-based explanations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Interactive text-based explanations for AV decisions
User studies to identify end-user questions on AI
Improved comprehension through tailored AI explanations