🤖 AI Summary
This study addresses the tendency of current AI assistants to induce cognitive passivity in data literacy education, suppressing users' active critical thinking. To counter this, the authors propose a "cognitive alignment" framework that draws on cognitive science to construct a human-AI interaction classification model. The model dynamically matches users' cognitive demands (receptive or deliberative) with corresponding AI interaction modes (transmissive or deliberative). Moving beyond static assistance paradigms, the framework conceptualizes human-AI collaboration as a dynamic process of aligning user cognition with AI behavior, and it explains how cognitive misalignment produces either passive engagement or interactional friction, offering both theoretical grounding and actionable design principles for intelligent educational systems.
📝 Abstract
AI chatbots are increasingly stepping into roles as collaborators or teachers in analyzing, visualizing, and reasoning through data and domain problems. Yet AI's default assistant mode, with its comprehensive, one-off responses, may undermine opportunities for practitioners to develop literacy through their own thinking, inducing cognitive passivity. Drawing on evidence from empirical studies and theory, we argue that disrupting cognitive passivity requires a nuanced approach: rather than simply making AI promote deliberative thinking, we need a more dynamic and adaptive strategy through cognitive alignment -- a framework that characterizes effective human-AI interaction as a function of the alignment between users' cognitive demand and the AI's interaction mode. Within the framework, we map the AI's interaction modes (transmissive or deliberative) onto users' cognitive demands (receptive or deliberative); misalignment leads to either cognitive passivity or friction. We further discuss implications and offer open questions for future research on data literacy.
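The demand/mode mapping can be pictured as a small decision table. The sketch below is illustrative only: the names and, in particular, which mismatch yields passivity versus friction are assumptions based on the framework's description, not code from the paper (one plausible reading is that a transmissive AI paired with a deliberative demand hands over answers where the user should reason, yielding passivity, while a deliberative AI paired with a receptive demand withholds answers the user just needs, yielding friction).

```python
from enum import Enum

class CognitiveDemand(Enum):
    RECEPTIVE = "receptive"          # user wants information delivered
    DELIBERATIVE = "deliberative"    # user should reason through the problem

class InteractionMode(Enum):
    TRANSMISSIVE = "transmissive"    # AI gives comprehensive, one-off answers
    DELIBERATIVE = "deliberative"    # AI prompts, questions, and scaffolds

def alignment_outcome(demand: CognitiveDemand, mode: InteractionMode) -> str:
    """Classify an exchange under the cognitive-alignment framework (hypothetical helper)."""
    if (demand, mode) in {
        (CognitiveDemand.RECEPTIVE, InteractionMode.TRANSMISSIVE),
        (CognitiveDemand.DELIBERATIVE, InteractionMode.DELIBERATIVE),
    }:
        return "aligned"
    # Assumed assignment of the two failure modes to the two mismatched cells:
    if mode is InteractionMode.TRANSMISSIVE:
        return "cognitive passivity"  # answers handed over when the user should deliberate
    return "friction"                 # probing dialogue when the user just needs facts
```

An adaptive system would invoke something like `alignment_outcome` continuously, switching the AI's mode whenever the inferred user demand changes.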