🤖 AI Summary
Learning analytics often suffers from low user trust and low acceptance of interventions due to opaque reasoning processes. To address this, we propose a novel “transparency-through-exploration” paradigm, developed through an iterative human-centered design process and embodied in the Indicator Editor, a self-service tool that enables end-users (particularly instructors) to interactively construct, inspect, and refine learning analytics indicators. The tool grants users direct agency over both the logic and the generation process of learning metrics. A qualitative user study (n=15) provides evidence of improvements in perceived system transparency, user trust, satisfaction, and willingness to adopt. Our key contribution is the first systematic integration of exploratory indicator construction into learning analytics practice, shifting transparency from post-hoc explanation to real-time, participatory engagement through an operationalizable tool design. This advances human-centered, trustworthy, and usable learning analytics.
📝 Abstract
The aim of learning analytics is to turn educational data into insights, decisions, and actions that improve learning and teaching. The reasoning behind the provided insights, decisions, and actions is often not transparent to the end-user, which can lead to trust and acceptance issues when interventions, feedback, and recommendations fail. In this paper, we shed light on achieving transparent learning analytics by following a transparency-through-exploration approach. To this end, we present the design, implementation, and evaluation details of the Indicator Editor, which aims to support self-service learning analytics by empowering end-users to take control of the indicator implementation process. We systematically designed and implemented the Indicator Editor through an iterative human-centered design (HCD) approach. Further, we conducted a qualitative user study (n=15) to investigate the impact of following a self-service learning analytics approach on the users' perception of and interaction with the Indicator Editor. Our study provides qualitative evidence that supporting user interaction and providing user control in the indicator implementation process can have positive effects on crucial aspects of learning analytics, namely transparency, trust, satisfaction, and acceptance.