Human-Centered Development of Indicators for Self-Service Learning Analytics: A Transparency through Exploration Approach

📅 2025-10-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
Learning analytics often suffers from low user trust and low acceptance of interventions because its reasoning processes are opaque to end-users. To address this, the paper proposes a "transparency-through-exploration" paradigm, developed through iterative human-centered design and evaluated in a qualitative user study (n=15). The result is the Indicator Editor, a self-service tool that enables end-users, particularly instructors, to interactively construct, inspect, and refine learning analytics indicators, giving them direct control over both the logic and the generation process of those indicators. The evaluation provides qualitative evidence of improvements in system transparency, user trust, satisfaction, and willingness to adopt. The key contribution is a systematic integration of exploratory indicator construction into learning analytics practice, shifting transparency from post-hoc explanation to real-time, participatory engagement through an operational tool design, advancing human-centered, trustworthy, and usable learning analytics.

📝 Abstract
The aim of learning analytics is to turn educational data into insights, decisions, and actions to improve learning and teaching. The reasoning of the provided insights, decisions, and actions is often not transparent to the end-user, and this can lead to trust and acceptance issues when interventions, feedback, and recommendations fail. In this paper, we shed light on achieving transparent learning analytics by following a transparency through exploration approach. To this end, we present the design, implementation, and evaluation details of the Indicator Editor, which aims to support self-service learning analytics by empowering end-users to take control of the indicator implementation process. We systematically designed and implemented the Indicator Editor through an iterative human-centered design (HCD) approach. Further, we conducted a qualitative user study (n=15) to investigate the impact of following a self-service learning analytics approach on the users' perception of and interaction with the Indicator Editor. Our study showed qualitative evidence that supporting user interaction and providing user control in the indicator implementation process can have positive effects on different crucial aspects of learning analytics, namely transparency, trust, satisfaction, and acceptance.
Problem

Research questions and friction points this paper is trying to address.

Achieving transparent learning analytics through a transparency-through-exploration approach
Empowering end-users to control the indicator implementation process
Addressing trust and acceptance issues in learning analytics systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Human-centered design for self-service learning analytics
Indicator Editor giving users control over the indicator implementation process
Transparency through exploration approach enhances user trust