Usability Evaluation and Improvement of a Tool for Self-Service Learning Analytics

📅 2026-03-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the limited adoption of self-service learning analytics (SSLA) tools by non-technical educators due to poor usability. Focusing on “Indicator Editor,” a no-code, exploratory SSLA tool, the research employed an iterative evaluation approach—combining high-fidelity prototyping, qualitative user studies, and standardized metrics such as SUS, UEQ, and NPS—to refine its workflow guidance, feedback mechanisms, and information presentation. Validated through real-world teaching scenarios, the work distills a set of usability design principles tailored for non-technical users, substantially enhancing both usability and user experience. These principles offer a reusable design paradigm and practical guidance for developing accessible SSLA systems in educational contexts.

📝 Abstract
Self-Service Learning Analytics (SSLA) tools aim to support educational stakeholders in creating learning analytics indicators without requiring technical expertise. While such tools promise user control and transparency, their effectiveness and adoption depend critically on usability aspects. This paper presents a comprehensive usability evaluation and improvement of the Indicator Editor, a no-code, exploratory SSLA tool that enables non-technical users to implement custom learning analytics indicators through a structured workflow. Using an iterative evaluation approach, we conduct an exploratory qualitative user study, usability inspections of high-fidelity prototypes, and a workshop-based evaluation in an authentic educational setting with n = 46 students using standardized instruments, namely System Usability Scale (SUS), User Experience Questionnaire (UEQ), and Net Promoter Score (NPS). Based on the evaluation findings, we derive concrete design implications that inform improvements in workflow guidance, feedback, and information presentation in the Indicator Editor. Furthermore, our evaluation provides practical insights for the design of usable SSLA tools.
Problem

Research questions and friction points this paper is trying to address.

Self-Service Learning Analytics
Usability
No-code Tool
Learning Analytics
User Experience
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-Service Learning Analytics
Usability Evaluation
No-Code Tool
User-Centered Design
Learning Analytics Indicator