An Online A/B Testing Decision Support System for Web Usability Assessment Based on a Linguistic Decision-making Methodology: Case of Study a Virtual Learning Environment

📅 2025-07-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the lack of effective online tools for collaborative usability evaluation involving both real and synthetic users in multi-page web design A/B testing, this paper proposes a usability assessment framework integrating design thinking and linguistic decision theory, implemented as an online decision support system enabling role-playing tests. Methodologically, it combines A/B testing, the System Usability Scale (SUS), synthetic-user role-playing, and multi-criteria decision modeling based on linguistic term sets to enhance quantifiability and interpretability of subjective user experience. Its key contribution is the first integration of linguistic decision methods into the online A/B testing workflow, enabling dynamic, human–machine collaborative usability evaluation. Empirical validation across three Moodle virtual learning environments at the University of Guadalajara, Mexico, demonstrates that the system significantly improves assessment comprehensiveness, feedback interpretability, and cross-design comparative efficacy.

📝 Abstract
In recent years, attention has increasingly focused on enhancing user satisfaction with user interfaces, spanning both mobile applications and websites. One fundamental aspect of human-machine interaction is web usability. To assess it, the A/B testing technique enables the comparison of data between two designs. Expanding the scope of tests to additional designs, in conjunction with the involvement of both real and fictional users, presents a challenge for which few online tools offer support. We propose a methodology for web usability evaluation based on user-centered approaches such as design thinking and linguistic decision-making, named Linguistic Decision-Making for Web Usability Evaluation. The methodology engages people in role-playing scenarios and conducts a number of usability tests, including the widely recognized System Usability Scale. We incorporate the methodology into a decision support system based on A/B testing. In a case study with real users, we assess three Moodle platforms at the University of Guadalajara, Mexico.
Problem

Research questions and friction points this paper is trying to address.

Evaluating web usability via A/B testing with real and fictional users
Developing a linguistic decision-making methodology for usability assessment
Assessing Moodle platforms using user-centered design thinking approaches
Innovation

Methods, ideas, or system contributions that make the work stand out.

Linguistic decision-making for usability evaluation
A/B testing with real and fictional users
System Usability Scale integration
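The paper does not publish its scoring code, but the two ingredients above are well defined: SUS has a standard scoring rule (odd items contribute response − 1, even items contribute 5 − response, with the sum scaled by 2.5 onto 0–100), and a linguistic decision approach maps numeric assessments onto a term set. A minimal sketch, where the five-term scale and the mapping function are illustrative assumptions rather than the authors' actual model:

```python
def sus_score(responses):
    """Standard System Usability Scale scoring: ten 1-5 Likert
    responses -> a 0-100 score. Odd-numbered items (index 0, 2, ...)
    contribute (r - 1); even-numbered items contribute (5 - r);
    the sum is scaled by 2.5."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical uniform 5-term linguistic scale, for illustration only;
# the paper's actual linguistic term sets may differ.
TERMS = ["very poor", "poor", "fair", "good", "excellent"]

def to_linguistic_term(score, terms=TERMS):
    """Map a 0-100 score onto a uniformly partitioned term set."""
    idx = min(int(score / 100 * len(terms)), len(terms) - 1)
    return terms[idx]
```

With such a mapping, each tested design (A or B) yields an interpretable label per user, and labels can then be aggregated across real and fictional users for comparison.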