Artificial intelligence for context-aware visual change detection in software test automation

📅 2024-05-01
🏛️ arXiv.org
📈 Citations: 2
Influential: 0
🤖 AI Summary
Existing pixel-level and region-level approaches for UI visual regression testing struggle to model semantic context, spatial relationships, and subtle visual changes. To address this, we propose a graph neural network (GNN)-driven, context-aware visual change detection method. Our core contribution is the first explicit modeling of user interfaces as structured graphs: nodes represent detected UI controls, while edges encode their spatial and semantic relationships. Leveraging graph alignment and multi-scale GNN inference, our method achieves precise cross-version control matching and fine-grained change localization. Experimental evaluation across diverse, complex UI scenarios demonstrates over 35% improvement in accuracy compared to conventional baselines. The approach has been successfully integrated into real-world iterative software testing pipelines, effectively overcoming the semantic-understanding limitation inherent in purely pixel-based visual comparison techniques.

📝 Abstract
Automated software testing is integral to the software development process, streamlining workflows and ensuring product reliability. Visual testing within this context, especially concerning user interface (UI) and user experience (UX) validation, stands as one of the crucial determinants of overall software quality. Nevertheless, conventional methods like pixel-wise comparison and region-based visual change detection fall short in capturing contextual similarities, nuanced alterations, and understanding the spatial relationships between UI elements. In this paper, we introduce a novel graph-based method for visual change detection in software test automation. Leveraging a machine learning model, our method accurately identifies UI controls from software screenshots and constructs a graph representing contextual and spatial relationships between the controls. This information is then used to find correspondences between UI controls within screenshots of different versions of the software. The resulting graph encapsulates the intricate layout of the UI and the underlying contextual relations, providing a holistic and context-aware model. This model is finally used to detect and highlight visual regressions in the UI. Comprehensive experiments on different datasets showed that our change detector can accurately detect visual software changes in various simple and complex test scenarios. Moreover, it outperformed pixel-wise comparison and region-based baselines by a large margin in more complex testing scenarios. This work not only contributes to the advancement of visual change detection but also holds practical implications, offering a robust solution for real-world software test automation challenges, enhancing reliability, and ensuring the seamless evolution of software interfaces.
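The graph-construction step described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the control names, coordinates, and the distance threshold below are hypothetical stand-ins for a real UI-control detector's output, and the paper's edges also carry semantic (not just spatial) relations.

```python
# Illustrative sketch (assumed, not from the paper): representing a UI
# screenshot as a graph. Nodes are detected controls (hard-coded bounding
# box centres standing in for a detector's output); edges connect controls
# whose centres lie within a proximity threshold.
from dataclasses import dataclass
from itertools import combinations

@dataclass(frozen=True)
class Control:
    name: str   # control identifier, e.g. "login_button" (hypothetical)
    x: float    # centre x of the bounding box, in pixels
    y: float    # centre y of the bounding box, in pixels

def build_ui_graph(controls, max_dist=150.0):
    """Return (nodes, edges); edges link spatially adjacent controls."""
    edges = set()
    for a, b in combinations(controls, 2):
        dist = ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
        if dist <= max_dist:
            edges.add(frozenset((a.name, b.name)))
    return {c.name for c in controls}, edges

controls = [
    Control("username_box", 100, 50),
    Control("password_box", 100, 120),
    Control("login_button", 100, 200),
    Control("help_link", 600, 400),   # far from the login cluster
]
nodes, edges = build_ui_graph(controls)
# The three login-form controls are mutually linked; help_link is isolated.
```

A real pipeline would replace the hard-coded controls with detector output and enrich edges with relation types (alignment, containment, labeling), which is what makes the resulting model "context-aware."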
Problem

Research questions and friction points this paper is trying to address.

Detecting contextual UI changes in software test automation
Overcoming limitations of pixel-wise visual comparison methods
Modeling spatial relationships between interface elements using graphs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph-based approach models UI element relationships
Machine learning detects UI controls from screenshots
Recursive similarity computation combines multiple change cues
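The cross-version correspondence idea behind these contributions can be sketched as follows. The paper uses graph alignment with GNN inference; this hedged example substitutes a simple greedy attribute matcher (type match plus positional closeness) purely to illustrate what "finding correspondence between UI controls of two versions" means. All control IDs, attributes, and the threshold are assumptions.

```python
# Hedged sketch (assumed, not the paper's method): greedily pair each
# control in the old screenshot with its most similar unmatched control
# in the new screenshot. Old controls left unmatched are candidate
# removals; new controls left unmatched are candidate additions.
def similarity(a, b):
    """Score two controls: 0 on type mismatch, else higher when closer."""
    if a["type"] != b["type"]:
        return 0.0
    dist = ((a["x"] - b["x"]) ** 2 + (a["y"] - b["y"]) ** 2) ** 0.5
    return 1.0 / (1.0 + dist)

def match_controls(old, new, threshold=0.01):
    """Return a mapping {old_id: new_id} of best greedy matches."""
    matches, used = {}, set()
    for o_id, o in old.items():
        best_id, best_score = None, threshold
        for n_id, n in new.items():
            if n_id in used:
                continue
            s = similarity(o, n)
            if s > best_score:
                best_id, best_score = n_id, s
        if best_id is not None:
            matches[o_id] = best_id
            used.add(best_id)
    return matches

old = {"ok": {"type": "button", "x": 100, "y": 200},
       "cancel": {"type": "button", "x": 220, "y": 200}}
new = {"ok2": {"type": "button", "x": 105, "y": 202},
       "cancel2": {"type": "button", "x": 225, "y": 205}}
# Each old button pairs with the nearby same-type button in the new UI.
```

A production matcher would use graph structure (neighbourhood agreement) rather than raw positions alone, so that a control that merely shifted with its surrounding layout is still matched rather than flagged as a change.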
M. Moradi
AI Research, Tricentis, Vienna, Austria
Ke Yan
AI Research, Tricentis, Sydney, Australia
David Colwell
Senior Lecturer, School of Banking and Finance, the University of New South Wales
mathematical finance, continuous-time finance, derivatives pricing
Rhona Asgari
AI Research, Tricentis, Vienna, Austria