🤖 AI Summary
This work addresses the inefficiencies in natural language processing (NLP) tools for requirements engineering (RE)—notably redundant development, poor interoperability, and limited maintainability—that lead to significant resource waste. To tackle these challenges, the paper presents the first systematic vision and research roadmap for a software reference architecture tailored to the NLP4RE domain. Following an established methodological framework for reference architecture development and stakeholder-driven requirements engineering, the authors conducted a focus group session to elicit 36 generic system requirements. This foundational effort aims to move NLP4RE tools away from monolithic systems toward a modular, reusable, and interoperable ecosystem, thereby establishing a solid basis for future architectural designs and sustainable toolchain development.
📝 Abstract
Natural Language Processing (NLP) tools support requirements engineering (RE) tasks such as requirements elicitation, classification, and validation. However, they are often developed from scratch despite functional overlaps, and abandoned after publication. This lack of interoperability and maintenance incurs unnecessary development effort, impedes tool comparison and benchmarking, complicates documentation, and diminishes the long-term sustainability of NLP4RE tools. To address these issues, we postulate a vision to transition from monolithic NLP4RE tools to an ecosystem of reusable, interoperable modules. We outline a research roadmap towards a software reference architecture (SRA) to realize this vision, elaborated following a standard methodological framework for SRA development. As an initial step, we conducted a stakeholder-driven focus group session to elicit generic system requirements for NLP4RE tools. This activity resulted in 36 key system requirements, further motivating the need for a dedicated SRA. Overall, the proposed vision, roadmap, and initial contribution pave the way towards improved development, reuse, and long-term maintenance of NLP4RE tools.