NEVO-GSPT: Population-Based Neural Network Evolution Using Inflate and Deflate Operators

📅 2026-01-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work tackles the high computational cost and uncontrolled semantic changes of neural architecture evolution by introducing geometric semantic operators (GSOs) into the evolutionary process. It proposes Inflate/Deflate operations that form an efficient evolutionary framework able to explore the architecture space effectively. A newly designed DGSM operator dynamically reduces network size while preserving semantic equivalence, and population training is significantly accelerated by computing semantics only for newly added components. Evaluated on four regression benchmarks, the method evolves compact yet high-performing networks that match or surpass state-of-the-art approaches, including standard neural networks, SLIM-GSGP, TensorNEAT, and SLM.

📝 Abstract
Evolving neural network architectures is a computationally demanding process. Traditional methods often require an extensive search through large architectural spaces and offer limited understanding of how structural modifications influence model behavior. This paper introduces NEVO-GSPT, a novel neuroevolution algorithm based on two key innovations. First, we adapt geometric semantic operators (GSOs) from genetic programming to neural network evolution, ensuring that architectural changes produce predictable effects on network semantics within a unimodal error surface. Second, we introduce a novel operator (DGSM) that enables controlled reduction of network size while maintaining the semantic properties of GSOs. Unlike traditional approaches, NEVO-GSPT's efficient evaluation mechanism, which only requires computing the semantics of newly added components, allows for efficient population-based training, resulting in a comprehensive exploration of the search space at a fraction of the computational cost. Experimental results on four regression benchmarks show that NEVO-GSPT consistently evolves compact neural networks that achieve performance comparable to or better than established methods in the literature, such as standard neural networks, SLIM-GSGP, TensorNEAT, and SLM.
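The inflate/deflate mechanics described in the abstract can be sketched roughly as follows. This is a minimal illustration only, assuming an additive, SLIM-GSGP-style representation in which each inflate step appends a bounded random perturbation to the network's semantics (its output vector on the training inputs) and each deflate step removes one previously added block; the function names (`inflate`, `deflate`, `random_module_semantics`) and the mutation form `ms * (t1 - t2)` are our illustrative assumptions, not the paper's actual operators.

```python
import numpy as np

def random_module_semantics(X, rng):
    """Semantics (outputs on the training inputs) of one random hidden unit.
    A bounded activation keeps the perturbation's range controlled."""
    w = rng.normal(size=X.shape[1])
    b = rng.normal()
    return np.tanh(X @ w + b)

def inflate(semantics, blocks, X, ms, rng):
    """Inflate: append a bounded random perturbation ms * (t1 - t2).
    Only the NEW module is evaluated; all cached semantics are reused."""
    delta = ms * (random_module_semantics(X, rng) - random_module_semantics(X, rng))
    blocks.append(delta)
    return semantics + delta, blocks

def deflate(semantics, blocks, rng):
    """Deflate: drop one previously added block, shrinking the network.
    Its cached contribution is subtracted, so nothing is re-evaluated."""
    i = rng.integers(len(blocks))
    delta = blocks.pop(i)
    return semantics - delta, blocks

# Tiny usage example on random regression inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))
sem = np.zeros(32)            # semantics of the (initially empty) base network
blocks = []
for _ in range(5):
    sem, blocks = inflate(sem, blocks, X, ms=0.1, rng=rng)
sem, blocks = deflate(sem, blocks, rng)
assert np.allclose(sem, np.sum(blocks, axis=0))  # cached blocks stay consistent
```

The point of the sketch is the cost profile claimed in the abstract: fitness evaluation after a mutation touches only the newly added (or removed) block's semantics, never the whole network, which is what makes population-based training affordable.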
Problem

Research questions and friction points this paper is trying to address.

neural architecture evolution
computational efficiency
structural modification
search space exploration
neuroevolution
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neuroevolution
Geometric Semantic Operators
Network Pruning
Population-Based Evolution
Efficient Evaluation
Davide Farinati
NOVA Information Management School (NOVA IMS), Universidade Nova de Lisboa, Campus de Campolide, 1070-312 Lisboa, Portugal
Frederico J.J.B. Santos
Department of Engineering and Architecture, University of Trieste, Italy
L. Vanneschi
NOVA Information Management School (NOVA IMS), Universidade Nova de Lisboa, Campus de Campolide, 1070-312 Lisboa, Portugal
Mauro Castelli
Full Professor at NOVA IMS, Universidade Nova de Lisboa
Genetic Programming · Machine Learning · Artificial Intelligence