🤖 AI Summary
Balancing prediction accuracy, computational efficiency, and cross-scenario generalization remains challenging for short-term forecasting in dynamic energy systems. Method: This paper proposes an automated neural architecture search (NAS) framework tailored to energy time series. It introduces a lightweight, time-series-aware search space and a novel multi-objective optimization function that jointly penalizes multi-step prediction error and inference latency while rewarding generalization robustness. Crucially, it integrates NAS with ensemble learning to automatically discover and combine optimal lightweight architectures. Results: Extensive experiments demonstrate that the proposed method significantly outperforms state-of-the-art Transformer-based and pre-trained models on global multi-step short-term forecasting tasks. It achieves simultaneous improvements in prediction accuracy, inference speed, and cross-scenario generalization, thereby offering strong practical deployability for real-world energy system applications.
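The combination step mentioned above can be sketched with a simple averaging ensemble over the multi-step forecasts of several discovered models. This is only an illustrative sketch: plain averaging is one of the simplest combination rules, and the paper's actual ensemble scheme may weight or select members differently. The `ensemble_forecast` name and the callable-model interface (`model(history, horizon) -> list of floats`) are assumptions for the example.

```python
def ensemble_forecast(models, history, horizon):
    """Combine multi-step forecasts from several candidate models.

    Each model is assumed to be a callable mapping a history sequence
    and a horizon length to a list of `horizon` predictions. Plain
    per-step averaging is used here purely for illustration; the
    paper's combination rule may differ.
    """
    preds = [m(history, horizon) for m in models]
    # Average the member forecasts step by step across the horizon.
    return [sum(step) / len(preds) for step in zip(*preds)]


# Toy usage: two trivial "architectures" producing constant forecasts.
m1 = lambda history, h: [1.0] * h
m2 = lambda history, h: [3.0] * h
combined = ensemble_forecast([m1, m2], history=[0.5, 0.7], horizon=3)
```

Even this minimal form shows why ensembling lightweight members is attractive: each forecast is cheap to produce, so the combined prediction stays fast while smoothing out individual model errors.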
📝 Abstract
Short-term forecasting of energy generation in the dynamic energy sector demands both predictive accuracy and runtime efficiency under operational constraints, where timely and precise predictions are crucial. Manually configuring complex methods that produce accurate global multi-step predictions without becoming a computational bottleneck is time-consuming and prone to human error. A further intricacy arises from the temporal dynamics present in energy-related data. Additionally, generalization to unseen data is imperative for continuously deploying forecasting techniques over time. To overcome these challenges, we design a neural architecture search (NAS)-based framework for the automated discovery of time series models that balance computational efficiency, predictive performance, and generalization power for global, multi-step short-term forecasting of energy production time series. In particular, we introduce a search space consisting only of efficient components that can capture the distinctive patterns of energy time series. Furthermore, we formulate a novel objective function that accounts for generalization of performance in a temporal context and encourages maximal exploration of different regions of our high-dimensional search space. Results on energy production time series show that an ensemble of lightweight architectures discovered with NAS outperforms state-of-the-art techniques, such as Transformers, as well as pre-trained forecasting models, in terms of both efficiency and accuracy.
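A multi-objective NAS criterion of the kind described above can be sketched as a scalarized score that trades off multi-step error, measured inference latency, and a generalization proxy. Everything below is an assumption made for illustration: the function name `multi_objective_score`, the weights, the use of per-scenario MAE, and the use of the spread of scenario errors as a generalization penalty are not taken from the paper, which formulates its own objective.

```python
import statistics
import time


def multi_objective_score(model, scenarios, w_error=1.0, w_latency=0.1, w_gen=0.5):
    """Score a candidate architecture (lower is better).

    `scenarios` maps a scenario name to a list of (history, target)
    pairs, where `target` is the multi-step ground truth. The model is
    assumed to be a callable: model(history, horizon) -> predictions.
    Weights and the MAE/spread formulation are illustrative only.
    """
    per_scenario_error = []
    latencies = []
    for pairs in scenarios.values():
        errs = []
        for history, target in pairs:
            t0 = time.perf_counter()
            pred = model(history, len(target))  # multi-step forecast
            latencies.append(time.perf_counter() - t0)
            # Mean absolute error over the forecast horizon.
            errs.append(sum(abs(p - t) for p, t in zip(pred, target)) / len(target))
        per_scenario_error.append(sum(errs) / len(errs))
    mean_error = sum(per_scenario_error) / len(per_scenario_error)
    mean_latency = sum(latencies) / len(latencies)
    # Uneven accuracy across scenarios serves as a crude proxy for
    # poor cross-scenario generalization.
    gen_penalty = statistics.pstdev(per_scenario_error)
    return w_error * mean_error + w_latency * mean_latency + w_gen * gen_penalty


# Toy usage: a naive persistence forecaster evaluated on two scenarios.
naive = lambda history, h: [history[-1]] * h
scenarios = {
    "solar": [([1.0, 2.0, 3.0], [3.0, 3.0])],
    "wind": [([5.0, 4.0], [4.0, 4.0])],
}
score = multi_objective_score(naive, scenarios)
```

A search procedure would minimize this score over candidate architectures, so that a model is rewarded only when it is simultaneously accurate, fast, and consistent across scenarios, mirroring the three criteria the abstract names.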