Neural Architecture Search for global multi-step Forecasting of Energy Production Time Series

📅 2025-10-27
🤖 AI Summary
Balancing prediction accuracy, computational efficiency, and cross-scenario generalization remains challenging for short-term forecasting in dynamic energy systems. Method: This paper proposes an automated neural architecture search (NAS) framework tailored to energy time series. It introduces a lightweight, time-series-aware search space and a novel multi-objective optimization function that jointly penalizes multi-step prediction error, inference latency, and poor cross-scenario generalization. Crucially, it integrates NAS with ensemble learning to automatically discover and combine optimal lightweight architectures. Results: Extensive experiments demonstrate that the proposed method significantly outperforms state-of-the-art Transformer-based and pre-trained models on global multi-step short-term forecasting tasks. It achieves simultaneous improvements in prediction accuracy, inference speed, and cross-scenario generalization, thereby offering strong practical deployability for real-world energy system applications.
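The multi-objective search described above could be scalarized into a single fitness score. The sketch below is an illustrative assumption, not the paper's actual formulation: the weights, the use of the train/validation gap as a generalization proxy, and all function names are hypothetical.

```python
# Hypothetical NAS fitness for multi-step time series forecasting.
# The weights (w_err, w_lat, w_gen) and the train/validation gap as a
# generalization proxy are illustrative assumptions, not the paper's method.
import time
import numpy as np

def multi_step_mse(model, windows, horizons):
    """Mean squared error over all forecast steps of every input window."""
    errs = [np.mean((np.asarray(model(x)) - y) ** 2)
            for x, y in zip(windows, horizons)]
    return float(np.mean(errs))

def nas_fitness(model, train, val, w_err=1.0, w_lat=0.1, w_gen=0.5):
    """Lower is better: jointly penalizes validation error, per-window
    inference latency, and the train/validation error gap."""
    train_err = multi_step_mse(model, *train)
    val_err = multi_step_mse(model, *val)
    start = time.perf_counter()
    for x, _ in zip(*val):          # time inference over the validation set
        model(x)
    latency = (time.perf_counter() - start) / len(val[0])
    gen_gap = max(0.0, val_err - train_err)  # proxy for poor generalization
    return w_err * val_err + w_lat * latency + w_gen * gen_gap
```

A candidate architecture with lower validation error, faster inference, and a smaller train/validation gap receives a lower (better) fitness, so the search can rank candidates on all three criteria at once.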

📝 Abstract
The dynamic energy sector requires both predictive accuracy and runtime efficiency for short-term forecasting of energy generation under operational constraints, where timely and precise predictions are crucial. Manually configuring complex methods that can generate accurate global multi-step predictions without suffering from a computational bottleneck is time-consuming and prone to human error. A further intricacy arises from the temporal dynamics present in energy-related data. Additionally, generalization to unseen data is imperative for continuously deploying forecasting techniques over time. To overcome these challenges, in this research, we design a neural architecture search (NAS)-based framework for the automated discovery of time series models that strike a balance between computational efficiency, predictive performance, and generalization power for the global, multi-step short-term forecasting of energy production time series. In particular, we introduce a search space consisting only of efficient components, which can capture distinctive patterns of energy time series. Furthermore, we formulate a novel objective function that accounts for performance generalization in temporal context and the maximal exploration of different regions of our high-dimensional search space. The results obtained on energy production time series show that an ensemble of lightweight architectures discovered with NAS outperforms state-of-the-art techniques, such as Transformers, as well as pre-trained forecasting models, in terms of both efficiency and accuracy.
Problem

Research questions and friction points this paper is trying to address.

Automating neural architecture design for energy production time series forecasting
Balancing computational efficiency with predictive accuracy in forecasting
Ensuring model generalization across temporal dynamics in energy data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Automated neural architecture search for energy forecasting
Efficient search space capturing energy time series patterns
Novel objective function ensuring generalization and exploration
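The final ensemble step, combining several discovered lightweight architectures into one forecaster, might in its simplest form average their multi-step predictions. The equal-weight averaging below is a minimal sketch under that assumption; the paper does not specify this particular combination rule, and validation-weighted schemes are an obvious alternative.

```python
# Minimal sketch of combining NAS-discovered lightweight forecasters.
# Equal weighting is an assumption; weights could instead be tuned on
# validation error.
import numpy as np

def ensemble_forecast(models, window):
    """Average the multi-step forecasts of several models for one
    input window; each model maps a window to a horizon-length array."""
    preds = np.stack([np.asarray(m(window)) for m in models])
    return preds.mean(axis=0)
```

Averaging tends to cancel out the idiosyncratic errors of individual lightweight models, which is one plausible reason the discovered ensemble can match or beat a single large architecture.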