Fast-Powerformer: A Memory-Efficient Transformer for Accurate Mid-Term Wind Power Forecasting

📅 2025-04-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the dual challenge of high accuracy and low computational overhead in mid-term wind power forecasting, this paper proposes Fast-Powerformer, a lightweight and efficient Transformer-based model. Methodologically, it introduces an input transposition mechanism tailored to the temporal characteristics of wind power series; designs a frequency-enhanced channel attention module (FECAM) to improve meteorological feature modeling; and integrates lightweight LSTM embeddings with Reformer's locality-sensitive hashing (LSH) attention to substantially reduce memory footprint and computational complexity. Extensive experiments on multiple real-world wind farm datasets demonstrate that Fast-Powerformer outperforms state-of-the-art methods in prediction accuracy, achieves a 3.2× inference speedup, and reduces GPU memory consumption by 57%, enabling robust real-time deployment.
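The paper does not reproduce Reformer's LSH attention in detail here, but its complexity saving can be illustrated with a toy sketch: tokens are hashed by random projections, and attention is computed only within each hash bucket instead of over the full O(T²) score matrix. All names and sizes below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lsh_buckets(x, n_planes=4):
    """Random-projection LSH: tokens whose projections share a sign pattern
    land in the same bucket (a simplified stand-in for Reformer's
    multi-round angular LSH)."""
    planes = rng.standard_normal((x.shape[-1], n_planes))
    signs = (x @ planes > 0).astype(int)          # (tokens, n_planes)
    return signs @ (1 << np.arange(n_planes))     # integer bucket ids

def softmax(s):
    e = np.exp(s - s.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

tokens = rng.standard_normal((64, 16))            # 64 tokens, model dim 16
buckets = lsh_buckets(tokens)

# Full attention would build a 64x64 score matrix; bucketed attention
# only builds one small block per bucket.
out = np.zeros_like(tokens)
for b in np.unique(buckets):
    idx = np.where(buckets == b)[0]
    blk = tokens[idx]
    scores = blk @ blk.T / np.sqrt(blk.shape[-1])
    out[idx] = softmax(scores) @ blk
```

Nearby tokens (in angular distance) tend to share a bucket, so most of the large attention weights are preserved while the quadratic cost is avoided.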

📝 Abstract
Wind power forecasting (WPF), as a significant research topic within renewable energy, plays a crucial role in enhancing the security, stability, and economic operation of power grids. However, due to the high stochasticity of meteorological factors (e.g., wind speed) and significant fluctuations in wind power output, mid-term wind power forecasting faces a dual challenge of maintaining high accuracy and computational efficiency. To address these issues, this paper proposes an efficient and lightweight mid-term wind power forecasting model, termed Fast-Powerformer. The proposed model is built upon the Reformer architecture, incorporating structural enhancements such as a lightweight Long Short-Term Memory (LSTM) embedding module, an input transposition mechanism, and a Frequency Enhanced Channel Attention Mechanism (FECAM). These improvements enable the model to strengthen temporal feature extraction, optimize dependency modeling across variables, significantly reduce computational complexity, and enhance sensitivity to periodic patterns and dominant frequency components. Experimental results conducted on multiple real-world wind farm datasets demonstrate that the proposed Fast-Powerformer achieves superior prediction accuracy and operational efficiency compared to mainstream forecasting approaches. Furthermore, the model exhibits fast inference speed and low memory consumption, highlighting its considerable practical value for real-world deployment scenarios.
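The input transposition mentioned above can be sketched in a few lines: instead of treating each time step as a token (an O(T²) attention map), the series matrix is transposed so each meteorological variable becomes a token, and attention models cross-variable dependencies with only a V×V map. This is a toy illustration with identity projections, not the paper's actual layer.

```python
import numpy as np

def attention(x):
    """Single-head self-attention with identity Q/K/V projections (toy)."""
    scores = x @ x.T / np.sqrt(x.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ x

rng = np.random.default_rng(1)
x = rng.standard_normal((96, 6))   # (time steps, meteorological variables)

# Conventional Transformer: tokens are time steps -> 96x96 score matrix.
out_temporal = attention(x)

# Transposed input: tokens are variables -> only a 6x6 score matrix,
# directly modeling dependencies across variables.
out_variate = attention(x.T).T
```

Since the number of variables is typically far smaller than the sequence length, the transposed formulation is both cheaper and better matched to cross-variable dependency modeling.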
Problem

Research questions and friction points this paper is trying to address.

Addresses mid-term wind power forecasting accuracy and efficiency
Reduces computational complexity in transformer-based forecasting models
Enhances sensitivity to periodic patterns in wind data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lightweight LSTM embedding module
Input transposition mechanism
Frequency Enhanced Channel Attention Mechanism
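The FECAM listed above is not specified layer-by-layer in this summary; a rough sketch of the general frequency-enhanced channel-attention idea is: take a DCT of each variable's series, feed the frequency-magnitude profile through a small MLP, and use a sigmoid gate to reweight channels by their dominant frequency content. All shapes and weights below are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def dct_ii(x):
    """Naive DCT-II along the last (time) axis."""
    n = x.shape[-1]
    t = np.arange(n)
    k = np.arange(n)
    basis = np.cos(np.pi / n * (t[None, :] + 0.5) * k[:, None])  # (n, n)
    return x @ basis.T

def fecam_gate(x, w1, w2):
    """Per-channel attention weights from DCT energy.

    x: (channels, time); w1, w2: small MLP weights (hypothetical shapes).
    """
    energy = np.abs(dct_ii(x))            # magnitude per frequency bin
    h = np.maximum(energy @ w1, 0.0)      # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ w2)))  # sigmoid gate, one per channel

C, T, H = 4, 96, 8
x = rng.standard_normal((C, T))
w1 = rng.standard_normal((T, H)) * 0.1
w2 = rng.standard_normal((H, 1)) * 0.1
gate = fecam_gate(x, w1, w2)              # (C, 1), each in (0, 1)
reweighted = x * gate                     # channel-attended input
```

Channels whose spectra carry strong periodic components receive larger gates, which matches the stated goal of sharpening sensitivity to periodic patterns and dominant frequencies.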
Mingyi Zhu
Department of Artificial Intelligence and Automation, School of Electrical Engineering and Automation, Wuhan University, Wuhan 430072, China
Zhaoxin Li
Georgia Institute of Technology
Robot Learning · Explainable Artificial Intelligence
Qiao Lin
Department of Artificial Intelligence and Automation, School of Electrical Engineering and Automation, Wuhan University, Wuhan 430072, China
Li Ding
Department of Artificial Intelligence and Automation, School of Electrical Engineering and Automation, Wuhan University, Wuhan 430072, China