🤖 AI Summary
To address the information loss and accuracy degradation in State-of-Health (SoH) estimation caused by irregular sampling and variable-length sequences, this paper proposes the Time-Informed Dynamic Sequence Inverted Transformer (TIDSIT). TIDSIT models raw, non-uniformly spaced time-series data directly by combining continuous-time embeddings with padded sequences and temporal attention, eliminating hand-crafted feature extraction and preserving the full sequence information. Evaluated on the NASA battery degradation dataset, TIDSIT keeps SoH prediction error below 0.58%, a reduction of over 50% relative to existing state-of-the-art models, and generalizes across diverse aging patterns. This work offers an accurate and generalizable approach to battery health monitoring for electric vehicles and energy storage systems, with promise for other health-monitoring tasks involving irregular time-series data.
📝 Abstract
The rapid adoption of battery-powered vehicles and energy storage systems over the past decade has made battery health monitoring increasingly critical. Batteries play a central role in the efficiency and safety of these systems, yet they inevitably degrade over time due to repeated charge-discharge cycles. This degradation reduces energy efficiency and can cause overheating, posing significant safety concerns. Accurate estimation of a battery's State of Health (SoH) is therefore essential for ensuring operational reliability and safety. Several machine learning architectures, such as LSTMs, transformers, and encoder-based models, have been proposed to estimate SoH from discharge cycle data. However, these models struggle with the irregularities inherent in real-world measurements: discharge readings are often recorded at non-uniform intervals, and the lengths of discharge cycles vary significantly. To cope with these irregularities, most existing approaches extract features from the sequences rather than processing them in full, which introduces information loss and compromises accuracy. To overcome these challenges, we propose a novel architecture: the Time-Informed Dynamic Sequence Inverted Transformer (TIDSIT). TIDSIT incorporates continuous time embeddings to effectively represent irregularly sampled data and utilizes padded sequences with temporal attention mechanisms to manage variable-length inputs without discarding sequence information. Experimental results on the NASA battery degradation dataset show that TIDSIT significantly outperforms existing models, achieving over a 50% reduction in prediction error and maintaining an SoH prediction error below 0.58%. Furthermore, the architecture is generalizable and holds promise for broader applications in health monitoring tasks involving irregular time-series data.
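The two mechanisms named in the abstract can be illustrated with a minimal sketch (not the authors' implementation): a sinusoidal embedding applied to real-valued, irregularly spaced timestamps, and a padding mask so that attention over a padded, variable-length discharge cycle ignores the padded steps. All names, dimensions, and timestamp values below are illustrative assumptions.

```python
import numpy as np

def time_embedding(t, dim=8):
    """Sinusoidal embedding of real-valued timestamps: shape [n] -> [n, dim].

    Because the input is the actual (possibly irregular) timestamp rather
    than an integer position, unevenly spaced samples get distinct codes.
    """
    freqs = 1.0 / (10000.0 ** (np.arange(0, dim, 2) / dim))
    angles = np.outer(t, freqs)                       # [n, dim/2]
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

def masked_attention(q, k, v, mask):
    """Single-head scaled dot-product attention; mask[i]=False marks padding."""
    scores = q @ k.T / np.sqrt(k.shape[-1])           # [n, n]
    scores[:, ~mask] = -np.inf                        # padded keys get zero weight
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

# Irregularly sampled discharge cycle padded to length 6 (last 2 steps padding)
t = np.array([0.0, 0.7, 1.9, 4.2, 0.0, 0.0])
mask = np.array([True, True, True, True, False, False])
x = time_embedding(t)                                 # [6, 8]
out = masked_attention(x, x, x, mask)                 # padding does not leak in
```

The key design point is that masking happens inside the softmax, so no sequence information from the real (unpadded) steps is discarded, while padded steps contribute exactly zero attention weight.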