🤖 AI Summary
Standard LSTM networks lack input-to-state stability (ISS) guarantees, limiting their reliability when modeling nonlinear thermal systems. Method: This paper establishes a sufficient condition for ISS with respect to the infinity norm (ISS∞) for LSTMs that depends on fewer network parameters than prior conditions, enabling a more concise stability analysis. Building on this, the authors propose an ISS∞-promoted LSTM training scheme combining a stability-weighted loss function with an adaptive early-stopping mechanism. Contribution/Results: Evaluated on a data-driven thermal system modeling task, the ISS∞-promoted LSTM achieves higher prediction accuracy than standard LSTM and GRU networks, a physics-based model, and an ISS∞-promoted GRU. These results demonstrate the practical benefit of embedding ISS∞ constraints into recurrent architectures. The work provides both theoretical foundations and a practical framework for trustworthy, stability-guaranteed dynamic modeling with deep neural networks.
📝 Abstract
Recurrent Neural Networks (RNNs) have shown remarkable performance in system identification, particularly for nonlinear dynamical systems such as thermal processes. However, stability remains a critical challenge in practical applications: although the underlying process may be intrinsically stable, there is no guarantee that the resulting RNN model captures this behavior. This paper addresses the stability issue by deriving a sufficient condition for Input-to-State Stability based on the infinity norm (ISS$_{\infty}$) for Long Short-Term Memory (LSTM) networks. The obtained condition depends on fewer network parameters than those in prior works. An ISS$_{\infty}$-promoted training strategy is developed, incorporating a penalty term in the loss function that encourages stability together with an ad hoc early-stopping approach. The quality of LSTM models trained via the proposed approach is validated on a thermal system case study, where the ISS$_{\infty}$-promoted LSTM outperforms both a physics-based model and an ISS$_{\infty}$-promoted Gated Recurrent Unit (GRU) network, as well as non-ISS$_{\infty}$-promoted LSTM and GRU networks.
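The abstract describes adding a stability-encouraging penalty term to the training loss. A minimal sketch of this idea is shown below; note that the specific norm condition used here (the sum of infinity norms of the recurrent weight matrices being below 1) is only an illustrative placeholder, not the paper's actual ISS$_{\infty}$ sufficient condition, and the function names and the weighting `lam` are hypothetical.

```python
import numpy as np

def iss_penalty(recurrent_weights, threshold=1.0):
    """Hinge-style penalty that is zero when a norm condition on the
    recurrent weight matrices holds, and grows linearly as it is violated.
    The condition (sum of infinity norms < threshold) is illustrative only;
    the paper derives its own ISS_inf sufficient condition."""
    norm_sum = sum(np.linalg.norm(W, ord=np.inf) for W in recurrent_weights)
    return max(0.0, norm_sum - threshold)

def stability_weighted_loss(y_pred, y_true, recurrent_weights, lam=0.1):
    """Prediction error (MSE) plus a weighted stability penalty,
    mirroring the 'penalty term in the loss function that encourages
    stability' described in the abstract."""
    mse = float(np.mean((np.asarray(y_pred) - np.asarray(y_true)) ** 2))
    return mse + lam * iss_penalty(recurrent_weights)
```

During training, minimizing this combined loss pushes the recurrent weights toward the region where the (here, placeholder) stability condition is satisfied, trading a small amount of fit for a stability guarantee.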