🤖 AI Summary
Existing graph neural network (GNN)-based approaches for multivariate time series forecasting suffer from two key limitations: (1) neighbor aggregation overlooks information diversity, introducing redundancy; and (2) reliance on a single temporal scale impedes adaptive integration of multi-granularity time-series features. To address these issues, we propose DIMIGNN—a Diversity-aware, Multi-scale Graph Neural Network. Its core innovations are: (1) a diversity-aware neighbor selection mechanism that jointly leverages similarity constraints and information entropy to balance neighborhood relevance and heterogeneity; and (2) a dynamic multi-scale fusion module that adaptively integrates outputs from multi-scale temporal convolutions via attention-driven weight learning. Extensive experiments on multiple real-world datasets demonstrate that DIMIGNN consistently outperforms state-of-the-art baselines in forecasting accuracy, achieving robust improvements in modeling both inter-variable dependencies and multi-granularity temporal dynamics.
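The paper does not spell out the exact selection formula here, but the idea of balancing neighborhood relevance against heterogeneity can be sketched as a greedy procedure: score each candidate neighbor by its similarity to the target variable plus an information-entropy bonus, while penalizing similarity to neighbors already chosen. All function names, the histogram-entropy estimate, and the weighting `lam` below are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def select_diverse_neighbors(X, target, k=3, lam=0.5, bins=10):
    """Greedy diversity-aware neighbor selection (illustrative sketch only).

    X: (num_vars, T) array of time series; `target`: index of the variable
    whose neighborhood we build. Candidates are scored by cosine similarity
    to the target plus an entropy bonus (weight `lam`, assumed), minus the
    maximum similarity to any already-selected neighbor (redundancy penalty).
    """
    def cos(a, b):
        # Cosine similarity with a small epsilon to avoid division by zero.
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

    def entropy(x):
        # Crude information-entropy estimate via a value histogram.
        counts, _ = np.histogram(x, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        return float(-(p * np.log(p)).sum())

    candidates = [i for i in range(X.shape[0]) if i != target]
    selected = []
    while candidates and len(selected) < k:
        def score(i):
            relevance = cos(X[target], X[i]) + lam * entropy(X[i])
            redundancy = max((cos(X[i], X[j]) for j in selected), default=0.0)
            return relevance - redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

Because the redundancy term grows as the selected set fills with mutually similar series, the greedy pass trades off relevance against diversity at each step, which mirrors the stated goal of avoiding redundant aggregation.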
📝 Abstract
Recently, numerous deep models have been proposed to enhance the performance of multivariate time series (MTS) forecasting. Among them, Graph Neural Network (GNN)-based methods have shown great potential due to their capability to explicitly model inter-variable dependencies. However, these methods often overlook the diversity of information among neighbors, which may lead to redundant information aggregation. In addition, their final prediction typically relies solely on the representation from a single temporal scale. To tackle these issues, we propose a Graph Neural Network with Diversity-aware Neighbor Selection and Dynamic Multi-scale Fusion (DIMIGNN). DIMIGNN introduces a Diversity-aware Neighbor Selection Mechanism (DNSM) to ensure that each variable shares high informational similarity with its neighbors while maintaining diversity among the neighbors themselves. Furthermore, a Dynamic Multi-Scale Fusion Module (DMFM) dynamically adjusts the contributions of predictions from different temporal scales to the final forecast. Extensive experiments on real-world datasets demonstrate that DIMIGNN consistently outperforms prior methods.
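The attention-driven fusion described for the DMFM can be sketched minimally: given per-scale predictions, compute one score per scale, turn the scores into attention weights with a softmax, and return the weighted sum. The scoring vector `w` and the dot-product scoring rule below are hypothetical stand-ins for the module's learned parameters, not the paper's actual architecture.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def dynamic_multiscale_fusion(preds, w):
    """Attention-weighted fusion of per-scale predictions (sketch).

    preds: (num_scales, horizon) forecasts from different temporal scales;
    w:     (num_scales, horizon) scoring parameters (assumed learnable).
    Each scale gets a scalar score, the softmax turns scores into weights
    summing to 1, and the fused forecast is the weighted sum over scales.
    """
    scores = (preds * w).sum(axis=1)   # one scalar score per scale
    alpha = softmax(scores)            # attention weights over scales
    return alpha @ preds, alpha        # (horizon,) fused forecast + weights
```

With zero scores the weights are uniform and the fusion reduces to a plain average over scales; the learned scores are what let the module shift weight toward whichever temporal granularity is most informative for the current input.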