🤖 AI Summary
Most existing traffic flow forecasting methods rely on the Markov assumption and jointly model the flow generation and transition processes, thereby neglecting the non-Markovian, multi-periodic nature of node-level traffic generation. To address this, we propose a generation–transition dual-path paradigm that, for the first time, explicitly decouples node-level flow generation (non-Markovian, multi-periodic) from graph-level flow transition (spatiotemporal interaction). We design EMBSFormer, a lightweight and efficient architecture featuring a multi-branch similarity analysis module that dynamically captures multi-periodic generation patterns, combined with spatiotemporal self-attention, graph neural networks, and temporal convolutions. Evaluated on three real-world datasets, EMBSFormer achieves state-of-the-art performance in both short-term and long-term forecasting; a 93K-parameter variant uses only 18% of GMAN's parameter count, demonstrating a superior accuracy–efficiency trade-off.
📝 Abstract
Traffic flow prediction plays an important role in Intelligent Transportation Systems, supporting both traffic management and urban planning, and there has been extensive successful work in this area. However, existing approaches focus only on modelling the flow transition process and ignore the flow generation process, which manifests in two ways: (i) the models rely on Markovian assumptions, ignoring the multi-periodicity of flow generation at individual nodes; (ii) a single structure is designed to encode both the transition and generation processes, ignoring the differences between them. To address these problems, we propose an Effective Multi-Branch Similarity Transformer for Traffic Flow Prediction, namely EMBSFormer. Through data analysis, we find that the factors affecting traffic flow include node-level traffic generation and graph-level traffic transition, which describe the multi-periodicity and the interaction patterns of nodes, respectively. Specifically, to capture traffic generation patterns, we propose a similarity analysis module that supports multi-branch encoding to dynamically expand significant cycles. For traffic transition, we employ temporal and spatial self-attention mechanisms to maintain global node interactions, and use a GNN and temporal convolution to model local node interactions, respectively. Model performance is evaluated on three real-world datasets on both long-term and short-term prediction tasks. Experimental results show that EMBSFormer outperforms the baselines on both tasks. Moreover, compared to models based on flow transition modelling (e.g., GMAN, 513K parameters), a 93K-parameter variant of EMBSFormer uses only 18% of the parameters while achieving the same performance.
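The generation–transition split described above can be sketched schematically. The following is a minimal NumPy illustration, not the authors' implementation: all function names, shapes, and the similarity scoring scheme are our assumptions. It shows a node-level generation branch that weights several periodic lags (e.g., daily and weekly) by their similarity to the most recent observation, and a graph-level transition branch that applies scaled dot-product self-attention over nodes for global interaction.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def generation_branch(history, lags):
    """Node-level generation (hypothetical multi-branch scheme):
    weight each periodic lag by its similarity to the latest flow.
    history: (T, N) past flow per node; lags: period lengths in steps."""
    recent = history[-1]                                  # (N,) latest flow
    branches = np.stack([history[-lag] for lag in lags])  # (B, N)
    scores = -np.abs(branches - recent)                   # per-node similarity
    weights = softmax(scores, axis=0)                     # normalize over branches
    return (weights * branches).sum(axis=0)               # (N,) generation estimate

def transition_branch(x, Wq, Wk, Wv):
    """Graph-level transition: scaled dot-product self-attention
    over nodes (global interaction). x: (N, D) node features."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    att = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)
    return att @ v                                        # (N, D)

rng = np.random.default_rng(0)
T, N, D = 24 * 7 * 2, 5, 8                                # two weeks, hourly
history = rng.random((T, N))
gen = generation_branch(history, lags=[24, 24 * 7])       # daily, weekly lags
trans = transition_branch(rng.random((N, D)),
                          *(rng.random((D, D)) for _ in range(3)))
print(gen.shape, trans.shape)                             # (5,) (5, 8)
```

In the paper's full model the two paths are learned jointly; here the branch weights are a fixed similarity heuristic purely to make the decoupling concrete.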