HGTS-Former: Hierarchical HyperGraph Transformer for Multivariate Time Series Analysis

πŸ“… 2025-08-04
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
To address the challenges of modeling dynamic evolution and high-order inter-variable couplings in high-dimensional multivariate time series, this paper proposes HGTS-Former, the first hierarchical Transformer-based model integrating hypergraph structural priors. Its core innovation is a dynamic hierarchical hypergraph that explicitly captures intra-channel temporal patterns and fine-grained, high-order cross-variable dependencies; an EdgeToNode module is further introduced to provide a learnable mapping from hyperedge features to node representations. The architecture combines patch embeddings, multi-head self-attention, and a feed-forward enhancement stage. Extensive experiments across two task categories and eight benchmark datasets show that HGTS-Former significantly outperforms state-of-the-art methods, validating its capability to model complex temporal dependencies and its strong generalizability.

πŸ“ Abstract
Multivariate time series analysis has long been one of the key research topics in artificial intelligence. However, analyzing complex time series data remains a challenging and unresolved problem due to its high dimensionality, dynamic nature, and complex interactions among variables. Inspired by the strong structural modeling capability of hypergraphs, this paper proposes a novel hypergraph-based time series Transformer backbone, termed HGTS-Former, to address multivariate coupling in time series data. Specifically, given a multivariate time series signal, we first normalize each channel and embed each patch into a token. We then adopt multi-head self-attention to enhance the temporal representation of each patch. Hierarchical hypergraphs are constructed to aggregate the temporal patterns within each channel and the fine-grained relations between different variables. After that, we convert hyperedge features into node features through the EdgeToNode module and adopt a feed-forward network to further enhance the output features. Extensive experiments conducted on two multivariate time series tasks and eight datasets fully validate the effectiveness of our proposed HGTS-Former. The source code will be released at https://github.com/Event-AHU/Time_Series_Analysis.
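The front end of the pipeline described in the abstract (per-channel normalization followed by splitting each channel into patch tokens) can be sketched in plain Python. The function names and the non-overlapping patch scheme here are illustrative assumptions, not the authors' released implementation:

```python
# Illustrative sketch of the patching step from the abstract: each channel
# of a multivariate series is normalized, then cut into fixed-length
# patches that serve as tokens. All names are hypothetical.

def normalize(channel):
    """Zero-mean, unit-variance normalization of one channel."""
    n = len(channel)
    mean = sum(channel) / n
    var = sum((x - mean) ** 2 for x in channel) / n
    std = var ** 0.5 or 1.0  # guard against a constant channel
    return [(x - mean) / std for x in channel]

def patchify(channel, patch_len):
    """Split a channel into non-overlapping patches (tokens)."""
    return [channel[i:i + patch_len]
            for i in range(0, len(channel) - patch_len + 1, patch_len)]

def tokenize(series, patch_len=4):
    """series: list of channels -> list of per-channel token lists."""
    return [patchify(normalize(ch), patch_len) for ch in series]

# Two channels of length 8 -> 2 patches of length 4 per channel.
series = [[1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0],
          [8.0, 6.0, 4.0, 2.0, 0.0, -2.0, -4.0, -6.0]]
tokens = tokenize(series)
```

In the actual model these patch tokens would be linearly projected to an embedding dimension and fed to the multi-head self-attention stage; the sketch stops at tokenization.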
Problem

Research questions and friction points this paper is trying to address.

Analyzing high-dimensional dynamic multivariate time series
Modeling complex interactions among time series variables
Enhancing temporal representation with hierarchical hypergraphs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hierarchical hypergraphs model complex variable relations
Multi-head self-attention enhances temporal representation
EdgeToNode converts hyperedge features into nodes
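The hypergraph mechanism behind the last two points, pooling node features into hyperedges and then mapping hyperedge features back to nodes in an EdgeToNode-style step, can be illustrated with simple mean pooling over an incidence structure. This is a minimal sketch under assumed semantics (the paper's module is learnable, not a fixed average):

```python
# Minimal sketch of hypergraph aggregation followed by an EdgeToNode-style
# mapping. Illustrative semantics only: a hyperedge mean-pools its member
# nodes' features, and each node then averages its incident hyperedges.

def nodes_to_edges(node_feats, hyperedges):
    """hyperedges: list of node-index lists. Mean-pool member nodes."""
    dim = len(node_feats[0])
    return [[sum(node_feats[m][d] for m in members) / len(members)
             for d in range(dim)]
            for members in hyperedges]

def edge_to_node(edge_feats, hyperedges, num_nodes):
    """EdgeToNode-style step: each node averages its incident hyperedges."""
    dim = len(edge_feats[0])
    out = []
    for v in range(num_nodes):
        incident = [e for e, members in enumerate(hyperedges) if v in members]
        out.append([sum(edge_feats[e][d] for e in incident) / len(incident)
                    for d in range(dim)])
    return out

# Three nodes with 2-D features, two overlapping hyperedges.
node_feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
hyperedges = [[0, 1], [1, 2]]
edge_feats = nodes_to_edges(node_feats, hyperedges)  # [[0.5, 0.5], [0.5, 1.0]]
new_nodes = edge_to_node(edge_feats, hyperedges, 3)
```

Because node 1 sits in both hyperedges, its updated feature blends information from both groups, which is how hyperedges propagate high-order cross-variable context back to individual variables.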
Xiao Wang
School of Computer Science and Technology, Anhui University, Hefei 230601, China
Hao Si
School of Computer Science and Technology, Anhui University, Hefei 230601, China
Fan Zhang
School of Computer Science and Technology, Anhui University, Hefei 230601, China
Xiaoya Zhou
School of Computer Science and Technology, Anhui University, Hefei 230601, China
Dengdi Sun
Anhui University
Machine Learning · Computer Vision
Wanli Lyu
School of Computer Science and Technology, Anhui University, Hefei 230601, China
Qingquan Yang
Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, China
Jin Tang
Anhui University
Computer Vision · Intelligent Video Analysis