🤖 AI Summary
Existing theoretical characterizations of graph neural network (GNN) expressivity are restricted to discrete-time, connected graph snapshots, failing to model real-world asynchronous, intermittently disconnected dynamic systems—e.g., communication or financial transaction networks.
Method: We generalize the 1-WL test to attributed continuous-time dynamic graphs (CT-DGs) and establish its equivalence to continuous-time dynamic unfolding trees. Building on this, we identify a class of continuous-time dynamic GNNs (CGNNs) derived from discrete-dynamic GNN architectures: they employ piecewise continuously differentiable temporal functions and an event-driven mechanism to uniformly handle disconnected, asynchronously evolving graphs.
Contribution/Results: We prove that CGNNs match the distinguishing power of the continuous-time dynamic 1-WL test and thereby retain universal approximation guarantees in probability. The constructive proofs further yield practical design guidelines, culminating in a compact yet expressive CGNN architecture. This work provides a theoretically sound and practically viable expressivity framework for learning on continuous-time dynamic graphs.
📝 Abstract
Graph Neural Networks (GNNs) are known to match the distinguishing power of the 1-Weisfeiler-Lehman (1-WL) test, and the resulting partitions coincide with the unfolding tree equivalence classes of graphs. Preserving this equivalence, GNNs can universally approximate any target function on graphs in probability up to any precision. However, these results are limited to attributed discrete-dynamic graphs represented as sequences of connected graph snapshots. Real-world systems, such as communication networks, financial transaction networks, and molecular interactions, evolve asynchronously and may split into disconnected components. In this paper, we extend the theory of attributed discrete-dynamic graphs to attributed continuous-time dynamic graphs with arbitrary connectivity. To this end, we introduce a continuous-time dynamic 1-WL test, prove its equivalence to continuous-time dynamic unfolding trees, and identify a class of continuous-time dynamic GNNs (CGNNs) based on discrete-dynamic GNN architectures that retain both distinguishing power and universal approximation guarantees. Our constructive proofs further yield practical design guidelines, emphasizing a compact and expressive CGNN architecture with piecewise continuously differentiable temporal functions to process asynchronous, disconnected graphs.
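To give intuition for how a 1-WL-style test can incorporate event times, here is a minimal, hypothetical color-refinement sketch on a stream of timestamped edges. The event format `(u, v, t)`, the signature hashing, and the fixed round count are illustrative simplifications, not the paper's exact construction:

```python
from collections import defaultdict

def temporal_wl_refine(events, num_rounds=3):
    """Illustrative 1-WL-style refinement on timestamped edges (u, v, t).

    Hypothetical sketch: each round, a node's color is recomputed from its
    current color together with the sorted multiset of (neighbor color,
    event time) pairs. Disconnected components need no special casing --
    an isolated node simply refines over an empty neighbor multiset.
    """
    nbrs = defaultdict(list)
    nodes = set()
    for u, v, t in events:
        nbrs[u].append((v, t))
        nbrs[v].append((u, t))
        nodes.update((u, v))

    colors = {n: 0 for n in nodes}  # uniform initial coloring
    for _ in range(num_rounds):
        # Signature = (own color, sorted multiset of timestamped neighbor colors).
        signatures = {
            n: (colors[n], tuple(sorted((colors[m], t) for m, t in nbrs[n])))
            for n in nodes
        }
        # Compress distinct signatures into small integer color ids.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {n: palette[signatures[n]] for n in nodes}
    return colors
```

On a path 0-1-2, static 1-WL cannot separate the endpoints, but with distinct edge times (say t=1.0 and t=2.0) the temporal refinement assigns nodes 0 and 2 different colors, illustrating the extra distinguishing power that timestamps provide.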