🤖 AI Summary
This paper investigates the computational complexity of computing the fastest path (i.e., the feasible temporal path with minimum total traversal time) from source to sink in interval temporal graphs. It establishes a fundamental complexity gap between point-based and interval-based models: while the former admits near-linear-time solvability, the latter exhibits an Ω(n²) time lower bound under general conditions (n = number of vertices). Leveraging fine-grained complexity theory—including the Orthogonal Vectors Hypothesis—alongside temporal graph modeling and dynamic programming analysis, we derive a matching O(mT) upper bound algorithm (m = number of edges, T = time horizon). Moreover, we show that imposing zero-delay constraints restores near-linear tractability, yielding an O(m log m) algorithm. Our core contributions are (i) the first tight quadratic-time lower bound for the interval model, and (ii) the first case-optimal algorithmic framework, distinguishing between general and zero-delay settings.
📝 Abstract
Temporal graphs arise when modeling interactions that evolve over time. They usually come in several flavors, depending on the number of parameters used to describe the temporal aspects of the interactions: time of appearance, duration, delay of transmission. In the point model, edges appear at specific points in time, while in the more general interval model, edges can be present over multiple time intervals. In both models, the delay for traversing an edge can change with each edge appearance. When time is discrete, the two models are equivalent in the sense that the presence of an edge during an interval is equivalent to a sequence of point-in-time occurrences of the edge. However, this transformation can drastically change the size of the input and has complexity issues. Indeed, we show a gap between the two models with respect to the complexity of the classical problem of computing a fastest temporal path from a source vertex to a target vertex, i.e. a path where edges can be traversed one after another in time and such that the total duration from source to target is minimized. It can be solved in near-linear time in the point model, while we show that the interval model requires quadratic time under classical assumptions of fine-grained complexity. With respect to linear time, our lower bound implies a factor of the number of vertices, while the best known algorithm has a factor of the number of underlying edges. Interestingly, we show that near-linear time is possible in the interval model when restricted to all delays being zero, i.e. traversing an edge is instantaneous.