🤖 AI Summary
This work asks how many bits of advice per node are needed to solve locally checkable labeling (LCL) problems with a distributed algorithm running in $f(\Delta)$ rounds, where $f$ depends only on the graph's maximum degree $\Delta$. Within this paradigm of local computation with advice, the authors show that on graphs of sub-exponential growth, *every* LCL problem can be solved with just $1$ bit of advice per node, and the set of nodes carrying advice can be made arbitrarily sparse. The growth assumption is necessary: under the Exponential-Time Hypothesis, some LCLs cannot be solved on general graphs with any constant number of advice bits per node. Building on a local technique for almost-balanced orientations, they further obtain a compact local encoding of an arbitrary edge subset using only $d/2 + 2$ bits at each node of degree $d$, decompressible in $f(\Delta)$ rounds. The same framework yields $1$-bit-advice solutions to classical problems, including almost-balanced orientation, $\Delta$-coloring, and $3$-coloring. Overall, the results suggest that for many problems the real threshold is not achieving $1$ bit of advice per node, but making the advice arbitrarily sparse.
📝 Abstract
Algorithms with advice have received ample attention in the distributed and online settings, and they have recently proven useful also in dynamic settings. In this work we study local computation with advice: the goal is to solve a graph problem $\Pi$ with a distributed algorithm in $f(\Delta)$ communication rounds, for some function $f$ that only depends on the maximum degree $\Delta$ of the graph, and the key question is how many bits of advice per node are needed. Our main results are:

- Any locally checkable labeling problem can be solved in graphs with sub-exponential growth with only $1$ bit of advice per node. Moreover, we can make the set of nodes that carry advice bits arbitrarily sparse, that is, we can make arbitrarily small the ratio between nodes carrying a $1$ and nodes carrying a $0$.
- The assumption of sub-exponential growth is necessary: assuming the Exponential-Time Hypothesis, there are LCLs that cannot be solved in general with any constant number of bits per node.
- In any graph we can find an almost-balanced orientation (indegrees and outdegrees differ by at most one) with $1$ bit of advice per node, and again we can make the advice arbitrarily sparse.
- As a corollary, we can also compress an arbitrary subset of edges so that a node of degree $d$ stores only $d/2 + 2$ bits, and we can decompress it locally, in $f(\Delta)$ rounds.
- In any graph of maximum degree $\Delta$, we can find a $\Delta$-coloring (if it exists) with $1$ bit of advice per node, and again, we can make the advice arbitrarily sparse.
- In any $3$-colorable graph, we can find a $3$-coloring with $1$ bit of advice per node. Here, it remains open whether we can make the advice arbitrarily sparse.

Our work shows that for many problems the key threshold is not whether we can achieve, say, $1$ bit of advice per node, but whether we can make the advice arbitrarily sparse.
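To make the almost-balanced orientation notion concrete, here is a hedged, purely centralized Python sketch (this is *not* the paper's distributed advice-based algorithm, just an illustration of the object it computes). It uses the classical Eulerian-tour trick: attach an auxiliary vertex to all odd-degree nodes so every degree becomes even, orient the edges along Euler circuits, then discard the auxiliary edges; each node's indegree and outdegree then differ by at most one. The function name and representation are ours.

```python
from collections import defaultdict

def almost_balanced_orientation(n, edges):
    """Orient each undirected edge so that every node's indegree and
    outdegree differ by at most one (centralized Eulerian-tour sketch).

    n: number of nodes, labeled 0..n-1; edges: list of (u, v) pairs.
    Returns a list of directed edges (tail, head)."""
    AUX = n  # auxiliary vertex absorbing odd-degree parity
    all_edges = list(edges)
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    # Join every odd-degree node to AUX, so all degrees become even.
    for v in range(n):
        if deg[v] % 2 == 1:
            all_edges.append((v, AUX))
    adj = defaultdict(list)  # node -> list of (neighbor, edge_id)
    for i, (u, v) in enumerate(all_edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    used = [False] * len(all_edges)
    ptr = defaultdict(int)   # per-node pointer into its adjacency list
    oriented = []
    for start in list(adj.keys()):
        # Hierholzer's algorithm: walk unused edges until stuck,
        # backtracking pops vertices in reverse circuit order.
        stack, path = [start], []
        while stack:
            v = stack[-1]
            advanced = False
            while ptr[v] < len(adj[v]):
                w, eid = adj[v][ptr[v]]
                ptr[v] += 1
                if not used[eid]:
                    used[eid] = True
                    stack.append(w)
                    advanced = True
                    break
            if not advanced:
                path.append(stack.pop())
        path.reverse()
        # Consecutive vertices on the circuit give edge directions;
        # drop the auxiliary edges (those touching AUX).
        for a, b in zip(path, path[1:]):
            if a != AUX and b != AUX:
                oriented.append((a, b))
    return oriented
```

Dropping an auxiliary edge removes exactly one in- or out-edge at each formerly odd-degree node, which is why the imbalance is at most one; even-degree nodes stay perfectly balanced. The paper achieves the same guarantee distributedly, in $f(\Delta)$ rounds, with $1$ bit of advice per node.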