🤖 AI Summary
This work addresses the compression of unlabeled graphs when correlated graph side information is available only at the decoder, under weighted or unweighted Erdős–Rényi random graph models. To go beyond conventional methods that ignore structural equivalence between graphs, the authors first establish convergence results for the conditional distribution of a graph's structure, showing that in this model the structure retains most of the information about the labelled graph. They then propose an optimal decoding procedure based on graph alignment, enabling structure-aware reconstruction. By combining the entropy-spectrum framework with equivalence-class modeling, they precisely characterize the minimum achievable compression rate and construct an explicit encoder–decoder scheme attaining this fundamental limit, yielding tight rate bounds for graph compression with structured side information at the decoder.
📝 Abstract
In this paper, we study the problem of graph compression with side information at the decoder. The focus is on the setting where an unlabelled graph (also referred to as a structure) is to be compressed or is available as side information. For correlated Erdős–Rényi weighted random graphs, we give a precise characterization of the smallest rate at which a labelled graph, or its structure, can be compressed with the aid of a correlated labelled graph, or its structure, at the decoder. We approach this problem via the entropy-spectrum framework and establish convergence results for conditional distributions involving structures, which play a key role in the construction of an optimal encoding and decoding scheme. Our proof essentially uses the fact that, in the considered correlated Erdős–Rényi model, the structure retains most of the information about the labelled graph. Furthermore, we consider the case of unweighted graphs and show how optimal decoding can be performed using the notion of graph alignment.
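To make the correlated Erdős–Rényi model concrete, here is a minimal sketch of one common way to couple two graphs edge by edge: each potential edge is resolved independently, and the second graph either copies the first graph's decision or resamples it. The parameter names `p` (edge probability) and `s` (correlation strength) are illustrative assumptions and may differ from the paper's exact parameterization.

```python
import random

def correlated_erdos_renyi(n, p, s, seed=0):
    """Sample a pair of correlated Erdos-Renyi graphs on n labelled vertices.

    Each potential edge {i, j} appears in G1 with probability p.
    With probability s, G2 copies G1's decision for that edge;
    otherwise G2 resamples the edge independently with probability p.
    (Illustrative coupling, not necessarily the paper's exact model.)
    """
    rng = random.Random(seed)
    g1, g2 = set(), set()
    for i in range(n):
        for j in range(i + 1, n):
            e1 = rng.random() < p
            if rng.random() < s:
                e2 = e1  # edge decision copied: perfectly correlated
            else:
                e2 = rng.random() < p  # edge decision resampled independently
            if e1:
                g1.add((i, j))
            if e2:
                g2.add((i, j))
    return g1, g2
```

With `s = 1` the two edge sets coincide, and with `s = 0` they are independent Erdős–Rényi graphs with the same edge probability; a weighted variant would draw correlated edge weights instead of correlated indicators.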