🤖 AI Summary
This paper addresses the problem of generating high-fidelity synthetic graphs under differential privacy. We propose the first ε-differentially private graph generation framework that employs the fused Gromov–Wasserstein (FGW) distance as its utility metric, jointly modeling graph structure and node attributes while providing vertex-level privacy guarantees for unified structural and attribute synthesis. Our method builds on a stochastic block model and incorporates a PSMM-inspired heuristic optimization, ensuring strict ε-differential privacy and establishing a theoretical convergence bound on the FGW error. Experiments on multiple real-world networks show that our synthetic graphs achieve 32–58% lower FGW distance than state-of-the-art baselines, significantly improving structural fidelity. Moreover, the derived theoretical error bound tightens as graph size grows, making this the first approach to simultaneously offer privacy preservation, practical utility, and provable guarantees.
📝 Abstract
Networks are popular for representing complex data. In particular, differentially private synthetic networks are much in demand for method and algorithm development. A network generator should be easy to implement and should come with theoretical guarantees. Here we start with complex data as input and jointly provide a network representation as well as a synthetic network generator. Using a random connection model, we devise an effective algorithmic approach for generating attributed synthetic graphs that is $\epsilon$-differentially private at the vertex level, while preserving utility under an appropriate notion of distance which we develop. We provide theoretical guarantees for the accuracy of the private synthetic graphs using the fused Gromov–Wasserstein distance, which extends the Wasserstein metric to structured data. Our method draws inspiration from the PSMM method of \citet{he2023}.
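To make the utility metric concrete: the fused Gromov–Wasserstein distance combines an attribute (Wasserstein) term with a structural (Gromov–Wasserstein) term, traded off by a parameter $\alpha$. The sketch below, which is an illustration rather than the paper's implementation, evaluates the FGW objective for a *fixed* coupling between two small attributed graphs; the FGW distance itself is the minimum of this objective over all admissible couplings, and the function and variable names here are hypothetical.

```python
import itertools

def fgw_cost(coupling, attr_dist, C1, C2, alpha=0.5):
    """Evaluate the fused Gromov-Wasserstein objective for a fixed
    coupling pi between two attributed graphs (illustrative sketch).

    coupling[i][j]  : mass transported from node i (graph 1) to node j (graph 2)
    attr_dist[i][j] : distance between the attributes of node i and node j
    C1, C2          : intra-graph structure matrices (e.g. shortest-path lengths)
    alpha           : trade-off between the attribute and structure terms
    """
    n, m = len(C1), len(C2)
    # Attribute (Wasserstein) term: sum_ij d(a_i, b_j) * pi_ij
    w_term = sum(attr_dist[i][j] * coupling[i][j]
                 for i in range(n) for j in range(m))
    # Structure (Gromov-Wasserstein) term:
    # sum_{i,k,j,l} |C1_ik - C2_jl|^2 * pi_ij * pi_kl
    gw_term = sum((C1[i][k] - C2[j][l]) ** 2 * coupling[i][j] * coupling[k][l]
                  for i, k in itertools.product(range(n), repeat=2)
                  for j, l in itertools.product(range(m), repeat=2))
    return (1 - alpha) * w_term + alpha * gw_term

# Two identical 2-node graphs: the identity coupling has zero FGW cost.
C = [[0, 1], [1, 0]]
d_attr = [[0, 1], [1, 0]]
pi_id = [[0.5, 0.0], [0.0, 0.5]]
print(fgw_cost(pi_id, d_attr, C, C))  # 0.0 for a perfect matching
```

The full FGW distance minimizes this objective over couplings whose marginals match the node weight distributions of the two graphs, which is the optimization the paper's generator targets.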