🤖 AI Summary
This paper addresses parameter inference for directed networks under edge-level local differential privacy (LDP). We propose the first edge-flip mechanism for private directed graph release and introduce a moment-based estimator that leverages the noisy bi-degree sequence to achieve consistent and asymptotically normal estimation of the unknown parameters in the $p_0$ exponential-family model. Theoretically, we establish consistency and asymptotic normality of the estimator under LDP constraints. Numerical experiments demonstrate that our method significantly outperforms classical output perturbation in estimation accuracy, and an empirical analysis of the UC Irvine message network further validates its practical utility. Our key contributions are: (1) the first edge-level LDP mechanism tailored to directed graphs; and (2) a low-variance moment estimator designed specifically for the bi-degree structure, achieving a favorable trade-off between privacy protection and statistical efficiency.
📝 Abstract
We explore the edge-flipping mechanism, a type of input perturbation, to release a directed graph under edge-level local differential privacy. Using the noisy bi-degree sequence of the output graph, we construct moment equations to estimate the unknown parameters in the $p_0$ model, an exponential-family distribution with the bi-degree sequence as its natural sufficient statistic. We show that the resulting private estimator is consistent and asymptotically normal under certain conditions. In addition, we compare input and output perturbation mechanisms for releasing bi-degree sequences in terms of parameter estimation accuracy and privacy protection. Numerical studies support our theoretical findings and compare the private estimates obtained from the different perturbation methods. We apply the proposed method to analyze the UC Irvine message network.
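To make the two ingredients of the abstract concrete, the following is a minimal sketch (not the paper's implementation) of an edge-flip mechanism and the moment-based debiasing of the bi-degree sequence. It assumes the standard randomized-response flip probability $q = 1/(1+e^{\varepsilon})$ for $\varepsilon$-edge LDP and the linear relation $\mathbb{E}[\tilde d] = (1-2q)\,d + q\,(n-1)$ between a noisy degree $\tilde d$ and the true degree $d$; function names and the NumPy-based interface are illustrative choices, not from the paper.

```python
import numpy as np

def edge_flip(A, eps, seed=None):
    """Randomized response on each directed edge indicator (edge-level LDP).

    Each off-diagonal entry A[i, j] is flipped independently with
    probability q = 1 / (1 + exp(eps)), the usual choice satisfying
    eps-edge local differential privacy. Self-loops are left untouched.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    q = 1.0 / (1.0 + np.exp(eps))
    flips = rng.random((n, n)) < q
    np.fill_diagonal(flips, False)  # keep the (empty) diagonal fixed
    return np.where(flips, 1 - A, A)

def debiased_bidegrees(A_noisy, eps):
    """Moment estimates of the (out-degree, in-degree) sequences.

    Inverting E[noisy degree] = (1 - 2q) * d + q * (n - 1) gives an
    unbiased linear correction of the noisy degrees.
    """
    n = A_noisy.shape[0]
    q = 1.0 / (1.0 + np.exp(eps))
    out_noisy = A_noisy.sum(axis=1)
    in_noisy = A_noisy.sum(axis=0)
    correct = lambda d: (d - q * (n - 1)) / (1.0 - 2.0 * q)
    return correct(out_noisy), correct(in_noisy)
```

These debiased bi-degrees would then be plugged into the $p_0$ moment equations in place of the true sufficient statistics; as $\varepsilon \to \infty$, $q \to 0$ and the correction reduces to the identity.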