Robust Estimation for Dependent Binary Network Data

📅 2025-10-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the poor robustness of conventional pseudo-likelihood estimation for binary network data modeled by Markov random fields (MRFs) under observation noise or contamination. We propose a robust pseudo-likelihood estimator based on the density power divergence (DPD), the first systematic integration of the DPD framework into parameter learning for dependent binary networks. Theoretically, we establish that the proposed estimator exhibits reduced asymptotic bias sensitivity and enhanced local robustness. It estimates interaction strength parameters by optimizing a DPD-based pseudo-likelihood objective function. Extensive experiments on synthetic data and real-world networks—including social, neural, and genomic networks—demonstrate that our method significantly outperforms standard maximum pseudo-likelihood (MPL) across diverse contamination scenarios: it achieves more accurate structural recovery and superior generalization performance. This work establishes a novel paradigm for network inference in high-noise settings.

Technology Category

Application Category

📝 Abstract
We consider the problem of learning the interaction strength between the nodes of a network based on dependent binary observations residing on these nodes, generated from a Markov Random Field (MRF). Since these observations can possibly be corrupted/noisy in larger networks in practice, it is important to robustly estimate the parameters of the underlying true MRF to account for such inherent contamination in observed data. However, it is well-known that classical likelihood and pseudolikelihood based approaches are highly sensitive to even a small amount of data contamination. So, in this paper, we propose a density power divergence (DPD) based robust generalization of the computationally efficient maximum pseudolikelihood (MPL) estimator of the interaction strength parameter, and derive its rate of consistency under the pure model. Moreover, we show that the gross error sensitivities of the proposed DPD based estimators are significantly smaller than that of the MPL estimator, thereby theoretically justifying the greater (local) robustness of the former under contaminated settings. We also demonstrate the superior (finite sample) performance of the DPD-based variants over the traditional MPL estimator in a number of synthetically generated contaminated network datasets. Finally, we apply our proposed DPD based estimators to learn the network interaction strength in several real datasets from diverse domains of social science, neurobiology and genomics.
Problem

Research questions and friction points this paper is trying to address.

Robustly estimating interaction strength in noisy binary network data
Addressing sensitivity of classical estimators to data contamination
Proposing density power divergence for robust parameter estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses density power divergence for robust estimation
Generalizes maximum pseudolikelihood estimator approach
Reduces gross error sensitivity in contaminated data
🔎 Similar Papers
No similar papers found.