Estimating Network Models using Neural Networks

📅 2025-02-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
ERGMs suffer from intractable normalizing constants, rendering conventional MCMC-based maximum likelihood estimation computationally expensive and inherently sequential. To address this, we propose the first end-to-end, deep invertible neural network framework that directly learns a bijective mapping between model parameters and network sufficient statistics—bypassing MCMC sampling entirely. Our method employs supervised learning to jointly train both forward (parameter → statistic) and inverse (statistic → parameter) mappings, accommodating arbitrary differentiable statistics and improving robustness to model misspecification. Evaluated across diverse network datasets, it achieves estimation accuracy comparable to MCMC-MLE while reducing inference latency by one to two orders of magnitude, enabling real-time and batch inference. The core contribution is the first application of invertible neural networks to ERGM estimation, establishing a sampling-free, scalable, and interpretable paradigm. (149 words)

Technology Category

Application Category

📝 Abstract
Exponential random graph models (ERGMs) are very flexible for modeling network formation but pose difficult estimation challenges due to their intractable normalizing constant. Existing methods, such as MCMC-MLE, rely on sequential simulation at every optimization step. We propose a neural network approach that trains on a single, large set of parameter-simulation pairs to learn the mapping from parameters to average network statistics. Once trained, this map can be inverted, yielding a fast and parallelizable estimation method. The procedure also accommodates extra network statistics to mitigate model misspecification. Some simple illustrative examples show that the method performs well in practice.
Problem

Research questions and friction points this paper is trying to address.

Estimating network formation models efficiently
Overcoming intractable normalizing constants in ERGMs
Using neural networks for fast, parallelizable estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural network trains parameter-simulation pairs
Inverts map for fast parallelizable estimation
Accommodates extra network statistics
🔎 Similar Papers
No similar papers found.