Vector Copula Variational Inference and Dependent Block Posterior Approximations

📅 2025-03-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional variational inference (VI) suffers from low posterior approximation accuracy in large-scale Bayesian models due to restrictive independence assumptions. To address this, we propose a dependency-aware blockwise posterior approximation based on vector copulas, explicitly capturing complex inter-block dependencies among model parameters. The key innovation integrates learnable cyclically monotone transformations with vector copulas, yielding a flexible, scalable, and differentiable dependency-modeling framework that supports heterogeneous marginal distributions, arbitrary parameter-blocking schemes, and diverse dependency patterns. Extensive experiments across four classes of statistical models and sixteen challenging posterior-inference datasets demonstrate that the method significantly outperforms both independent-block and factorized-dependency baselines, achieving substantial gains in posterior estimation accuracy while incurring only a modest increase in computational overhead.

📝 Abstract
Variational inference (VI) is a popular method to estimate statistical and econometric models. The key to VI is the selection of a tractable density to approximate the Bayesian posterior. For large and complex models, a common choice is to assume independence between multivariate blocks in a partition of the parameter space. While this simplifies the problem, it can reduce accuracy. This paper proposes using vector copulas to capture dependence between the blocks parsimoniously. Tailored multivariate marginals are constructed using learnable cyclically monotone transformations. We call the resulting joint distribution a "dependent block posterior" approximation. Vector copula models are suggested that make tractable and flexible variational approximations. They allow for differing marginals, numbers of blocks, block sizes, and forms of between-block dependence. They also allow the variational optimization to be solved using fast and efficient stochastic gradient methods. The efficacy and versatility of the approach are demonstrated using four different statistical models and 16 datasets with posteriors that are challenging to approximate. In all cases, our method produces more accurate posterior approximations than benchmark VI methods that either assume block independence or factor-based dependence, at limited additional computational cost.
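The between-block dependence described in the abstract can be illustrated with a minimal numerical sketch: in a Gaussian vector copula, the latent Gaussian vectors of two parameter blocks are coupled through a cross-correlation matrix, and each block is then pushed through its own marginal map. All block sizes, correlation values, and the affine marginal maps below are illustrative assumptions; the paper's actual marginals use learnable cyclically monotone transformations, not these fixed affine maps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two parameter blocks (hypothetical sizes, for illustration only).
d1, d2 = 3, 2

# Between-block dependence is governed by a cross-correlation matrix
# linking the blocks' latent Gaussian vectors. Scaled so C stays
# positive definite (its eigenvalues are 1 +/- 0.5 here).
R12 = 0.5 * np.ones((d1, d2)) / np.sqrt(d1 * d2)
C = np.block([[np.eye(d1), R12], [R12.T, np.eye(d2)]])
L = np.linalg.cholesky(C)

def sample_dependent_blocks(n):
    """Draw joint samples: correlated Gaussian latents, then per-block
    marginal maps (simple affine maps here, standing in for the paper's
    learnable cyclically monotone transformations)."""
    z = rng.standard_normal((n, d1 + d2)) @ L.T
    theta1 = 1.0 + 0.5 * z[:, :d1]   # block-1 marginal (assumed)
    theta2 = -2.0 + 2.0 * z[:, d1:]  # block-2 marginal (assumed)
    return theta1, theta2

t1, t2 = sample_dependent_blocks(100_000)
# Nonzero empirical cross-correlation between blocks, which a
# block-independent VI family cannot represent; it should be close
# to the latent cross-correlation 0.5 / sqrt(6) ~= 0.204.
print(np.corrcoef(t1[:, 0], t2[:, 0])[0, 1])
```

A factorized (block-independent) approximation corresponds to forcing `R12` to zero, which is exactly the restriction the dependent block posterior removes.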
Problem

Research questions and friction points this paper is trying to address.

Improving variational inference accuracy with vector copulas
Capturing dependence between parameter blocks parsimoniously
Enabling flexible and efficient posterior approximations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Vector copulas model block dependencies parsimoniously.
Learnable cyclically monotone transformations tailor marginals.
Stochastic gradient methods optimize variational approximations efficiently.
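The "cyclically monotone" property behind the tailored marginals comes from convex analysis: a map is cyclically monotone exactly when it is the gradient of a convex function (Rockafellar's theorem), which is what makes such transformations well-behaved building blocks for multivariate marginals. A small sketch can check the defining inequality numerically; the quadratic convex function and all constants below are illustrative assumptions, not the paper's learned transformations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gradient of a convex f(x) = 0.5 x'Ax + b'x with A positive definite,
# i.e. T(x) = Ax + b, is a cyclically monotone map.
d = 3
M = rng.standard_normal((d, d))
A = M @ M.T + np.eye(d)  # positive definite, so f is convex
b = rng.standard_normal(d)

def T(x):
    return A @ x + b

def cycle_sum(points):
    """Cyclic monotonicity requires sum_i <T(x_i), x_{i+1} - x_i> <= 0
    for every closed cycle x_0, ..., x_n = x_0."""
    n = len(points)
    return sum(T(points[i]) @ (points[(i + 1) % n] - points[i])
               for i in range(n))

# Check the inequality on 100 random 5-point cycles.
sums = [cycle_sum(list(rng.standard_normal((5, d)))) for _ in range(100)]
print(all(s <= 1e-9 for s in sums))  # → True
```

The inequality follows from convexity: each term is bounded by f(x_{i+1}) - f(x_i), and these differences telescope to zero around a cycle.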