Meta-probabilistic Modeling

📅 2026-01-08
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes Meta Probabilistic Modeling (MPM), a novel framework that integrates meta-learning into probabilistic graphical models to address the limitations of manually designed structures, which struggle to generalize across multiple tasks or datasets. MPM employs a hierarchical architecture to automatically discover a global structural prior shared across datasets while adapting local parameters to new data. The approach leverages a VAE-inspired surrogate objective and a bilevel optimization strategy: local variables are updated analytically via coordinate ascent, while global structural parameters are optimized through gradient-based methods. Experiments demonstrate that MPM successfully recovers interpretable latent structures in both object-centric image and sequential text modeling, effectively adapting to diverse generative requirements across datasets.

📝 Abstract
While probabilistic graphical models can discover latent structure in data, their effectiveness hinges on choosing well-specified models. Identifying such models is challenging in practice, often requiring iterative checking and revision through trial and error. To this end, we propose meta-probabilistic modeling (MPM), a meta-learning algorithm that learns generative model structure directly from multiple related datasets. MPM uses a hierarchical architecture where global model specifications are shared across datasets while local parameters remain dataset-specific. For learning and inference, we propose a tractable VAE-inspired surrogate objective, and optimize it through bi-level optimization: local variables are updated analytically via coordinate ascent, while global parameters are trained with gradient-based methods. We evaluate MPM on object-centric image modeling and sequential text modeling, demonstrating that it adapts generative models to data while recovering meaningful latent representations.
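The bi-level scheme described in the abstract can be illustrated on a deliberately tiny toy model. This is only a sketch under assumed simplifications, not the paper's method: here the "global structural parameter" is just a shared prior mean over Gaussian datasets, so the local coordinate-ascent update has a closed form and the outer update is a plain gradient step.

```python
import numpy as np

def mpm_sketch(datasets, n_outer=200, lr=0.05):
    """Toy bi-level loop in the spirit of MPM (illustrative only).

    Assumed toy model (not the paper's image/text models):
      global prior:  phi_d ~ N(theta, 1)   # theta shared across datasets
      likelihood:    x    ~ N(phi_d, 1)    # phi_d local to dataset d
    """
    theta = 0.0  # global parameter shared across all datasets
    for _ in range(n_outer):
        # Inner step: analytic coordinate-ascent update of each local
        # parameter -- the Gaussian posterior mean given current theta.
        phis = [(theta + x.sum()) / (1 + len(x)) for x in datasets]
        # Outer step: gradient ascent on the theta-dependent term of the
        # surrogate objective, sum_d log N(phi_d; theta, 1).
        grad = sum(phi - theta for phi in phis)
        theta += lr * grad / len(datasets)
    return theta, phis

rng = np.random.default_rng(0)
# Three related datasets whose local means scatter around a shared value.
data = [rng.normal(mu, 1.0, size=50) for mu in (1.8, 2.0, 2.2)]
theta, phis = mpm_sketch(data)
```

The structure mirrors the abstract's description: local variables get exact coordinate updates inside the loop, while the shared global parameter is moved by gradients of the objective accumulated across datasets.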
Problem

Research questions and friction points this paper is trying to address.

probabilistic modeling
model specification
latent structure
meta-learning
generative models
Innovation

Methods, ideas, or system contributions that make the work stand out.

meta-probabilistic modeling
hierarchical generative models
bi-level optimization
latent structure learning
VAE-inspired surrogate
Kevin Zhang
Department of Electrical Engineering and Computer Science, MIT, Cambridge, USA
Yixin Wang
University of Michigan
Bayesian statistics, Machine Learning