Dense associative memory on the Bures-Wasserstein space

📅 2025-09-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing dense associative memories (DAMs) operate exclusively on vector representations and lack the capacity to model the uncertainty inherent in probabilistic data. Method: This work introduces the first DAM framework operating in the space of probability distributions, specifically the Gaussian family endowed with the Bures–Wasserstein metric. We define a log-sum-exp energy based on the 2-Wasserstein distance, whose stationary points are self-consistent Wasserstein barycenters, and perform retrieval by aggregating optimal transport maps under Gibbs weights. Contribution/Results: We prove that the proposed memory achieves exponential storage capacity and provide quantitative retrieval guarantees under Wasserstein perturbations. Experiments demonstrate high-accuracy distribution retrieval on both synthetic and real-world distributional tasks, as well as robustness to perturbed queries. By unifying associative memory with generative modeling principles, this work establishes a novel paradigm bridging distributional representation learning and memory-based probabilistic reasoning.
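Two standard facts make the summary concrete. On Gaussians the 2-Wasserstein distance has a closed form (this is what makes the Bures–Wasserstein family tractable), and a log-sum-exp energy over stored distributions ν₁,…,ν_N generically takes the shape below (the generic form implied by the summary; the paper's exact constants may differ):

```latex
\[
W_2^2\bigl(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\bigr)
  = \lVert m_1 - m_2\rVert^2
  + \operatorname{tr}\!\Bigl(\Sigma_1 + \Sigma_2
      - 2\bigl(\Sigma_1^{1/2}\Sigma_2\,\Sigma_1^{1/2}\bigr)^{1/2}\Bigr),
\]
\[
E(\mu) \;=\; -\tfrac{1}{\beta}\,\log\sum_{i=1}^{N}
  \exp\bigl(-\beta\,W_2^2(\mu,\nu_i)\bigr).
\]
```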

📝 Abstract
Dense associative memories (DAMs) store and retrieve patterns via energy-functional fixed points, but existing models are limited to vector representations. We extend DAMs to probability distributions equipped with the 2-Wasserstein distance, focusing mainly on the Bures-Wasserstein class of Gaussian densities. Our framework defines a log-sum-exp energy over stored distributions and a retrieval dynamics aggregating optimal transport maps in a Gibbs-weighted manner. Stationary points correspond to self-consistent Wasserstein barycenters, generalizing classical DAM fixed points. We prove exponential storage capacity, provide quantitative retrieval guarantees under Wasserstein perturbations, and validate the model on synthetic and real-world distributional tasks. This work elevates associative memory from vectors to full distributions, bridging classical DAMs with modern generative modeling and enabling distributional storage and retrieval in memory-augmented learning.
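As a concrete illustration of the retrieval dynamics sketched in the abstract, here is a minimal one-dimensional sketch, not the paper's implementation: the inverse temperature `beta`, the step count, and the softmax-weighted barycenter update are assumptions inferred from the log-sum-exp energy. In 1-D, W₂ between N(m₁, s₁²) and N(m₂, s₂²) is ((m₁−m₂)² + (s₁−s₂)²)^½, and the Gibbs-weighted Wasserstein barycenter of Gaussians N(m_i, s_i²) is again Gaussian with mean Σ w_i m_i and std Σ w_i s_i.

```python
import numpy as np

def w2_gauss1d(m1, s1, m2, s2):
    """2-Wasserstein distance between 1-D Gaussians N(m1, s1^2), N(m2, s2^2)."""
    return np.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

def retrieve(query, patterns, beta=5.0, steps=20):
    """DAM-style retrieval: iterate Gibbs-weighted Wasserstein barycenters.

    patterns: list of (mean, std) stored Gaussians; query: (mean, std).
    beta and the fixed-step iteration are illustrative assumptions, not
    the paper's exact scheme.
    """
    m, s = query
    means = np.array([p[0] for p in patterns])
    stds = np.array([p[1] for p in patterns])
    for _ in range(steps):
        d2 = w2_gauss1d(m, s, means, stds) ** 2   # squared W2 to each pattern
        z = -beta * d2
        w = np.exp(z - z.max())                    # numerically stable softmax
        w /= w.sum()
        # Barycenter update: weighted means and weighted stds (exact in 1-D).
        m, s = float(w @ means), float(w @ stds)
    return m, s
```

With a query close to one stored pattern, the Gibbs weights concentrate on it and the iteration snaps to that distribution, mirroring classical DAM pattern completion.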
Problem

Research questions and friction points this paper is trying to address.

Existing DAMs store only vector patterns and cannot represent the uncertainty inherent in probabilistic data
How to define an energy function and retrieval dynamics directly over probability distributions
Whether exponential storage capacity and retrieval guarantees carry over from vectors to distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces the first DAM framework defined on probability distributions via the 2-Wasserstein distance
Defines a log-sum-exp energy and retrieval dynamics that aggregate Gibbs-weighted optimal transport maps
Proves exponential storage capacity with quantitative retrieval guarantees under Wasserstein perturbations
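For reference, the optimal transport map between Gaussians used in the retrieval dynamics has a closed form (a standard fact; the Gibbs-weight expression below is the generic shape implied by a log-sum-exp energy, not a quotation from the paper). For the current iterate μ = N(m, Σ) and stored patterns ν_i = N(m_i, Σ_i):

```latex
\[
T_i(x) = m_i + A_i\,(x - m),
\qquad
A_i = \Sigma^{-1/2}\bigl(\Sigma^{1/2}\Sigma_i\,\Sigma^{1/2}\bigr)^{1/2}\Sigma^{-1/2},
\]
\[
x \;\mapsto\; \sum_i w_i\,T_i(x),
\qquad
w_i = \frac{\exp\bigl(-\beta\,W_2^2(\mu,\nu_i)\bigr)}
           {\sum_j \exp\bigl(-\beta\,W_2^2(\mu,\nu_j)\bigr)}.
\]
```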