Quantum latent distributions in deep generative models

📅 2025-08-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates whether, and under what conditions, latent distributions generated by quantum processors can enhance generative model performance, and rigorously assesses the reproducibility of any gains. Methodologically, we combine real photonic quantum hardware with classical simulation to supply the latent distributions of GANs, diffusion models, and flow-matching models, evaluating performance systematically on a synthetic quantum dataset and the QM9 molecular dataset. Our key contributions are threefold: (i) we introduce the first operational criterion for identifying when quantum latent distributions enable data distributions that classical latent distributions cannot efficiently produce; (ii) we empirically demonstrate that, on specific structured tasks, quantum latent distributions significantly outperform multiple classical baselines, including standard Gaussian priors and VAE encoder outputs; and (iii) we establish that near-term quantum hardware can concretely augment generative modeling capabilities. These findings provide both theoretical grounding and a practical framework for generative quantum machine learning.

📝 Abstract
Many successful families of generative models leverage a low-dimensional latent distribution that is mapped to a data distribution. Though simple latent distributions are commonly used, it has been shown that more sophisticated distributions can improve performance. For instance, recent work has explored using the distributions produced by quantum processors and found empirical improvements. However, when latent space distributions produced by quantum processors can be expected to improve performance, and whether these improvements are reproducible, are open questions that we investigate in this work. We prove that, under certain conditions, these "quantum latent distributions" enable generative models to produce data distributions that classical latent distributions cannot efficiently produce. We also provide actionable intuitions to identify when such quantum advantages may arise in real-world settings. We perform benchmarking experiments on both a synthetic quantum dataset and the QM9 molecular dataset, using both simulated and real photonic quantum processors. Our results demonstrate that quantum latent distributions can lead to improved generative performance in GANs compared to a range of classical baselines. We also explore diffusion and flow matching models, identifying architectures compatible with quantum latent distributions. This work confirms that near-term quantum processors can expand the capabilities of deep generative models.
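The abstract's core mechanism is a generative model that maps samples from a low-dimensional latent distribution to the data distribution, with the latent prior treated as a swappable component. A minimal illustrative sketch of that idea, using a toy linear generator: all names here are hypothetical, and the structured prior below is a classical bimodal mixture standing in for what would, in the paper's setting, be samples drawn from a photonic quantum processor.

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(z, W):
    # Toy "generator": a fixed nonlinear map from latent space to data space.
    # A real GAN generator would be a trained neural network.
    return np.tanh(z @ W)

latent_dim, data_dim, n = 4, 8, 1000
W = rng.normal(size=(latent_dim, data_dim))

# Standard choice: a simple Gaussian latent prior.
z_gauss = rng.normal(size=(n, latent_dim))

# Stand-in for a structured latent prior: a bimodal mixture. In the paper's
# setting, these samples would instead come from quantum hardware.
modes = rng.choice([-2.0, 2.0], size=(n, latent_dim))
z_struct = rng.normal(loc=modes, scale=0.5)

# Same generator, different latent distribution, different data distribution.
x_gauss = generator(z_gauss, W)
x_struct = generator(z_struct, W)
```

The point of the sketch is only that the generator architecture is unchanged while the prior is replaced; the paper's question is when such a replacement, with a genuinely quantum-sampled prior, yields distributions a classical prior cannot efficiently reach.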
Problem

Research questions and friction points this paper is trying to address.

Investigating when quantum latent distributions improve generative model performance
Determining reproducibility of quantum advantages in generative modeling
Identifying architectures compatible with quantum latent distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quantum latent distributions enhance generative models
Quantum advantages proven under specific conditions
Real-world quantum processors improve GAN performance
Omar Bacarreza
Senior Machine Learning Scientist at ORCA Computing
Thorin Farnsworth
ORCA Computing
Alexander Makarovskiy
ORCA Computing
Hugo Wallner
ORCA Computing
Tessa Hicks
ORCA Computing
Santiago Sempere-Llagostera
ORCA Computing
John Price
ORCA Computing
Robert J. A. Francis-Jones
ORCA Computing
William R. Clements
ORCA Computing
Machine Learning · Quantum Optics · Photonics