🤖 AI Summary
This study investigates when latent distributions generated by quantum processors can be expected to improve generative model performance, and whether those improvements are reproducible. Methodologically, we use both simulated and real photonic quantum processors to supply latent distributions for GANs, diffusion models, and flow-matching models, and we benchmark performance systematically on a synthetic quantum dataset and the QM9 molecular dataset. Our key contributions are threefold: (i) we prove that, under certain conditions, quantum latent distributions enable generative models to produce data distributions that classical latent distributions cannot efficiently produce, and we provide actionable intuitions for identifying when such advantages may arise in practice; (ii) we empirically demonstrate that, on structured tasks, quantum latent distributions improve GAN performance over a range of classical baselines, including standard Gaussian priors and VAE encoder outputs; and (iii) we identify diffusion and flow-matching architectures that are compatible with quantum latent distributions. These findings provide both theoretical grounding and a practical framework for generative quantum machine learning on near-term hardware.
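To make the core idea concrete, the sketch below (not from the paper) shows how a GAN generator can be fed latent vectors drawn from a pre-collected bank of quantum-processor measurement samples instead of the usual standard-normal prior. The class name QuantumLatentPrior and the random placeholder bank are hypothetical; in practice the bank would hold samples measured on a photonic processor or its simulator.

```python
# Hypothetical sketch: a GAN generator driven by a bank of quantum-processor
# latent samples rather than a standard Gaussian prior. Not the authors' code.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, latent_dim: int, data_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, data_dim),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

class QuantumLatentPrior:
    """Serves minibatches from a fixed bank of latent samples, e.g. measurement
    outcomes collected from a photonic quantum processor."""

    def __init__(self, samples: torch.Tensor):
        self.samples = samples  # shape: (num_samples, latent_dim)

    def sample(self, batch_size: int) -> torch.Tensor:
        idx = torch.randint(0, self.samples.shape[0], (batch_size,))
        return self.samples[idx]

# Usage: replace the usual torch.randn(batch, latent_dim) call with the quantum
# prior; the rest of the GAN training loop is unchanged.
latent_dim, data_dim, batch = 8, 16, 32
quantum_bank = torch.rand(10_000, latent_dim)  # placeholder for hardware samples
prior = QuantumLatentPrior(quantum_bank)
gen = Generator(latent_dim, data_dim)
fake = gen(prior.sample(batch))  # (32, 16) batch of generated data
```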
📝 Abstract
Many successful families of generative models leverage a low-dimensional latent distribution that is mapped to a data distribution. Though simple latent distributions are commonly used, it has been shown that more sophisticated distributions can improve performance. For instance, recent work has explored using the distributions produced by quantum processors and found empirical improvements. However, it remains an open question when latent-space distributions produced by quantum processors can be expected to improve performance, and whether such improvements are reproducible; we investigate both questions in this work. We prove that, under certain conditions, these "quantum latent distributions" enable generative models to produce data distributions that classical latent distributions cannot efficiently produce. We also provide actionable intuitions to identify when such quantum advantages may arise in real-world settings. We perform benchmarking experiments on both a synthetic quantum dataset and the QM9 molecular dataset, using both simulated and real photonic quantum processors. Our results demonstrate that quantum latent distributions can lead to improved generative performance in GANs compared to a range of classical baselines. We also explore diffusion and flow matching models, identifying architectures compatible with quantum latent distributions. This work confirms that near-term quantum processors can expand the capabilities of deep generative models.
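One reason flow matching pairs naturally with quantum latent distributions is that its training objective only requires the ability to sample the source distribution, not to evaluate its density. The sketch below illustrates a single conditional flow-matching training step with a linear interpolation path; the quantum_source function is a stand-in for quantum-processor samples, and none of this reflects the authors' actual implementation.

```python
# Hypothetical sketch: one conditional flow-matching step whose source
# distribution x0 comes from a quantum latent sampler instead of a Gaussian.
import torch
import torch.nn as nn

dim = 16
velocity_net = nn.Sequential(      # v_theta(x_t, t)
    nn.Linear(dim + 1, 128), nn.SiLU(),
    nn.Linear(128, dim),
)
opt = torch.optim.Adam(velocity_net.parameters(), lr=1e-3)

def quantum_source(batch: int) -> torch.Tensor:
    # Placeholder: in practice, latent samples measured on a photonic
    # processor (or a simulator of one).
    return torch.rand(batch, dim)

def flow_matching_step(x1: torch.Tensor) -> torch.Tensor:
    """Minimise || v_theta(x_t, t) - (x1 - x0) ||^2 along the linear path
    x_t = (1 - t) * x0 + t * x1, with x0 drawn from the quantum prior."""
    x0 = quantum_source(x1.shape[0])          # quantum latent source samples
    t = torch.rand(x1.shape[0], 1)            # uniform time in [0, 1]
    xt = (1 - t) * x0 + t * x1                # point on the probability path
    target = x1 - x0                          # conditional velocity field
    pred = velocity_net(torch.cat([xt, t], dim=-1))
    loss = ((pred - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss

data_batch = torch.randn(32, dim)             # stand-in for real training data
print(float(flow_matching_step(data_batch)))
```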