Meta-learning of Gibbs states for many-body Hamiltonians with applications to Quantum Boltzmann Machines

📅 2025-07-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the low efficiency and poor generalizability of preparing quantum Gibbs states for parameterized many-body Hamiltonians on NISQ devices, this work pioneers the integration of meta-learning into quantum thermal state preparation, proposing two algorithms: Meta-VQT and NN-Meta-VQT. The approach combines variational quantum imaginary-time evolution (VarQITE), a meta-variational eigensolver framework, and problem-driven ansatz design, supporting both fully quantum and quantum-classical hybrid architectures; it further leverages collective optimization over training sets and neural-network-assisted initialization. Validated on systems of up to 8 qubits, the algorithms demonstrate transferability across temperatures and model parameters, enabling high-fidelity Gibbs state preparation for unseen Hamiltonian parameters. In Quantum Boltzmann Machine training, convergence accelerates by up to 30× over standard VarQITE, with significantly improved initialization quality. The framework exhibits strong scalability and robust generalization across diverse physical models.
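As a classical point of reference for what these algorithms target, the sketch below constructs the exact Gibbs state ρ = e^{-H/T}/Z for a small transverse-field Ising Hamiltonian via diagonalization. This is purely illustrative (the coupling g = 0.5, temperature T = 1.0, and 2-qubit size are assumptions for the example), not the paper's quantum procedure.

```python
import numpy as np

# Classical reference sketch (not the paper's quantum algorithm): the
# target Gibbs state rho = exp(-H/T)/Z that Meta-VQT is trained to
# prepare, for an illustrative 2-qubit transverse-field Ising Hamiltonian
# H = -Z(x)Z - g*(X(x)I + I(x)X). The values g = 0.5, T = 1.0 are assumed.
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def tfim_hamiltonian(g):
    return -np.kron(Z, Z) - g * (np.kron(X, I2) + np.kron(I2, X))

def gibbs_state(H, T):
    # Diagonalize H, exponentiate eigenvalues: rho = V diag(w) V^dagger
    evals, V = np.linalg.eigh(H)
    w = np.exp(-evals / T)
    w /= w.sum()                     # normalize by the partition function
    return (V * w) @ V.conj().T

rho = gibbs_state(tfim_hamiltonian(g=0.5), T=1.0)
print(round(np.trace(rho).real, 6))  # 1.0 (a valid density matrix)
```

At finite temperature the state is mixed (purity Tr ρ² < 1), which is exactly why preparing it on a quantum device requires ancilla qubits or a purification-style variational ansatz rather than a plain state-preparation circuit.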

📝 Abstract
The preparation of quantum Gibbs states is a fundamental challenge in quantum computing, essential for applications ranging from modeling open quantum systems to quantum machine learning. Building on the Meta-Variational Quantum Eigensolver framework proposed by Cervera-Lierta et al. (2021) and a problem-driven ansatz design, we introduce two meta-learning algorithms: the Meta-Variational Quantum Thermalizer (Meta-VQT) and the Neural Network Meta-VQT (NN-Meta-VQT), for efficient thermal state preparation of parametrized Hamiltonians on Noisy Intermediate-Scale Quantum (NISQ) devices. Meta-VQT utilizes a fully quantum ansatz, while NN-Meta-VQT integrates a quantum-classical hybrid architecture. Both leverage collective optimization over training sets to generalize Gibbs state preparation to unseen parameters. We validate our methods on up to 8-qubit Transverse Field Ising Models and the 2-qubit Heisenberg model with all field terms, demonstrating efficient thermal state generation beyond the training data. For larger systems, we show that our meta-learned parameters, when combined with an appropriately designed ansatz, serve as warm-start initializations, significantly outperforming random initializations in the optimization tasks. Furthermore, a 3-qubit Kitaev ring example showcases our algorithm's effectiveness across finite-temperature crossover regimes. Finally, we apply our algorithms to train a Quantum Boltzmann Machine (QBM) on a 2-qubit Heisenberg model with all field terms, achieving enhanced training efficiency, improved Gibbs state accuracy, and a 30-fold runtime speedup over existing techniques such as variational quantum imaginary-time evolution (VarQITE)-based QBMs, highlighting the scalability and practicality of meta-algorithm-based QBMs.
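The variational objective behind VQT-style thermal state preparation is the free energy F(ρ) = Tr(ρH) − T·S(ρ), which is uniquely minimized by the Gibbs state. The sketch below verifies this numerically for a toy single-qubit Hamiltonian (an assumption for illustration, not the paper's models) by comparing the Gibbs state against the maximally mixed state.

```python
import numpy as np

# Sketch of the VQT-style variational objective: the free energy
# F(rho) = Tr(rho H) - T * S(rho) is minimized exactly by the Gibbs
# state. The single-qubit Hamiltonian H = Z + 0.5 X is an assumed toy
# example, not one of the paper's benchmark models.
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X
T = 1.0

def entropy(rho):
    # von Neumann entropy S(rho) = -Tr(rho ln rho), via eigenvalues
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

def free_energy(rho):
    return np.trace(rho @ H).real - T * entropy(rho)

evals, V = np.linalg.eigh(H)
w = np.exp(-evals / T)
w /= w.sum()
rho_gibbs = (V * w) @ V.conj().T      # exact Gibbs state
rho_mixed = np.eye(2) / 2             # a competing (non-optimal) state

print(free_energy(rho_gibbs) < free_energy(rho_mixed))  # True
```

In the variational algorithms, ρ is instead produced by a parameterized circuit and F is driven down by a classical optimizer; the meta-learning layer amortizes that optimization across many (T, Hamiltonian-parameter) pairs.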
Problem

Research questions and friction points this paper is trying to address.

Efficient preparation of quantum Gibbs states for parametrized Hamiltonians
Generalizing Gibbs state preparation to unseen parameters via meta-learning
Enhancing Quantum Boltzmann Machine training efficiency and accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Meta-learning algorithms for quantum thermal state preparation
Quantum-classical hybrid architecture for Gibbs states
Warm start initializations for larger quantum systems
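The collective-optimization and warm-start ideas above can be sketched with a toy cost landscape: one shared parameter vector is trained against the average cost over a set of training Hamiltonian parameters, then reused as the initialization for an unseen parameter. The quadratic cost is a stand-in (an assumption) for the true free-energy landscape, chosen only to make the effect visible.

```python
import numpy as np

# Toy sketch of collective optimization + warm starts: meta-parameters
# theta are trained on the *mean* cost over training Hamiltonian
# parameters g, then reused for an unseen g. The quadratic cost below is
# an assumed stand-in for the real free-energy objective.
rng = np.random.default_rng(0)

def cost(theta, g):
    return np.sum((theta - g) ** 2)

train_gs = [0.2, 0.4, 0.6, 0.8]      # training Hamiltonian parameters
theta = rng.normal(size=3)           # one shared meta-parameter vector

for _ in range(200):                 # gradient descent on the mean cost
    grad = np.mean([2 * (theta - g) for g in train_gs], axis=0)
    theta -= 0.1 * grad

g_unseen = 0.5                       # parameter outside the training set
warm = cost(theta, g_unseen)
cold = np.mean([cost(rng.normal(size=3), g_unseen) for _ in range(100)])
print(warm < cold)  # True: warm start beats random init on average
```

The design choice mirrored here is that a single collectively trained parameter set lands near the optimum for nearby unseen parameters, so downstream optimization (or QBM training) starts far closer to convergence than a random initialization would.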