A Generative Neural Annealer for Black-Box Combinatorial Optimization

📅 2025-05-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses NP-hard black-box combinatorial optimization problems in which queries are expensive, the problem itself is hard, or both. Methodologically, it introduces the first end-to-end neural solver that integrates simulated-annealing principles into generative modeling: the objective function is treated as an energy function, and a temperature-conditioned generative network is trained to implicitly learn the corresponding Boltzmann distribution—evolving continuously from near-uniform sampling at high temperatures to focused sampling near the optima at low temperatures. The authors propose a differentiable temperature schedule and an end-to-end training framework that supports data augmentation and implicit modeling of variable interactions. Compared to conventional black-box optimizers, the approach achieves significant improvements in solution quality and sample efficiency across multiple NP-hard tasks, under both finite and infinite query budgets, thereby extending the performance and applicability frontiers of black-box optimization.
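The temperature dependence described above can be made concrete on a toy problem: for a tiny search space, the Boltzmann distribution p_T(x) ∝ exp(−E(x)/T) can be enumerated exactly, showing probability mass drift from near-uniform at high T to concentrated on the optimum at low T. The 4-bit energy function below is an illustrative assumption, not taken from the paper:

```python
import numpy as np

# Toy "black-box" energy over 4-bit strings: E(x) = Hamming distance to the
# (unknown to the sampler) optimum 1010. This energy is an assumption for
# illustration only; the paper's benchmarks are harder NP-hard tasks.
OPT = np.array([1, 0, 1, 0])

def energy(x):
    return int(np.sum(x != OPT))

# Enumerate all 2^4 configurations and form the exact Boltzmann
# distribution p_T(x) ∝ exp(-E(x)/T) at several temperatures.
configs = np.array([[(i >> b) & 1 for b in range(4)] for i in range(16)])
energies = np.array([energy(c) for c in configs])

def boltzmann(T):
    w = np.exp(-energies / T)
    return w / w.sum()

for T in [10.0, 1.0, 0.1]:
    p = boltzmann(T)
    print(f"T={T:>4}: P(optimum) = {p[energies.argmin()]:.3f}")
```

At T = 10 the optimum holds only a small fraction of the probability mass, while at T = 0.1 nearly all mass sits on it — the continuum the temperature-conditioned network is trained to capture.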

📝 Abstract
We propose a generative, end-to-end solver for black-box combinatorial optimization that emphasizes both sample efficiency and solution quality on NP problems. Drawing inspiration from annealing-based algorithms, we treat the black-box objective as an energy function and train a neural network to model the associated Boltzmann distribution. By conditioning on temperature, the network captures a continuum of distributions, from near-uniform at high temperatures to sharply peaked around global optima at low temperatures, thereby learning the structure of the energy landscape and facilitating global optimization. When queries are expensive, the temperature-dependent distributions naturally enable data augmentation and improve sample efficiency. When queries are cheap but the problem remains hard, the model learns implicit variable interactions, effectively "opening" the black box. We validate our approach on challenging combinatorial tasks under both limited and unlimited query budgets, showing competitive performance against state-of-the-art black-box optimizers.
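As a rough sketch of the training idea, even a factorized Bernoulli model can be fit to the Boltzmann distribution at a fixed temperature by minimizing KL(q_θ ‖ p_T) = E_q[log q_θ(x) + E(x)/T] + const with a score-function (REINFORCE) gradient. Everything below — the random QUBO-style energy, the factorized model, and the single fixed temperature — is a simplifying assumption standing in for the paper's temperature-conditioned generative network:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # number of binary variables

# Black-box energy: a random QUBO-style objective (an assumption for
# illustration; the paper evaluates on NP-hard benchmark tasks).
J = rng.normal(size=(n, n))
J = (J + J.T) / 2

def energy(x):            # x in {0,1}^n
    s = 2 * x - 1         # map to spins in {-1,+1}
    return s @ J @ s

# Factorized Bernoulli model q_theta(x) = prod_i p_i^{x_i} (1-p_i)^{1-x_i},
# a deliberately minimal stand-in for the paper's conditional network.
theta = np.zeros(n)

def sample(batch):
    p = 1 / (1 + np.exp(-theta))
    x = (rng.random((batch, n)) < p).astype(float)
    logq = (x * np.log(p + 1e-12) + (1 - x) * np.log(1 - p + 1e-12)).sum(1)
    return x, logq, p

# Minimize KL(q || p_T) = E_q[log q(x) + E(x)/T] + const via a
# score-function gradient with a mean baseline to reduce variance.
T, lr = 0.5, 0.05
E0 = np.mean([energy(xi) for xi in sample(256)[0]])
for step in range(1000):
    x, logq, p = sample(256)
    f = logq + np.array([energy(xi) for xi in x]) / T
    adv = f - f.mean()
    grad = (adv[:, None] * (x - p)).mean(0)   # d log q / d theta = x - p
    theta -= lr * grad
E1 = np.mean([energy(xi) for xi in sample(256)[0]])
print(f"mean energy: {E0:.2f} -> {E1:.2f}")
```

The mean sampled energy drops as q_θ concentrates on low-energy configurations. In the paper's setting, conditioning on T and annealing it replaces the single fixed temperature used here.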
Problem

Research questions and friction points this paper is trying to address.

Solving black-box combinatorial optimization efficiently
Modeling Boltzmann distribution for global optimization
Improving sample efficiency in NP problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generative neural network models Boltzmann distribution
Temperature conditioning enables global optimization
Learns implicit variable interactions for black-box
Yuan-Hang Zhang
Department of Physics, University of California, San Diego
AI for physics · Unconventional computing · MemComputing
Massimiliano Di Ventra
Department of Physics, University of California, San Diego