🤖 AI Summary
This work targets NP-hard black-box combinatorial optimization problems in which queries are expensive or the search landscape is hard to navigate. Methodologically, it introduces the first end-to-end neural solver that integrates simulated-annealing principles into generative modeling: the black-box objective is treated as an energy function, and a temperature-conditioned generative network implicitly learns the corresponding Boltzmann distribution, evolving continuously from near-uniform sampling at high temperatures to focused sampling near optima at low temperatures. The work further proposes a differentiable temperature schedule and an end-to-end training framework that support data augmentation and implicit modeling of variable interactions. Across multiple NP-hard tasks, under both finite and unlimited query budgets, the approach significantly improves solution quality and sample efficiency over conventional black-box optimizers, extending the performance and applicability frontier of black-box optimization.
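The temperature continuum described above can be made concrete with a toy illustration (not the paper's neural model): the Boltzmann distribution p_T(x) ∝ exp(-E(x)/T) over a small discrete space, where the energies below are hypothetical objective values. At high T the distribution is near-uniform; as T shrinks it concentrates on the minimum-energy state.

```python
# Toy illustration of the Boltzmann target distribution p_T(x) ∝ exp(-E(x)/T).
# The energies are made-up objective values, not from the paper.
import math

def boltzmann(energies, T):
    """Normalized Boltzmann probabilities over discrete states at temperature T."""
    weights = [math.exp(-e / T) for e in energies]
    Z = sum(weights)  # partition function
    return [w / Z for w in weights]

energies = [0.0, 1.0, 2.0, 3.0]  # hypothetical objective values; minimum at index 0

p_hot = boltzmann(energies, T=100.0)  # high temperature: near-uniform
p_cold = boltzmann(energies, T=0.1)   # low temperature: peaked at the minimum

print(p_hot)   # all four probabilities close to 0.25
print(p_cold)  # probability mass concentrated on the lowest-energy state
```

The paper's solver conditions a single generative network on T so that this whole family of distributions is represented at once, rather than one fixed distribution per temperature.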
📝 Abstract
We propose a generative, end-to-end solver for black-box combinatorial optimization that emphasizes both sample efficiency and solution quality on NP problems. Drawing inspiration from annealing-based algorithms, we treat the black-box objective as an energy function and train a neural network to model the associated Boltzmann distribution. By conditioning on temperature, the network captures a continuum of distributions--from near-uniform at high temperatures to sharply peaked around global optima at low temperatures--thereby learning the structure of the energy landscape and facilitating global optimization. When queries are expensive, the temperature-dependent distributions naturally enable data augmentation and improve sample efficiency. When queries are cheap but the problem remains hard, the model learns implicit variable interactions, effectively "opening" the black box. We validate our approach on challenging combinatorial tasks under both limited and unlimited query budgets, showing competitive performance against state-of-the-art black-box optimizers.
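The annealing principle the abstract draws on can be sketched with classical simulated annealing on a toy pseudo-Boolean objective. This is a minimal baseline under illustrative assumptions (the objective, cooling schedule, and parameters are all invented here), not the paper's neural solver:

```python
# Classical simulated annealing on a toy binary objective, shown only to
# illustrate the annealing principle the proposed solver builds on.
import math
import random

def energy(x):
    # Hypothetical black-box objective: count of zero bits, so the
    # global optimum is the all-ones string with energy 0.
    return sum(1 - b for b in x)

def simulated_annealing(n=20, steps=2000, t_start=2.0, t_end=0.01, seed=0):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = energy(x)
    for step in range(steps):
        # Geometric cooling from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / (steps - 1))
        i = rng.randrange(n)
        x[i] ^= 1                      # propose a single-bit flip
        e_new = energy(x)
        # Metropolis rule: always accept improvements; accept uphill moves
        # with probability exp(-(E_new - E)/T), which vanishes as T -> 0.
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new
        else:
            x[i] ^= 1                  # revert the flip
    return x, e

best_x, best_e = simulated_annealing()
print(best_e)  # energy of the final state (0 at the global optimum)
```

Where this classical loop re-explores the landscape from scratch on every run, the paper's approach amortizes the same high-to-low-temperature progression into a single temperature-conditioned generative network trained on queried evaluations.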