Local Constrained Bayesian Optimization

📅 2026-03-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses high-dimensional constrained Bayesian optimization, where the curse of dimensionality and the premature contraction of traditional trust-region methods hinder performance. The authors propose the Local Constrained Bayesian Optimization (LCBO) framework, which alternates between rapid local descent and uncertainty-driven exploration on a differentiable surrogate that incorporates constraint penalties. LCBO comes with theoretical guarantees for the high-dimensional constrained setting: a convergence rate for the Karush–Kuhn–Tucker (KKT) residual that is polynomial in the dimension, in contrast to the regret bounds of global approaches, which typically scale exponentially. Empirical evaluations on benchmarks of up to 100 dimensions show that LCBO outperforms state-of-the-art methods in both optimization performance and stability.

📝 Abstract
Bayesian optimization (BO) for high-dimensional constrained problems remains a significant challenge due to the curse of dimensionality. We propose Local Constrained Bayesian Optimization (LCBO), a novel framework tailored for such settings. Unlike trust-region methods that are prone to premature shrinking when confronting tight or complex constraints, LCBO leverages the differentiable landscape of constraint-penalized surrogates to alternate between rapid local descent and uncertainty-driven exploration. Theoretically, we prove that LCBO achieves a convergence rate for the Karush-Kuhn-Tucker (KKT) residual that depends polynomially on the dimension $d$ for common kernels under mild assumptions, offering a rigorous alternative to global BO where regret bounds typically scale exponentially. Extensive evaluations on high-dimensional benchmarks (up to 100D) demonstrate that LCBO consistently outperforms state-of-the-art baselines.
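The abstract's core loop — alternating local descent on a constraint-penalized surrogate with uncertainty-driven exploration — can be sketched in miniature. This is an illustrative toy, not the paper's algorithm: the quadratic penalty form, the fixed penalty weight `rho`, the step size, the exploration schedule, and the toy objective `f` and constraint `c` are all assumptions made for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Toy 2-D problem (hypothetical): minimize f(x) subject to c(x) <= 0.
def f(x):
    return np.sum((x - 0.3) ** 2)

def c(x):                       # feasible when c(x) <= 0, i.e. sum(x) >= 0.5
    return 0.5 - np.sum(x)

rng = np.random.default_rng(0)
d = 2
X = rng.uniform(0, 1, size=(20, d))          # initial design
yf = np.array([f(x) for x in X])
yc = np.array([c(x) for x in X])

def fit(X, y):
    return GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

rho = 10.0                                   # penalty weight (assumed constant)

def penalized(x, gpf, gpc):
    # Differentiable constraint-penalized surrogate: posterior mean of f
    # plus a quadratic penalty on the posterior mean of the constraint.
    mf = gpf.predict(x.reshape(1, -1))[0]
    mc = gpc.predict(x.reshape(1, -1))[0]
    return mf + rho * max(mc, 0.0) ** 2

def num_grad(x, gpf, gpc, eps=1e-4):
    # Central finite differences stand in for the surrogate's gradient.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (penalized(x + e, gpf, gpc) - penalized(x - e, gpf, gpc)) / (2 * eps)
    return g

x = X[np.argmin(yf + rho * np.maximum(yc, 0) ** 2)].copy()   # best penalized point
for t in range(15):
    gpf, gpc = fit(X, yf), fit(X, yc)
    if t % 5 == 4:
        # Exploration step: evaluate where the objective surrogate is most uncertain.
        cand = rng.uniform(0, 1, size=(200, d))
        _, sd = gpf.predict(cand, return_std=True)
        x_new = cand[np.argmax(sd)]
    else:
        # Local descent step on the penalized surrogate.
        x_new = np.clip(x - 0.1 * num_grad(x, gpf, gpc), 0, 1)
    X = np.vstack([X, x_new])
    yf = np.append(yf, f(x_new))
    yc = np.append(yc, c(x_new))
    if penalized(x_new, gpf, gpc) < penalized(x, gpf, gpc):
        x = x_new

best = min((fv for fv, cv in zip(yf, yc) if cv <= 1e-6), default=None)
print("best feasible f:", best)
```

The sketch omits everything that makes LCBO's theory work (the specific penalty schedule, the KKT-residual stopping criterion, and the conditions yielding polynomial-in-$d$ rates); it only shows the descent/exploration alternation in the simplest form.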
Problem

Research questions and friction points this paper is trying to address.

Bayesian optimization
high-dimensional optimization
constrained optimization
curse of dimensionality
KKT residual
Innovation

Methods, ideas, or system contributions that make the work stand out.

Local Constrained Bayesian Optimization
high-dimensional optimization
constraint handling
KKT convergence
differentiable surrogate
Jingzhe Jing
Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China
Zheyi Fan
Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China
Szu Hui Ng
Department of Industrial Systems Engineering & Management, National University of Singapore, Singapore
Qingpei Hu
Professor, Chinese Academy of Sciences
Industrial Statistics, System Reliability