🤖 AI Summary
This paper addresses the challenge of global optimization of black-box functions over mixed (continuous and discrete) input spaces in Bayesian optimization. The authors propose BARK, a framework that constructs a fully Bayesian Gaussian process surrogate from a kernel derived from Bayesian additive regression trees (BART). Its core contributions are threefold: (i) it captures uncertainty over the tree structure itself, jointly sampling function values and tree topology via MCMC, where previous tree-based kernels quantify uncertainty only over function values; (ii) its tree-agreement kernel defines a piecewise-constant surrogate whose acquisition functions can be optimized globally; and (iii) it natively supports optimization over mixed-variable domains. Empirical evaluation on synthetic and real-world benchmarks shows that BARK outperforms state-of-the-art tree-based methods (e.g., SMAC, TPE) and standard Gaussian process approaches, with particularly strong robustness and convergence in high-dimensional mixed-variable settings.
📝 Abstract
We perform Bayesian optimization using a Gaussian process perspective on Bayesian Additive Regression Trees (BART). Our BART Kernel (BARK) uses tree agreement to define a posterior over piecewise-constant functions, and we explore the space of tree kernels using a Markov chain Monte Carlo approach. Where BART only samples functions, the resulting BARK model obtains samples of Gaussian processes defining distributions over functions, which allow us to build acquisition functions for Bayesian optimization. Our tree-based approach enables global optimization over the surrogate, even for mixed-feature spaces. Moreover, where many previous tree-based kernels provide uncertainty quantification over function values, our sampling scheme captures uncertainty over the tree structure itself. Our experiments show the strong performance of BARK on both synthetic and applied benchmarks, due to the combination of our fully Bayesian surrogate and the optimization procedure.
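The tree-agreement idea at the heart of the kernel can be sketched in a few lines. This is an illustrative simplification, not the paper's implementation: the tree encoding (a sequence of axis-aligned splits whose decision bit-pattern identifies a cell), the function names, and the uniform weighting over trees are all assumptions made for clarity; BARK itself samples trees from a BART-style prior via MCMC.

```python
def leaf_index(tree, x):
    """Identify the cell of `x`: each (feature, threshold) split contributes
    one bit, so equal indices mean the two inputs agree on every split."""
    idx = 0
    for feature, threshold in tree:
        idx = (idx << 1) | int(x[feature] <= threshold)
    return idx

def tree_agreement_kernel(trees, x, x_prime):
    """Fraction of trees on which x and x_prime fall in the same leaf.
    This yields a positive semi-definite, piecewise-constant kernel."""
    agree = sum(leaf_index(t, x) == leaf_index(t, x_prime) for t in trees)
    return agree / len(trees)

# Toy example: two depth-one trees ("stumps") splitting on different features.
trees = [[(0, 0.5)], [(1, 0.3)]]
print(tree_agreement_kernel(trees, [0.2, 0.1], [0.4, 0.2]))  # agree on both trees -> 1.0
print(tree_agreement_kernel(trees, [0.2, 0.1], [0.9, 0.2]))  # agree on one tree  -> 0.5
```

Because the kernel is constant on each cell of the partition induced by the trees, the surrogate posterior is piecewise constant, which is what makes exact global optimization of the acquisition function over mixed feature spaces tractable.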