Kernel Learning for Sample Constrained Black-Box Optimization

📅 2025-07-28
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
For high-dimensional black-box optimization (BBO) under extremely limited sample budgets, this paper proposes a Gaussian process (GP) kernel learning method that constructs a continuous kernel space via a variational autoencoder (VAE). The core innovation lies in mapping kernel functions into the VAE’s latent space, enabling compact representation and differentiable optimization of kernel structures; an auxiliary optimizer is further introduced to efficiently search for the optimal kernel tailored to the target function within this latent space. Unlike conventional approaches relying on discrete kernel enumeration or heuristic design, our method substantially improves GP generalization and optimization efficiency in the small-sample regime. Experiments on synthetic benchmarks and real-world tasks—including auditory device personalization and generative model hyperparameter tuning—demonstrate that our approach reduces sampling requirements by 30–50% on average compared to state-of-the-art methods, validating its effectiveness and practical utility.

Technology Category

Application Category

📝 Abstract
Black box optimization (BBO) focuses on optimizing unknown functions in high-dimensional spaces. In many applications, sampling the unknown function is expensive, imposing a tight sample budget. Ongoing work is making progress on reducing the sample budget by learning the shape/structure of the function, known as kernel learning. We propose a new method to learn the kernel of a Gaussian Process. Our idea is to create a continuous kernel space in the latent space of a variational autoencoder, and run an auxiliary optimization to identify the best kernel. Results show that the proposed method, Kernel Optimized Blackbox Optimization (KOBO), outperforms state of the art by estimating the optimal at considerably lower sample budgets. Results hold not only across synthetic benchmark functions but also in real applications. We show that a hearing aid may be personalized with fewer audio queries to the user, or a generative model could converge to desirable images from limited user ratings.
Problem

Research questions and friction points this paper is trying to address.

Optimizing unknown functions with limited samples
Learning Gaussian Process kernels efficiently
Reducing sample costs in real-world applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

Continuous kernel space in VAE latent space
Auxiliary optimization for best kernel identification
KOBO outperforms state-of-the-art with fewer samples
🔎 Similar Papers
No similar papers found.