Natural Evolutionary Search meets Probabilistic Numerics

📅 2025-07-09
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
To address the low sample efficiency of Natural Evolution Strategies (NES) in black-box optimization, this paper proposes ProbNES, the first probabilistic NES algorithm to integrate Bayesian quadrature into the NES framework. ProbNES models the objective function with a prior distribution and uses probabilistic inference to estimate the natural gradient, markedly improving gradient-estimation accuracy and sample efficiency in zeroth-order optimization. It requires no first-order information and is particularly effective for continuous optimization problems with structured priors. Extensive experiments across benchmark functions, data-driven optimization, user-informed hyperparameter tuning, and robotic locomotion show that ProbNES consistently outperforms standard NES, CMA-ES, and Bayesian optimization, supporting its efficiency and generalizability.

πŸ“ Abstract
Zeroth-order local optimisation algorithms are essential for solving real-valued black-box optimisation problems. Among these, Natural Evolution Strategies (NES) represent a prominent class, particularly well-suited for scenarios where prior distributions are available. By optimising the objective function in the space of search distributions, NES algorithms naturally integrate prior knowledge during initialisation, making them effective in settings such as semi-supervised learning and user-prior belief frameworks. However, due to their reliance on random sampling and Monte Carlo estimates, NES algorithms can suffer from limited sample efficiency. In this paper, we introduce a novel class of algorithms, termed Probabilistic Natural Evolutionary Strategy Algorithms (ProbNES), which enhance the NES framework with Bayesian quadrature. We show that ProbNES algorithms consistently outperform their non-probabilistic counterparts as well as global sample-efficient methods such as Bayesian Optimisation (BO) or $\pi$BO across a wide range of tasks, including benchmark test functions, data-driven optimisation tasks, user-informed hyperparameter tuning tasks and locomotion tasks.
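To make the baseline concrete, the following is a minimal sketch of the Monte Carlo natural-gradient step in a plain separable NES, the estimator whose sample inefficiency the paper targets. This is not the paper's ProbNES algorithm: ProbNES would replace the Monte Carlo average below with a Bayesian-quadrature estimate of the same gradient. Function names, learning rates, and the rank-based fitness shaping are illustrative assumptions.

```python
import numpy as np

def nes_step(f, mu, sigma, pop_size=20, lr_mu=0.1, lr_sigma=0.05, rng=None):
    """One Monte Carlo natural-gradient step of a simple separable NES.

    Illustrative sketch only: ProbNES replaces the Monte Carlo averages
    below with Bayesian-quadrature estimates of the same gradients.
    """
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.standard_normal((pop_size, mu.size))   # z ~ N(0, I)
    samples = mu + sigma * eps                        # candidate solutions
    fitness = np.array([f(x) for x in samples])
    # Rank-based fitness shaping (a standard NES robustness trick)
    ranks = fitness.argsort().argsort()
    utils = ranks / (pop_size - 1) - 0.5              # higher fitness -> higher utility
    # Monte Carlo estimates of the natural gradient w.r.t. mu and log sigma
    grad_mu = utils @ eps / pop_size
    grad_log_sigma = utils @ (eps**2 - 1) / pop_size
    mu = mu + lr_mu * sigma * grad_mu
    sigma = sigma * np.exp(lr_sigma * grad_log_sigma)
    return mu, sigma

# Usage: maximise -||x||^2 (optimum at the origin)
rng = np.random.default_rng(0)
mu, sigma = np.full(3, 2.0), np.full(3, 1.0)
for _ in range(200):
    mu, sigma = nes_step(lambda x: -np.sum(x**2), mu, sigma, rng=rng)
```

Because each step averages over a fresh random population, many objective evaluations are spent per gradient estimate; a probabilistic treatment of those evaluations is what ProbNES uses to improve sample efficiency.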
Problem

Research questions and friction points this paper is trying to address.

Enhancing sample efficiency in black-box optimization
Integrating prior knowledge into evolutionary strategies
Outperforming existing methods in diverse optimization tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates Bayesian quadrature into NES
Enhances sample efficiency with ProbNES
Outperforms non-probabilistic and global methods
Pierre Osselin
Machine Learning Research Group, University of Oxford
Masaki Adachi
Group Manager, Toyota Motor Corporation
Machine Learning · Bayesian optimization · Materials Science
Xiaowen Dong
University of Oxford
signal processing · machine learning · network science · computational social science
Michael A. Osborne
Machine Learning Research Group, University of Oxford