🤖 AI Summary
This work addresses the problem of estimating small quantile sets for expensive black-box multivariate functions under mixed deterministic and uncertain inputs—specifically, identifying the set of deterministic inputs for which the probability of the output falling within a target region remains below a prescribed threshold. To overcome the low efficiency and poor accuracy of conventional approaches, we propose the Expected Estimation Modifier (EEM) criterion and introduce a novel Bayesian active learning framework that synergistically integrates sequential Monte Carlo with batch-sequential sampling, coupled with Gaussian process modeling and rigorous uncertainty quantification. Evaluated on multiple synthetic benchmarks and an industrial case study involving the ROTOR37 compressor, the method achieves high-accuracy quantile set localization with significantly fewer function evaluations—improving convergence speed by over 40%—while enhancing both the efficiency and robustness of critical input identification.
📝 Abstract
Given a multivariate function taking deterministic and uncertain inputs, we consider the problem of estimating a quantile set: a set of deterministic inputs for which the probability that the output belongs to a specific region remains below a given threshold. To solve this problem in the context of expensive-to-evaluate black-box functions, we propose a Bayesian active learning strategy based on Gaussian process modeling. The strategy is driven by a novel sampling criterion, which belongs to a broader principle that we refer to as Expected Estimator Modification (EEM). More specifically, the strategy relies on a novel sampling criterion combined with a sequential Monte Carlo framework that enables the construction of batch-sequential designs for the efficient estimation of small quantile sets. The performance of the strategy is illustrated on several synthetic examples and an industrial application case involving the ROTOR37 compressor model.