Bayesian Calibration for Prediction in a Multi-Output Transposition Context

📅 2024-09-30
🏛️ International Journal for Uncertainty Quantification
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of uncertainty quantification for multi-output numerical simulators under “transposed settings,” where measurements are available for only a subset of outputs. We propose a hierarchical Bayesian calibration framework that jointly models simulator discrepancies across outputs. Crucially, we embed hyperparameterized priors and auxiliary numerical input parameters within the hierarchical structure—enabling reliable prediction and uncertainty calibration for unobserved outputs for the first time. Unlike conventional independent-output calibration approaches, our method explicitly captures inter-output discrepancy correlations, thereby preserving cross-output information and avoiding fragmentation. Experiments on the Taylor cylinder impact test (a three-output benchmark) demonstrate substantial improvements: average RMSE for unobserved outputs decreases by 32%, and prediction interval coverage probability (PICP) increases to 94%. The proposed framework establishes a new paradigm for trustworthy prediction in high-fidelity, multiphysics simulation.

📝 Abstract
Numerical simulations are widely used to predict the behavior of physical systems, with Bayesian approaches being particularly well suited for this purpose. However, experimental observations are necessary to calibrate certain simulator parameters for prediction. In this work, we use a multi-output simulator to predict all of its outputs, including those that have never been experimentally observed. This situation is referred to as the transposition context. To accurately quantify the discrepancy between model outputs and real data in this context, conventional methods cannot be applied, and Bayesian calibration must be augmented by incorporating a joint model error across all outputs. To achieve this, the proposed method considers additional numerical input parameters within a hierarchical Bayesian model, which includes hyperparameters for the prior distribution of the calibration variables. This approach is applied to a computer code with three outputs that models the Taylor cylinder impact test with a small number of observations. Each output is treated in turn as the observed variable, yielding three different transposition situations. The proposed method is compared with other approaches that embed model errors to demonstrate the significance of the hierarchical formulation.
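The key mechanism behind a joint model error can be illustrated with a small Gaussian-conditioning sketch: if discrepancies across outputs are modeled as a correlated Gaussian vector, observing the residual on one output updates the predicted discrepancy on the unobserved outputs. All numbers below (correlation, standard deviations, observed residual) are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical cross-output discrepancy covariance for a 3-output simulator.
# The correlation structure is an illustrative assumption.
rho = 0.8
sigma = np.array([0.5, 0.7, 0.6])  # marginal discrepancy std per output
corr = np.array([[1.0, rho, rho],
                 [rho, 1.0, rho],
                 [rho, rho, 1.0]])
C = np.outer(sigma, sigma) * corr

# Suppose output 0 is the observed one: its residual (data - simulator) is known.
obs_idx, unobs_idx = [0], [1, 2]
residual_obs = np.array([0.4])

# Gaussian conditioning (zero prior mean):
#   E[e_u | e_o]   = C_uo C_oo^{-1} e_o
#   Cov[e_u | e_o] = C_uu - C_uo C_oo^{-1} C_ou
C_oo = C[np.ix_(obs_idx, obs_idx)]
C_uo = C[np.ix_(unobs_idx, obs_idx)]
mean_unobs = C_uo @ np.linalg.solve(C_oo, residual_obs)
cov_unobs = C[np.ix_(unobs_idx, unobs_idx)] - C_uo @ np.linalg.solve(C_oo, C_uo.T)
```

An independent-output treatment would leave the unobserved discrepancies at their zero prior mean with full prior variance; the correlated model instead shifts their mean toward the observed residual and shrinks their variance.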
Problem

Research questions and friction points this paper is trying to address.

Calibrating multi-output simulators for unobserved physical system predictions
Quantifying model-data discrepancy in transposition contexts using Bayesian methods
Developing hierarchical Bayesian models with joint error across all outputs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian calibration augmented with joint model error
Hierarchical Bayesian model with additional input parameters
Prediction of unobserved outputs with a multi-output simulator
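A minimal sketch of the hierarchical idea, assuming a toy three-output simulator, a Gaussian hierarchical prior on the calibration variable with a hyperparameter on its mean, and a random-walk Metropolis sampler. None of this is the paper's actual model or code; it only shows how observations on a single output calibrate a parameter that then drives predictions for all outputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    """Hypothetical 3-output simulator standing in for the Taylor-test code."""
    return np.array([2.0 * theta, theta ** 2, np.sin(theta)])

# Synthetic observations on output 0 only (the "transposition" setting).
theta_true, noise_sd = 1.2, 0.05
y_obs = simulator(theta_true)[0] + noise_sd * rng.normal(size=10)

def log_post(theta, mu, tau=0.5):
    # Likelihood uses only the observed output.
    ll = -0.5 * np.sum((y_obs - simulator(theta)[0]) ** 2) / noise_sd ** 2
    # Hierarchical prior: theta ~ N(mu, tau^2), hyperprior mu ~ N(0, 1).
    lp = -0.5 * (theta - mu) ** 2 / tau ** 2 - 0.5 * mu ** 2
    return ll + lp

# Random-walk Metropolis over (theta, mu).
theta, mu = 0.0, 0.0
lp = log_post(theta, mu)
samples = []
for i in range(4000):
    th_p, mu_p = theta + 0.05 * rng.normal(), mu + 0.2 * rng.normal()
    lp_p = log_post(th_p, mu_p)
    if np.log(rng.uniform()) < lp_p - lp:
        theta, mu, lp = th_p, mu_p, lp_p
    if i >= 1000:  # discard burn-in
        samples.append(theta)

theta_hat = float(np.mean(samples))
# Calibrated theta_hat then yields predictions for the unobserved outputs:
pred_unobserved = simulator(theta_hat)[1:]
```

The hyperparameter `mu` is sampled jointly with `theta`, so the prior on the calibration variable adapts to the data rather than being fixed, which is the role the hyperparameterized priors play in the hierarchical formulation.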
👥 Authors
Charlie Sire (Inria Saclay Centre, Palaiseau, France)
Josselin Garnier (Ecole Polytechnique, Applied Mathematics)
Cédric Durantin (CEA, DAM, DIF, F-91297 Arpajon, France)
Baptiste Kerleguer (CEA, DAM, DIF, F-91297 Arpajon, France)
Gilles Defaux (CEA, DAM, DIF, F-91297 Arpajon, France)
Guillaume Perrin (COSYS, Universite Gustave Eiffel, Marne-La-Vallée, France)