On the interplay between prior weight and variance of the robustification component in Robust Mixture Prior Bayesian Dynamic Borrowing approach

📅 2025-09-01
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In robust Bayesian dynamic borrowing with mixture priors, prior weights and the variance of the robust component are often tuned independently, overlooking their interdependent impact on posterior inference. Method: We propose a joint optimization framework that integrates Bayesian mixture modeling, asymptotic analysis, and posterior inference to systematically investigate their synergistic effects. Through theoretical analysis and simulation studies, we identify feasible weight–variance combinations yielding approximately equivalent posteriors and demonstrate that increasing the robust component’s variance mitigates Lindley’s paradox, improves Type I error control, and enhances location-parameter robustness. Contribution/Results: Our framework establishes a principled, operational procedure for hyperparameter selection. It significantly improves statistical efficiency and robustness in mixed-control trials, particularly under high-dimensional or small-sample settings, offering a novel paradigm for cross-group information borrowing in complex clinical trial designs.

Technology Category

Application Category

📝 Abstract
Robust Mixture Prior (RMP) is a popular Bayesian dynamic borrowing method, which combines an informative historical distribution with a less informative component (referred as robustification component) in a mixture prior to enhance the efficiency of hybrid-control randomized trials. Current practice typically focuses solely on the selection of the prior weight that governs the relative influence of these two components, often fixing the variance of the robustification component to that of a single observation. In this study we demonstrate that the performance of RMPs critically depends on the joint selection of both weight and variance of the robustification component. In particular, we show that a wide range of weight-variance pairs can yield practically identical posterior inferences (in particular regions of the parameter space) and that large variance robust components may be employed without incurring in the so called Lindley's paradox. We further show that the use of large variance robustification components leads to improved asymptotic Type I error control and enhanced robustness of the RMP to the specification of the location parameter of the robustification component. Finally, we leverage these theoretical results to propose a novel and practical hyper-parameter elicitation routine.
Problem

Research questions and friction points this paper is trying to address.

Joint selection of weight and variance in Robust Mixture Prior
Avoiding Lindley's paradox with large variance robust components
Improving Type I error control through variance specification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Joint selection of weight and variance parameters
Large variance robust components avoiding Lindley's paradox
Novel hyper-parameter elicitation routine implementation
🔎 Similar Papers
No similar papers found.
M
Marco Ratta
Department of Mathematical Sciences, Polytechnic University of Turin
G
Gaelle Saint-Hilary
Department of Statistical Methodology, Saryga
Mauro Gasparini
Mauro Gasparini
Unknown affiliation
Pavel Mozgunov
Pavel Mozgunov
Programme Leader Track at MRC Biostatistics Unit
Early Phase Clinical TrialsAdaptive DesignsMedical StatisticsBiostatisticsDose Finding