🤖 AI Summary
Existing synthetic data generation methods require users to specify point estimates for critical parameters, such as treatment effects and confounding biases, in advance. They neither characterize the uncertainty in these parameters nor provide mechanisms for updating them from source data, which compromises the reliability of causal estimator evaluation. To address this, we propose SBICE (Simulation-Based Inference for Causal Evaluation), a simulation-based inference framework that models generative parameters as random variables and infers their posterior distributions directly from observed source data. This enables adaptive parameter refinement and principled uncertainty quantification. SBICE supports both posterior inference and the generation of distributionally faithful synthetic data. Empirical results demonstrate that SBICE-generated data better approximate the true data distribution and significantly improve the reliability and robustness of causal estimator evaluation across multiple benchmark tasks.
📝 Abstract
Generating synthetic datasets that accurately reflect real-world observational data is critical for evaluating causal estimators, but remains a challenging task. Existing generative methods address it by producing synthetic datasets anchored in the observed data (source data) while allowing variation in key parameters such as the treatment effect and the amount of confounding bias. However, these methods typically require users to supply such parameters as fixed point estimates rather than as distributions that can be refined against the source data. Users therefore cannot express uncertainty over parameter values, and posterior inference is ruled out, which can lead to unreliable estimator comparisons. We introduce simulation-based inference for causal evaluation (SBICE), a framework that models generative parameters as uncertain and infers their posterior distribution given a source dataset. Leveraging techniques from simulation-based inference, SBICE identifies parameter configurations that produce synthetic datasets closely aligned with the source data distribution. Empirical results demonstrate that SBICE improves the reliability of estimator evaluations by generating more realistic datasets, supporting a robust and data-consistent approach to causal benchmarking under uncertainty.
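To make the core idea concrete, here is a minimal sketch of simulation-based inference over generative parameters, using a plain approximate Bayesian computation (ABC) rejection loop rather than the paper's actual SBICE machinery. The toy simulator (a single confounder influencing both treatment and outcome), the priors over the treatment effect `tau` and confounding strength `gamma`, the summary statistics, and the tolerance are all illustrative assumptions, not components of the published method.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(tau, gamma, n=500):
    """Toy observational dataset: confounder u affects both treatment and outcome."""
    u = rng.normal(size=n)                                   # unobserved confounder
    t = (rng.normal(size=n) + gamma * u > 0).astype(float)   # confounded treatment
    y = tau * t + gamma * u + rng.normal(size=n)             # outcome
    return t, y

def summary(t, y):
    """Summary statistics used to compare simulated and source datasets."""
    return np.array([y.mean(), y.std(), y[t == 1].mean() - y[t == 0].mean()])

# "Source" data generated with hidden ground-truth parameters (tau=2.0, gamma=1.0).
s_obs = summary(*simulate(2.0, 1.0))

# ABC rejection sampling: draw parameters from the prior, simulate a dataset,
# and keep the draw only if its summaries fall within eps of the observed ones.
# The surviving draws approximate the posterior over (tau, gamma).
accepted = []
for _ in range(20000):
    tau = rng.uniform(0.0, 4.0)    # prior over the treatment effect
    gamma = rng.uniform(0.0, 2.0)  # prior over the confounding strength
    if np.linalg.norm(summary(*simulate(tau, gamma)) - s_obs) < 0.5:
        accepted.append((tau, gamma))

post = np.array(accepted)
print(f"{post.shape[0]} accepted draws; "
      f"posterior mean tau ~ {post[:, 0].mean():.2f}, "
      f"gamma ~ {post[:, 1].mean():.2f}")
```

Any accepted parameter configuration can then be fed back into the simulator to generate synthetic datasets that are, by construction, consistent with the source data, which is the property SBICE exploits for estimator benchmarking.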