Contrastive Normalizing Flows for Uncertainty-Aware Parameter Estimation

📅 2025-05-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
In high-energy physics and similar domains, systematic uncertainties—such as detector calibration biases—induce distributional shifts that degrade parameter estimation accuracy and uncertainty calibration. To address this, we propose Contrastive Normalizing Flows (CNF), the first framework integrating contrastive learning with normalizing flows. CNF directly models the likelihood ratio between parameters via a tunable contrastive distribution, bypassing computationally prohibitive high-dimensional parameter-grid simulations. It combines an embedded joint data-parameter mapping with a binary classifier-based likelihood-ratio approximation, augmented by frequentist statistical post-processing. Theoretically grounded and distributionally robust, CNF enables uncertainty-aware inference. Evaluated on the HiggsML Uncertainty Challenge dataset, CNF achieves significant improvements in estimation accuracy and uncertainty calibration, demonstrating strong robustness against diverse distributional distortions. This work establishes a novel paradigm for physics parameter estimation under systematic uncertainties.

📝 Abstract
Estimating physical parameters from data is a crucial application of machine learning (ML) in the physical sciences. However, systematic uncertainties, such as detector miscalibration, induce data distribution distortions that can erode statistical precision. In both high-energy physics (HEP) and broader ML contexts, achieving uncertainty-aware parameter estimation under these domain shifts remains an open problem. In this work, we address this challenge of uncertainty-aware parameter estimation for a broad set of tasks critical for HEP. We introduce a novel approach based on Contrastive Normalizing Flows (CNFs), which achieves top performance on the HiggsML Uncertainty Challenge dataset. Building on the insight that a binary classifier can approximate the model parameter likelihood ratio, we address the practical limitations of expressivity and the high cost of simulating high-dimensional parameter grids by embedding data and parameters in a learned CNF mapping. This mapping yields a tunable contrastive distribution that enables robust classification under shifted data distributions. Through a combination of theoretical analysis and empirical evaluations, we demonstrate that CNFs, when coupled with a classifier and established frequentist techniques, provide principled parameter estimation and uncertainty quantification through classification that is robust to data distribution distortions.
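The abstract's key insight, that a binary classifier trained to distinguish samples drawn under two parameter hypotheses approximates their likelihood ratio, can be illustrated with a minimal sketch. This is a generic demonstration of the likelihood-ratio trick on 1-D Gaussians using scikit-learn, not the paper's code; all names here are illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Samples under two hypotheses: p0 = N(0, 1) and p1 = N(1, 1).
x0 = rng.normal(0.0, 1.0, size=20000)
x1 = rng.normal(1.0, 1.0, size=20000)

# Train a classifier to separate the two samples (label 1 for p1).
X = np.concatenate([x0, x1]).reshape(-1, 1)
y = np.concatenate([np.zeros_like(x0), np.ones_like(x1)])
clf = LogisticRegression().fit(X, y)

# Likelihood-ratio trick: r(x) = p1(x)/p0(x) ~ s(x) / (1 - s(x)),
# where s(x) is the classifier's predicted probability of class 1.
s = clf.predict_proba(np.array([[0.5]]))[0, 1]
est_ratio = s / (1.0 - s)

# For these Gaussians the true ratio is exp(x - 0.5), so it equals 1 at x = 0.5.
true_ratio = 1.0
print(est_ratio, true_ratio)
```

The approximation improves with sample size and classifier capacity; the paper's contribution addresses exactly the regime where naively simulating a high-dimensional grid of parameter hypotheses for such classifiers becomes prohibitive.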
Problem

Research questions and friction points this paper is trying to address.

Estimating physical parameters under data distortions
Addressing uncertainty-aware parameter estimation in HEP
Robust classification under shifted data distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Contrastive Normalizing Flows for parameter estimation
Embeds data in learned CNF mapping for robustness
Combines CNFs with classifiers for uncertainty quantification
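As background for the normalizing-flow component listed above: a flow maps a simple base density through an invertible transform and evaluates likelihoods with the change-of-variables formula. A minimal NumPy sketch with a single affine transform (purely illustrative, not the paper's architecture):

```python
import numpy as np

def log_base(z):
    # Base density: standard normal, log N(z; 0, 1).
    return -0.5 * (z**2 + np.log(2 * np.pi))

# Invertible affine transform x = a*z + b, inverse z = (x - b) / a.
a, b = 2.0, 1.0

def log_likelihood(x):
    # Change of variables: log p(x) = log p_base(z) - log |dx/dz|.
    z = (x - b) / a
    return log_base(z) - np.log(abs(a))

# Sanity check: this equals the log-density of N(b, a^2).
x = 1.5
expected = -0.5 * ((x - b) / a) ** 2 - 0.5 * np.log(2 * np.pi * a**2)
print(log_likelihood(x), expected)
```

In a learned flow the affine parameters are replaced by neural networks conditioned on data and parameters, but the likelihood computation follows the same formula.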
Ibrahim Elsharkawy
Department of Physics, University of Illinois Urbana-Champaign

Yonatan Kahn
Assistant Professor, Physics