Uncertainty Calibration for Counterfactual Propensity Estimation in Recommendation

📅 2023-03-23
🏛️ IEEE Transactions on Knowledge and Data Engineering
📈 Citations: 0
Influential: 0
🤖 AI Summary
In recommender systems, post-click conversion rate (CVR) prediction suffers from severe selection bias caused by users' self-selection behavior and the system's item exposure process. Existing inverse propensity scoring (IPS) methods are limited by uncertainty in propensity score estimation, which degrades their debiasing performance. This paper proposes an uncertainty-calibrated counterfactual propensity score estimation framework. We introduce the Expected Calibration Error (ECE) as a model-agnostic metric for evaluating propensity score quality and design a corresponding uncertainty calibration mechanism. We theoretically prove that calibrated propensity scores yield tighter bounds on both bias and generalization error. Experiments on the Coat, Yahoo, and KuaiRand datasets demonstrate that our method significantly reduces ECE, improves CVR prediction accuracy and robustness, and substantially enhances the debiasing efficacy of IPS.
📝 Abstract
Post-click conversion rate (CVR) is a reliable indicator of online customers' preferences, making it crucial for developing recommender systems. A major challenge in predicting CVR is severe selection bias, arising from users' inherent self-selection behavior and the system's item selection process. To mitigate this issue, the inverse propensity score (IPS) is employed to weight the prediction error of each observed instance. However, current propensity score estimations are unreliable due to the lack of a quality measure. To address this, we evaluate the quality of propensity scores from the perspective of uncertainty calibration, proposing the use of Expected Calibration Error (ECE) as a measure of propensity-score quality, which quantifies the extent to which predicted probabilities are overconfident by assessing the difference between predicted probabilities and actual observed frequencies. Miscalibrated propensity scores can lead to distorted IPS weights, thereby compromising the debiasing process in CVR prediction. In this paper, we introduce a model-agnostic calibration framework for propensity-based debiasing of CVR predictions. Theoretical analysis on bias and generalization bounds demonstrates the superiority of calibrated propensity estimates over uncalibrated ones. Experiments conducted on the Coat, Yahoo and KuaiRand datasets show improved uncertainty calibration, as evidenced by lower ECE values, leading to enhanced CVR prediction outcomes.
Problem

Research questions and friction points this paper is trying to address.

Mitigating selection bias in post-click conversion rate prediction
Improving reliability of inverse propensity score estimation
Proposing uncertainty calibration for better debiasing in recommendation
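The IPS weighting the paper builds on can be illustrated with a minimal sketch (illustrative only; the function name and the squared-error loss are assumptions, not the paper's implementation). Each observed instance's prediction error is reweighted by the inverse of its propensity, i.e. the estimated probability that the instance was observed at all:

```python
import numpy as np

def ips_weighted_error(pred, label, observed, propensity):
    """Unbiased CVR error estimate via inverse propensity scoring (sketch).

    pred, label : predicted and true conversion per user-item pair
    observed    : 1.0 if the pair was observed (clicked), else 0.0
    propensity  : estimated probability of the pair being observed
    """
    err = (pred - label) ** 2
    # Weight each observed instance by 1/propensity; divide by the
    # total number of user-item pairs, observed or not.
    return np.sum(observed * err / propensity) / pred.size
```

If the propensities equal the true observation probabilities, this estimator is unbiased for the full-population error; miscalibrated propensities distort the weights, which is the failure mode the paper targets.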
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Expected Calibration Error (ECE) as a quality measure for propensity scores
Model-agnostic calibration framework for debiasing
Improves CVR prediction with calibrated propensity estimates
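The binned ECE metric the paper adopts can be sketched as follows (a minimal illustration, not the authors' code; the binning scheme and function name are assumptions). It partitions predictions into confidence bins and averages the gap between predicted probability and observed frequency, weighted by bin size:

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """Binned ECE: weighted mean |observed frequency - mean confidence| per bin."""
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Half-open bins (lo, hi], with the first bin closed at 0.
        mask = (probs > lo) & (probs <= hi) if lo > 0 else (probs <= hi)
        if mask.any():
            gap = abs(labels[mask].mean() - probs[mask].mean())
            ece += mask.mean() * gap  # weight by fraction of samples in bin
    return ece
```

An ECE near zero means predicted probabilities match observed frequencies; a large ECE signals over- or under-confident propensity estimates, which the paper's calibration mechanism corrects before IPS weighting.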
Wenbo Hu
School of Computer and Information, Hefei University of Technology, Hefei, China
Xin Sun
Q. Liu
Member, IEEE
Shu Wu