🤖 AI Summary
This paper investigates partition dependence in belief reporting: the systematic variation in an individual's probability judgments about the same event depending on how the state space is partitioned. To address this, the paper proposes an Entropy Regularized Belief Reporting model, which formalizes the trade-off between fidelity to an agent's internal prior and a preference to remain noncommittal by augmenting a KL-divergence minimization objective with an entropy term. The model is axiomatically characterized and applied to experimental data from Benjamin et al. (2017), where it accounts for key partition dependence biases. By introducing entropy regularization to belief reporting, the paper offers a theoretical framework for non-commitment preferences and a testable, quantitative tool for modeling belief formation under ambiguity.
📝 Abstract
This paper investigates a model of partition dependence, a widely reported experimental finding in which an agent's reported beliefs depend on how the states are grouped. In the model, called Entropy Regularized Belief Reporting (ERBR), the agent is endowed with a latent benchmark prior that is unobserved by the analyst. When presented with a partition, the agent reports a prior that minimizes Kullback-Leibler divergence from the latent benchmark prior subject to entropy regularization. This captures the intuition that while the agent would like to report a prior that is close to her latent benchmark prior, she may also have a preference to remain noncommittal. I axiomatically characterize the model and apply it to the experimental data from Benjamin et al. (2017).
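To make the mechanism concrete, here is a minimal numerical sketch of how such a trade-off can generate partition dependence. It assumes the report solves min_q KL(q‖P) − λH(q) over the partition cells, with P the benchmark prior coarsened to the partition, which gives the closed form q_i ∝ P_i^{1/(1+λ)}. The objective form, this closed form, the function name `reported_prior`, the parameter `lam`, and the toy numbers are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def reported_prior(latent_prior, partition, lam=0.5):
    """Illustrative ERBR-style report for a given partition.

    Coarsens the latent benchmark prior to the partition cells, then tempers
    it toward the uniform distribution. Assumes the objective
        min_q  KL(q || P) - lam * H(q),
    whose minimizer is q_i proportional to P_i ** (1 / (1 + lam)); this closed
    form is an assumption for illustration only.
    """
    # Coarsen: P_i is the latent-prior mass of cell i of the partition.
    P = np.array([sum(latent_prior[s] for s in cell) for cell in partition])
    q = P ** (1.0 / (1.0 + lam))   # entropy term pulls the report toward uniform
    return q / q.sum()             # normalize to a probability vector over cells

# Hypothetical latent benchmark prior over four fine states.
p = {"a": 0.5, "b": 0.2, "c": 0.2, "d": 0.1}

# The same event {a} elicited under two different partitions of its complement.
coarse = [("a",), ("b", "c", "d")]        # two cells
fine = [("a",), ("b",), ("c",), ("d",)]   # four cells

print(reported_prior(p, coarse)[0])  # reported P(a) under the coarse partition: 0.50
print(reported_prior(p, fine)[0])    # reported P(a) under the fine partition: ~0.41
# The reports differ even though the event is the same: partition dependence
# induced by the preference to remain noncommittal (the entropy term).
```

With `lam = 0`, the sketch returns the coarsened benchmark exactly; as `lam` grows, the report moves toward the uniform distribution over the cells, so the probability assigned to a fixed event depends on how finely the rest of the state space is unpacked.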