🤖 AI Summary
This paper investigates how recommendation algorithms under noisy preference signals affect market concentration and welfare inequality. We consider a setting where users are partitioned into statistically majority and minority groups, and their group types are only identifiable via noisy signals. Building upon the Bayesian persuasion framework, we formulate an information design model and introduce the novel concept of “symmetric statistical experiments” to characterize informational constraints imposed by noise structure. Theoretically, we prove that—under symmetric noise—the optimal recommendation mechanism necessarily exacerbates market concentration and systematically reduces both utility and recommendation share for minority users, thereby widening inter-group welfare disparities. This result uncovers a previously unrecognized algorithmic bias propagation mechanism: even non-malicious algorithmic designs can endogenously reinforce inequality through the inherent structure of noisy signals. Our analysis provides a new theoretical benchmark for algorithmic fairness and yields actionable policy implications for market regulation.
📝 Abstract
Algorithmic recommendation based on noisy preference measurement is prevalent in recommendation systems. This paper discusses the consequences of such recommendation on market concentration and inequality. Binary types denoting a statistical majority and minority are noisily revealed through a statistical experiment. The achievable utilities and recommendation shares for the two groups can be analyzed as a Bayesian Persuasion problem. While under arbitrary noise structures, effects on concentration compared to a full-information market are ambiguous, under symmetric noise, concentration increases and consumer welfare becomes more unequal. We define symmetric statistical experiments and analyze persuasion under a restriction to such experiments, which may be of independent interest.