An Information-Theoretic Framework for Optimizing Experimental Design to Distinguish Probabilistic Neural Codes

📅 2026-03-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the fundamental question of whether sensory neural populations encode likelihood functions or posterior distributions to represent perceptual uncertainty. To disentangle these competing hypotheses, the authors propose an information-theoretic framework built around a novel metric, the “information gap”: the expected performance difference between likelihood and posterior decoders, evaluated as the Kullback–Leibler (KL) divergence between the true posterior and a task-marginalized surrogate posterior. The metric quantifies how discriminable the two hypotheses are under a given task design and is used to optimize the stimulus distribution for maximal experimental identifiability. The approach integrates Bayesian modeling, KL-divergence analysis, and neural decoder simulations. Results from synthetic experiments demonstrate that the information gap accurately predicts differences in decoding performance, and that the optimized stimulus distribution substantially enhances the ability to distinguish likelihood from posterior coding in simulated neural data.
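
To make the metric concrete, below is a minimal Python sketch of one plausible reading of the information gap: the expected KL divergence between a true (prior-informed) posterior and a surrogate posterior built from a task-marginalized prior. The 1-D stimulus grid, Gaussian encoding noise, uniform surrogate prior, and all names (`kl_divergence`, `information_gap`, etc.) are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch: Monte Carlo estimate of an information-gap-style quantity on a
# discretized stimulus space. The generative model here is an assumption.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D_KL(p || q) between two unnormalized discrete distributions."""
    p = p / p.sum()
    q = q / q.sum()
    return np.sum(p * np.log((p + eps) / (q + eps)))

def likelihood(r, stim_grid, noise_sd=0.5):
    """Gaussian likelihood p(r | s), evaluated over the whole stimulus grid."""
    return np.exp(-0.5 * ((r - stim_grid) / noise_sd) ** 2)

def information_gap(prior, stim_grid, n_samples=5000, noise_sd=0.5, rng=None):
    """Estimate E_s E_r[ D_KL(true posterior || surrogate posterior) ].

    The surrogate posterior uses a task-marginalized stand-in prior (uniform here),
    so the gap measures how far a decoder that ignores the task prior diverges
    from the full Bayesian posterior under this stimulus distribution.
    """
    rng = rng or np.random.default_rng(0)
    surrogate_prior = np.full_like(prior, 1.0 / len(prior))
    gap = 0.0
    for _ in range(n_samples):
        s = rng.choice(stim_grid, p=prior)        # stimulus drawn from the task prior
        r = s + noise_sd * rng.standard_normal()  # noisy "neural response" to s
        lik = likelihood(r, stim_grid, noise_sd)
        posterior = lik * prior                   # true posterior (up to normalization)
        surrogate = lik * surrogate_prior         # prior-agnostic surrogate posterior
        gap += kl_divergence(posterior, surrogate)
    return gap / n_samples

stim_grid = np.linspace(-3, 3, 121)
prior = np.exp(-0.5 * (stim_grid / 1.0) ** 2)  # a narrow task prior
prior /= prior.sum()
print(f"estimated information gap: {information_gap(prior, stim_grid):.4f} nats")
```

Under this reading, a sharper task prior drives the gap up (the prior-informed and prior-agnostic decoders disagree more), while a uniform task prior drives it to zero, which is what makes the quantity a natural objective for experimental design.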

📝 Abstract
The Bayesian brain hypothesis has been a leading theory in understanding perceptual decision-making under uncertainty. While extensive psychophysical evidence supports the notion of the brain performing Bayesian computations, how uncertainty information is encoded in sensory neural populations remains elusive. Specifically, two competing hypotheses propose that early sensory populations encode either the likelihood function (exemplified by probabilistic population codes) or the posterior distribution (exemplified by neural sampling codes) over the stimulus, with the key distinction lying in whether stimulus priors would modulate the neural responses. However, experimentally differentiating these two hypotheses has remained challenging, as it is unclear what task design would effectively distinguish the two. In this work, we present an information-theoretic framework for optimizing the task stimulus distribution that would maximally differentiate competing probabilistic neural codes. To quantify how distinguishable the two probabilistic coding hypotheses are under a given task design, we derive the information gap--the expected performance difference when likelihood versus posterior decoders are applied to neural populations--by evaluating the Kullback-Leibler divergence between the true posterior and a task-marginalized surrogate posterior. Through extensive simulations, we demonstrate that the information gap accurately predicts decoder performance differences across diverse task settings. Critically, maximizing the information gap yields stimulus distributions that optimally differentiate likelihood and posterior coding hypotheses. Our framework enables principled, theory-driven experimental designs with maximal discriminative power to differentiate probabilistic neural codes, advancing our understanding of how neural populations represent and process sensory uncertainty.
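
One plausible formalization of the information gap as described in the abstract (the notation is ours; the paper may define details differently): writing $\pi(s)$ for the task stimulus distribution, $p(r \mid s)$ for the neural encoding model, and $\bar{\pi}(s)$ for the task-marginalized prior behind the surrogate posterior,

$$\mathrm{IG}(\pi) = \mathbb{E}_{s \sim \pi}\, \mathbb{E}_{r \sim p(r \mid s)}\!\left[ D_{\mathrm{KL}}\!\left( p(s \mid r) \,\middle\|\, \tilde{p}(s \mid r) \right) \right], \qquad p(s \mid r) \propto p(r \mid s)\,\pi(s), \quad \tilde{p}(s \mid r) \propto p(r \mid s)\,\bar{\pi}(s).$$

The optimal task design is then $\pi^\star = \arg\max_\pi \mathrm{IG}(\pi)$: the stimulus distribution under which likelihood and posterior decoders are expected to differ most, and hence the one with the greatest power to discriminate the two coding hypotheses.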
Problem

Research questions and friction points this paper is trying to address.

probabilistic neural codes
experimental design
likelihood vs posterior
Bayesian brain
sensory uncertainty
Innovation

Methods, ideas, or system contributions that make the work stand out.

information-theoretic framework
probabilistic neural codes
information gap
experimental design optimization
Bayesian brain
Po-Chen Kuo
University of Washington
Edgar Y. Walker
Department of Neurobiology and Biophysics, University of Washington, Seattle, WA 98195, USA