Local Normalization Distortion and the Thermodynamic Formalism of Decoding Strategies for Large Language Models

📅 2025-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper identifies the probability distortion induced by local normalization during large language model decoding as the fundamental reason why top-k sampling underperforms nucleus (top-p) sampling. The authors introduce the concept of “local normalization distortion” and develop a unified formal framework for decoding strategies grounded in thermodynamic formalism and ergodic theory, expressing top-k, nucleus, and temperature sampling as equilibrium states and stating the functions they optimize. Within this framework they quantify the size of the distortion and its effect on mathematical proxies for the quality and diversity of generated text. The work supports a shift from heuristic parameter tuning toward principle-driven decoding algorithm design, yielding theoretically grounded, interpretable, and optimizable decoders, with implications for both controllable text generation and the detection of machine-generated text.

📝 Abstract
Advances in hardware and language model architecture have spurred a revolution in natural language generation. However, autoregressive models compute probability distributions over next-token choices, and sampling from these distributions, known as decoding, has received significantly less attention than other design choices. Existing decoding strategies are largely based on heuristics, resulting in methods that are hard to apply or improve in a principled manner. We develop the theory of decoding strategies for language models by expressing popular decoding algorithms as equilibrium states in the language of ergodic theory and stating the functions they optimize. Using this, we analyze the effect of the local normalization step of top-k, nucleus, and temperature sampling, used to make probabilities sum to one. We argue that local normalization distortion is a fundamental defect of decoding strategies and quantify the size of this distortion and its effect on mathematical proxies for the quality and diversity of generated text. Contrary to the prevailing explanation, we argue that the major cause of the under-performance of top-k sampling relative to nucleus sampling is local normalization distortion. This yields conclusions for the future design of decoding algorithms and the detection of machine-generated text.
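The local normalization step the abstract analyzes is easy to see concretely: truncation-based samplers such as top-k and nucleus sampling discard tail tokens and rescale the survivors to sum to one, and the rescaling factor (one over the kept probability mass) varies from context to context. A minimal sketch, using toy 5-token distributions that are illustrative assumptions rather than examples from the paper:

```python
import numpy as np

def top_k_renormalize(probs, k):
    """Keep the k most probable tokens, zero the rest, and renormalize to sum to 1.

    Returns the renormalized distribution and the kept probability mass.
    """
    idx = np.argsort(probs)[::-1][:k]
    kept = np.zeros_like(probs)
    kept[idx] = probs[idx]
    return kept / kept.sum(), kept.sum()

def nucleus_renormalize(probs, p):
    """Keep the smallest set of top tokens with cumulative mass >= p, then renormalize."""
    order = np.argsort(probs)[::-1]
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1  # number of tokens to keep
    kept = np.zeros_like(probs)
    kept[order[:cutoff]] = probs[order[:cutoff]]
    return kept / kept.sum(), kept.sum()

# Two hypothetical next-token distributions over a 5-token vocabulary.
flat = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
peaked = np.array([0.80, 0.10, 0.05, 0.03, 0.02])

for name, q in [("flat", flat), ("peaked", peaked)]:
    _, mass = top_k_renormalize(q, k=2)
    # The surviving probabilities are inflated by 1/mass, a factor that
    # depends on the context's distribution -- the source of local
    # normalization distortion when factors are compared across contexts.
    print(f"{name}: kept mass {mass:.2f}, survivors inflated by x{1/mass:.2f}")
```

With k fixed, the flat context inflates surviving probabilities far more than the peaked one, which is the context-dependence the paper's analysis targets.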
Problem

Research questions and friction points this paper is trying to address.

Analyzing local normalization distortion in decoding strategies
Comparing performance of top-k and nucleus sampling methods
Developing theory for decoding strategies using ergodic theory
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decoding strategies expressed as equilibrium states
Analyzing local normalization distortion effects
Quantifying distortion impact on text quality