🤖 AI Summary
This study investigates failure mechanisms in masked diffusion models that arise from generation order and parallel decoding. To disentangle order sensitivity from parallelization bias, it proposes a unified information-theoretic framework showing that factorized parallel decoding induces uncontrolled Reverse KL divergence, and explains why Easy-First strategies become more advantageous as model error grows. Using information-theoretic tools such as total correlation, together with controlled experiments in a Block-HMM environment and large-scale evaluations of the LLaDA model on arithmetic reasoning tasks, the analysis validates the intrinsic limitations of parallel decoding both theoretically and empirically, delineates the effective boundaries of corrective strategies, and quantifies the exponential computational cost these mechanisms incur.
📝 Abstract
Masked Diffusion Models (MDMs) significantly accelerate inference by trading off sequential determinism. However, the theoretical mechanisms governing generation order and the risks inherent in parallelization remain under-explored. In this work, we provide a unified information-theoretic framework to decouple and analyze two fundamental sources of failure: order sensitivity and parallelization bias. Our analysis yields three key insights: (1) the benefits of Easy-First decoding (prioritizing low-entropy tokens) are magnified as model error increases; (2) factorized parallel decoding introduces intrinsic sampling errors that can lead to arbitrarily large Reverse KL divergence, capturing "incoherence" failures that standard Forward KL metrics overlook; and (3) while verification can eliminate sampling error, it incurs an exponential cost governed by the total correlation within a block. In contrast, heuristics like remasking, though computationally efficient, cannot guarantee distributional correctness. Experiments on a controlled Block-HMM and large-scale MDMs (LLaDA) for arithmetic reasoning validate our theoretical framework.
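Insight (1) concerns Easy-First decoding: among the still-masked positions, unmask the one whose predictive distribution has the lowest entropy first. The following is a minimal illustrative sketch, not the paper's implementation; `model_probs` stands in for an MDM's per-position posteriors, and for simplicity the probabilities are held fixed rather than recomputed after each commit, as a real decoder would do.

```python
import math

def entropy(probs):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def easy_first_order(model_probs):
    """Return decoding order: lowest-entropy (easiest) positions first.

    model_probs: {position: [token probabilities]} for masked slots.
    Simplification: in a real MDM the posteriors would be re-predicted
    after each token is committed; here they stay fixed.
    """
    order = []
    remaining = dict(model_probs)
    while remaining:
        pos = min(remaining, key=lambda i: entropy(remaining[i]))
        order.append(pos)
        del remaining[pos]
    return order

# Position 2 is near-deterministic, position 0 is maximally uncertain.
print(easy_first_order({0: [0.5, 0.5], 1: [0.9, 0.1], 2: [0.99, 0.01]}))
# → [2, 1, 0]
```

Prioritizing confident positions limits how much an early error can propagate, which is why the abstract notes the advantage grows with model error.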
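Insight (3) invokes the total correlation of a block, TC = Σᵢ H(Xᵢ) − H(X₁,…,Xₙ), the amount of inter-token dependence a factorized proposal ignores. The sketch below computes it on the same kind of toy joint; the reading of e^TC as an expected number of rejection-sampling proposals per accepted block is an illustrative interpretation of the paper's exponential-cost claim, exact for this example but not stated in the abstract.

```python
import math

# Toy joint over two binary tokens (zero-probability pairs omitted).
p = {(0, 0): 0.5, (1, 1): 0.5}

def entropy(dist):
    """Shannon entropy (nats) of a distribution given as {outcome: prob}."""
    return -sum(pr * math.log(pr) for pr in dist.values() if pr > 0)

def marginal(pos):
    m = {}
    for x, prob in p.items():
        m[x[pos]] = m.get(x[pos], 0.0) + prob
    return m

# Total correlation: sum of marginal entropies minus the joint entropy.
tc = entropy(marginal(0)) + entropy(marginal(1)) - entropy(p)
print(tc)            # log 2 ≈ 0.693 nats of dependence within the block
print(math.exp(tc))  # = 2: here, expected factorized proposals per accept
```

Verifying a block against the true joint (e.g. by rejection from the factorized proposal) therefore gets expensive precisely when the tokens in the block are strongly correlated, while cheap heuristics like remasking sidestep this cost but forfeit distributional guarantees.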