🤖 AI Summary
This paper studies communication over arbitrarily varying channels (AVCs) when the usual power constraints on the transmitter and the adversarial jammer are replaced by a sliding-window constraint: the power bound must hold almost surely over every contiguous length-$w$ subsequence of the codeword and the jamming state. The setting is an oblivious jammer with stochastic encoding at the transmitter, evaluated under maximum probability of error. The key result is that this extra restriction on the jammer benefits the transmitter: in some cases, the unique-decoding capacity under the sliding-window constraint equals the list-decoding capacity of the standard model without window constraints, which can be strictly larger than the standard unique-decoding capacity. In this sense, adding window constraints reduces list decoding to unique decoding.
📝 Abstract
In an arbitrarily varying channel (AVC), the channel has a state which is under the control of an adversarial jammer and the corresponding capacities are often functions of the "power" constraints on the transmitter and jammer. In this paper we propose a model in which the constraints must hold almost surely over contiguous subsequences of the codeword and state, which we call a sliding window constraint. We study oblivious jammers and codes with stochastic encoding under maximum probability of error. We show that this extra limitation on the jammer is beneficial for the transmitter: in some cases, the capacity for unique decoding with a sliding window constraint is equal to the capacity for list decoding in the standard model without sliding windows, roughly implying that the addition of window constraints reduces list decoding to unique decoding. The list decoding capacity in the standard model can be strictly larger than the unique decoding capacity.
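To make the sliding-window constraint concrete, here is a minimal sketch of checking whether a real-valued sequence (a codeword or a jamming state) satisfies an average-power bound over every contiguous window. The window length `w` and power bound `P` are assumed illustrative parameters, not values from the paper, and the specific averaging convention is our assumption for the sketch.

```python
def satisfies_window_constraint(x, w, P):
    """Return True iff every contiguous length-w window of x has
    average power at most P, i.e. (1/w) * sum of x[i]**2 over the
    window is <= P for all windows.

    Illustrative sketch only: w and P are assumed parameters, and
    sequences shorter than w are checked over their full length.
    """
    if len(x) < w:
        return sum(v * v for v in x) <= len(x) * P
    # Maintain a running sum of squares over a sliding window.
    window_energy = sum(v * v for v in x[:w])
    if window_energy > w * P:
        return False
    for i in range(w, len(x)):
        # Slide the window one step: add the new sample, drop the old.
        window_energy += x[i] * x[i] - x[i - w] * x[i - w]
        if window_energy > w * P:
            return False
    return True
```

Note that this per-window condition is strictly more demanding on the jammer than a single block-power constraint: a budget of `n * P` spent in one burst passes the block check but violates the window check, which is exactly the extra limitation the abstract argues works in the transmitter's favor.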