Explicit Entropic Constructions for Coverage, Facility Location, and Graph Cuts

๐Ÿ“… 2026-01-19
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work investigates whether the monotone submodular functions most commonly used in practice belong to the class of entropic polymatroids, that is, whether they can be represented exactly as the Shannon entropy of explicitly constructed random variables. By constructing suitable random variables and employing truncation techniques, the authors establish, for the first time, exact Shannon entropy representations for several classical submodular functions, including coverage, facility location, saturated coverage, concave-over-modular functions, and monotone graph cuts. This reveals a direct connection between widely adopted objectives in combinatorial optimization and classical information theory, showing that these functions can be rigorously embedded in an information-theoretic framework. As a consequence, quantities such as submodular mutual information and conditional gain reduce to their standard information-theoretic counterparts, unifying submodular optimization with classical information measures.
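To see what such a construction looks like for a coverage function, here is a minimal sketch (not code from the paper; the unit-weight instance `A` is assumed purely for illustration): attach an independent fair bit Z_u to each universe element u, let X_i reveal the bits of the elements covered by A_i, and check numerically that the joint entropy H(X_S) equals the coverage value f(S) on every subset S.

```python
import itertools
import math
from collections import Counter

# Hypothetical unit-weight coverage instance: three items covering
# elements of the universe {0, 1, 2, 3}.
A = {1: {0, 1}, 2: {1, 2}, 3: {3}}
UNIVERSE = sorted(set().union(*A.values()))

def coverage(S):
    """f(S) = |union of A_i for i in S| (unit weights)."""
    return len(set().union(*(A[i] for i in S)) if S else set())

def joint_entropy(S):
    """H(X_S) in bits, where X_i = (Z_u : u in A_i) and the Z_u are
    independent fair bits, one per universe element."""
    if not S:
        return 0.0
    counts = Counter()
    for bits in itertools.product([0, 1], repeat=len(UNIVERSE)):
        z = dict(zip(UNIVERSE, bits))
        outcome = tuple(tuple(z[u] for u in sorted(A[i])) for i in sorted(S))
        counts[outcome] += 1
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Check H(X_S) = f(S) on every subset of items.
for r in range(len(A) + 1):
    for S in itertools.combinations(A, r):
        assert math.isclose(joint_entropy(S), coverage(S)), S
print("H(X_S) = f(S) for every subset S of this instance")
```

For general nonnegative weights the same idea applies with Z_u drawn uniformly from an alphabet whose log cardinality is (approximately) w(u); the paper's truncation-based constructions for saturated coverage and concave-over-modular functions require additional machinery beyond this sketch.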

๐Ÿ“ Abstract
Shannon entropy is a polymatroidal set function and lies at the foundation of information theory, yet the class of entropic polymatroids is strictly smaller than the class of all submodular functions. In parallel, submodular and combinatorial information measures (SIMs) have recently been proposed as a principled framework for extending entropy, mutual information, and conditional mutual information to general submodular functions, and have been used extensively in data subset selection, active learning, domain adaptation, and representation learning. This raises a natural and fundamental question: are the monotone submodular functions most commonly used in practice entropic? In this paper, we answer this question in the affirmative for a broad class of widely used polymatroid functions. We provide explicit entropic constructions for set cover and coverage functions, facility location, saturated coverage, concave-over-modular functions via truncations, and monotone graph-cut-type objectives. Our results show that these functions can be realized exactly as Shannon entropies of appropriately constructed random variables. As a consequence, for these functions, submodular mutual information coincides with classical mutual information, conditional gain specializes to conditional entropy, and submodular conditional mutual information reduces to standard conditional mutual information in the entropic sense. These results establish a direct bridge between combinatorial information measures and classical information theory for many of the most common submodular objectives used in applications.
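To illustrate why submodular mutual information reduces to classical mutual information once such a representation is in hand, consider the unit-weight coverage case with the bit construction sketched above (a derivation supplied here for illustration, not quoted from the paper). Since the Z_u are independent fair bits,

$$
H(X_S) \;=\; H\!\big(Z_u : u \in \textstyle\bigcup_{i \in S} A_i\big)
\;=\; \sum_{u \in \bigcup_{i \in S} A_i} H(Z_u)
\;=\; \Big|\textstyle\bigcup_{i \in S} A_i\Big| \;=\; f(S),
$$

and therefore

$$
I_f(S;T) \;=\; f(S) + f(T) - f(S \cup T)
\;=\; H(X_S) + H(X_T) - H(X_{S \cup T})
\;=\; I(X_S ; X_T),
$$

so the submodular mutual information of f is literally the Shannon mutual information between the constructed random variables; analogous identities give conditional gain as conditional entropy and submodular conditional mutual information as classical conditional mutual information.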
Problem

Research questions and friction points this paper is trying to address.

entropic, submodular functions, Shannon entropy, combinatorial information measures, monotone submodular
Innovation

Methods, ideas, or system contributions that make the work stand out.

entropic construction, submodular functions, Shannon entropy, combinatorial information measures, mutual information
๐Ÿ”Ž Similar Papers
No similar papers found.