🤖 AI Summary
Optimizing implicational bases of convex geometries—i.e., minimizing both premises and conclusions—is computationally hard in general. This work introduces a new tool, the *quasi-closed hypergraph*, to characterize the structure of optimum implicational bases, and gives a unified treatment of four important classes of convex geometries: double-shelling, acyclic, affine, and acceptant. Combining hypergraph arguments, existing minimization and reduction algorithms, and structural properties of convex geometries, the authors prove that when every quasi-closed hypergraph has pairwise disjoint edges, any implicational base can be optimized in polynomial time. This condition covers all four classes above, yielding a unified framework that explains the known tractability of the first three and extends it to acceptant convex geometries.
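The key structural condition is elementary to test: a hypergraph has pairwise disjoint edges exactly when no vertex occurs in two edges. A minimal sketch of that check (the function name and set-of-frozensets representation are illustrative choices, not from the paper):

```python
def has_pairwise_disjoint_edges(edges):
    """Return True iff no two hyperedges share a vertex.

    `edges` is an iterable of sets of vertices. We sweep once,
    accumulating vertices seen so far; a repeat means two edges overlap.
    """
    seen = set()
    for edge in edges:
        if not seen.isdisjoint(edge):
            return False
        seen |= edge
    return True
```

For example, `has_pairwise_disjoint_edges([{1, 2}, {3, 4}])` holds, while `[{1, 2}, {2, 3}]` fails because vertex 2 is shared. The single-pass sweep runs in time linear in the total size of the edges.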
📝 Abstract
Optimizing an implicational base of a closure system consists in turning this implicational base into an equivalent one with premises and conclusions as small as possible. This task is known to be hard in general but tractable for a number of classes of closure systems. In particular, several classes of convex geometries are known to have tractable optimization, while the problem was recently claimed to remain hard in general convex geometries. Continuing this line of research, we give a characterization of the optimum bases of a convex geometry in terms of what we call quasi-closed hypergraphs. We then use this characterization to show that when each quasi-closed hypergraph has disjoint edges, any implicational base of the convex geometry can be optimized in polynomial time with existing minimization and reduction algorithms. Finally, we prove that this property applies to double-shelling, acyclic, affine and acceptant convex geometries, thus unifying the existing results regarding the tractability of optimization for the first three classes.
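For readers unfamiliar with the setting: an implicational base is a set of rules $A \to B$, and the closure system it represents is recovered by closing sets under those rules. A generic forward-chaining sketch of this closure operator (a textbook routine, not the paper's optimization algorithm; names are illustrative):

```python
def closure(attrs, base):
    """Close `attrs` under an implicational base by naive forward chaining.

    `base` is a list of (premise, conclusion) pairs of sets. We repeatedly
    fire any rule whose premise is contained in the current set until no
    rule adds anything new.
    """
    closed = set(attrs)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in base:
            if premise <= closed and not conclusion <= closed:
                closed |= conclusion
                changed = True
    return closed
```

Two equivalent bases induce the same closure operator; optimization asks for an equivalent base whose premises and conclusions are as small as possible, and the paper shows this search is polynomial whenever the quasi-closed hypergraphs have disjoint edges.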