🤖 AI Summary
Existing neural algorithmic reasoning models for combinatorial optimization suffer from poor out-of-distribution (OOD) generalization due to softmax attention, which fails to capture convex polyhedral structures under the max-plus semiring.
Method: We propose Tropical Attention—the first attention variant grounded in tropical geometry and the max-plus semiring—that provably approximates dynamic-programming–based tropical circuits. Building upon this, we design the Tropical Transformer, neuro-compiling dynamic programming algorithms into max-plus algebraic structures and introducing adversarial robustness as a novel axis for generalization evaluation.
Results: Experiments demonstrate that our model significantly outperforms softmax-based baselines on length and numerical generalization tasks, while exhibiting strong OOD robustness, adversarial stability, and scale-invariant, sharp combinatorial reasoning capabilities.
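To make the DP/max-plus connection concrete, here is a minimal illustrative sketch (not code from the paper): a Bellman-style longest-path recursion is exactly a matrix-vector product in the max-plus semiring, where tropical "addition" is `max` and tropical "multiplication" is `+`. The function names and graph are hypothetical.

```python
import numpy as np

NEG_INF = -np.inf  # tropical additive identity


def maxplus_matvec(W, d):
    """One DP relaxation step as a max-plus matrix-vector product:
    d_new[j] = max_i (d[i] + W[i, j])."""
    return np.max(d[:, None] + W, axis=0)


def longest_paths(W, src):
    """Longest-path values in a DAG via repeated max-plus relaxation."""
    n = W.shape[0]
    d = np.full(n, NEG_INF)
    d[src] = 0.0
    for _ in range(n - 1):  # a simple path has at most n-1 edges
        d = np.maximum(d, maxplus_matvec(W, d))
    return d


# Toy DAG: 0 -> 1 (weight 2), 1 -> 2 (weight 3), 0 -> 2 (weight 4)
W = np.array([
    [NEG_INF, 2.0, 4.0],
    [NEG_INF, NEG_INF, 3.0],
    [NEG_INF, NEG_INF, NEG_INF],
])
print(longest_paths(W, src=0))  # [0. 2. 5.]
```

The value function computed this way is a pointwise maximum of affine functions of the edge weights, i.e. a tropical polynomial, which is the convex piecewise-linear structure the summary refers to.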
📝 Abstract
Dynamic programming (DP) algorithms for combinatorial optimization combine maximization (or minimization) with classical addition in their recursions. The associated value functions correspond to convex polyhedra over the max-plus semiring. Existing Neural Algorithmic Reasoning models, however, rely on softmax-normalized dot-product attention, whose smooth exponential weighting blurs these sharp polyhedral structures and collapses when evaluated in out-of-distribution (OOD) settings. We introduce Tropical Attention, a novel attention function that operates natively in the max-plus semiring of tropical geometry. We prove that Tropical Attention can approximate the tropical circuits of DP-type combinatorial algorithms. We then show empirically that Tropical Transformers improve OOD performance, in both length generalization and value generalization, on algorithmic reasoning tasks, surpassing softmax baselines while remaining stable under adversarial attacks. We also present adversarial-attack generalization as a third axis for Neural Algorithmic Reasoning benchmarking. Our results demonstrate that Tropical Attention restores the sharp, scale-invariant reasoning absent from softmax attention.
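The core substitution can be sketched in a few lines. In this hedged illustration (my own simplification, not the paper's exact architecture), the softmax-weighted average is replaced by tropical operations: the dot product `⟨q, k⟩` becomes `max_d (q_d + k_d)`, and the weighted sum over values becomes a hard max over `score + value`.

```python
import numpy as np


def tropical_attention(Q, K, V):
    """Illustrative attention in the max-plus semiring.

    Tropical "multiplication" is +, tropical "addition" is max, so the
    inner product becomes max_d (q_d + k_d) and the softmax-weighted
    average over values becomes a hard max over (score + value).
    """
    # Tropical inner products: scores[i, j] = max_d (Q[i, d] + K[j, d])
    scores = np.max(Q[:, None, :] + K[None, :, :], axis=-1)
    # Tropical aggregation: out[i, d] = max_j (scores[i, j] + V[j, d])
    return np.max(scores[:, :, None] + V[None, :, :], axis=1)


Q = np.array([[0.0, 1.0]])
K = np.array([[1.0, 0.0], [2.0, 2.0]])
V = np.array([[1.0, 0.0], [0.0, 1.0]])
print(tropical_attention(Q, K, V))  # [[3. 4.]]
```

Note the scale behavior: adding a constant c to every entry of Q shifts the output by exactly c, and the piecewise-linear max never smooths the selection, in contrast to softmax, whose exponential weights change shape under translation and temperature.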