🤖 AI Summary
This work addresses the challenges neural networks face in handling extreme numerical values—such as overflow, underflow, and output instability—which often compromise structural consistency in algebraic operations. To mitigate this, the authors propose a fixed-length neural numerical embedding method that introduces, for the first time, a “neural isomorphic field”: a novel algebraic neural abstraction that preserves the additive, multiplicative, and order structures of the rational number field within the embedding space. Built upon the Transformer architecture, the model maps scalars to vectors and approximates algebraic operations in the resulting vector space. Experimental results show that addition satisfies key algebraic properties—including identity, closure, and associativity—with over 95% accuracy, while multiplication reaches only 53%–73%, indicating that the approach preserves additive structure well and marking multiplication as the main target for refinement.
📝 Abstract
Neural network models often struggle when processing very small or very large numbers due to issues such as overflow, underflow, and unstable outputs. To mitigate these problems, we propose using embedding vectors for numbers instead of their raw values. These embeddings aim to retain essential algebraic properties while preventing numerical instabilities. In this paper, we introduce, for the first time, a fixed-length number embedding that preserves algebraic operations, including addition, multiplication, and comparison, within the field of rational numbers. We propose a novel Neural Isomorphic Field, a neural abstraction of algebraic structures such as groups and fields. The elements of this neural field are embedding vectors that maintain algebraic structure during computations. Our experiments demonstrate that addition performs exceptionally well, achieving over 95% accuracy on key algebraic tests such as identity, closure, and associativity. In contrast, multiplication remains challenging, with accuracy ranging from 53% to 73% across the same algebraic properties. These findings highlight the model's strengths in preserving algebraic structure under addition while identifying avenues for improvement in handling multiplication.
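To make the evaluation concrete, here is a minimal sketch of the kind of property tests the paper describes (identity, closure/homomorphism, associativity), run in embedding space. The `embed` and `vadd` functions below are illustrative stand-ins, not the paper's method: the actual model uses a trained Transformer to produce embeddings and learned networks to approximate the operations, whereas this sketch uses a toy linear embedding for which vector addition preserves scalar addition exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8
W = rng.normal(size=DIM)  # fixed direction; stand-in for a trained embedder

def embed(x: float) -> np.ndarray:
    # Hypothetical linear embedding e(x) = x * W.
    # (The paper learns this map with a Transformer instead.)
    return x * W

def vadd(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    # Stand-in for the learned addition operator in embedding space.
    return u + v

def close(u: np.ndarray, v: np.ndarray, tol: float = 1e-9) -> bool:
    # Property tests pass when the two embeddings are (numerically) equal.
    return np.linalg.norm(u - v) < tol

a, b, c = 2.5, -1.25, 7.0

# Identity: e(0) acts as the additive identity.
assert close(vadd(embed(0.0), embed(a)), embed(a))

# Closure / homomorphism: vadd(e(a), e(b)) lands on e(a + b).
assert close(vadd(embed(a), embed(b)), embed(a + b))

# Associativity of the embedded operation.
assert close(vadd(vadd(embed(a), embed(b)), embed(c)),
             vadd(embed(a), vadd(embed(b), embed(c))))
```

In the paper's setting these checks are statistical (hence accuracy percentages rather than exact passes), since the learned operators only approximate the algebraic laws; multiplication's lower 53%–73% scores correspond to these same tests failing more often for the multiplicative operator.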