🤖 AI Summary
This work addresses the cross-prime prediction problem for Frobenius traces of elliptic curves: given the trace $a_q$ (or $a_q \bmod 2$) at a prime $q$, predict $a_p$ (or $a_p \bmod 2$) at another prime $p$. We propose the first purely data-driven, deep learning approach—combining Transformers with feed-forward networks—to learn patterns in Euler factors without explicit number-theoretic computation. To enhance generalization, we introduce modular-reduction-based feature engineering and cross-modular training. The model achieves high accuracy on both $a_p$ regression and $a_p \bmod 2$ binary classification. Experiments demonstrate that deep models can effectively capture latent structural information encoded in $L$-functions, and reveal nontrivial cross-prime predictability of parity-constrained traces. Interpretability analysis shows learned patterns align with classical number-theoretic intuitions—including the Hasse bound and local symmetry—offering a novel, interpretable, data-driven perspective for arithmetic geometry.
📝 Abstract
We apply transformer models and feedforward neural networks to predict Frobenius traces $a_p$ of elliptic curves given other traces $a_q$. We train further models to predict $a_p \bmod 2$ from $a_q \bmod 2$, and perform cross-modulus analyses such as predicting $a_p \bmod 2$ from $a_q$. Our experiments reveal that these models achieve high accuracy, even in the absence of explicit number-theoretic tools like functional equations of $L$-functions. We also present partial interpretability findings.
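To make the prediction target concrete: for a curve $E: y^2 = x^3 + ax + b$ over $\mathbb{F}_p$, the Frobenius trace is $a_p = p + 1 - \#E(\mathbb{F}_p)$, and $\#E(\mathbb{F}_p)$ can be computed from the quadratic character of $x^3 + ax + b$. The following is a minimal sketch of how training labels like $a_p$ and $a_p \bmod 2$ could be generated (the function names are illustrative, not from the paper, and this naive method is only practical for small $p$):

```python
def legendre(n, p):
    """Legendre symbol (n/p) for an odd prime p, via Euler's criterion."""
    n %= p
    if n == 0:
        return 0
    return 1 if pow(n, (p - 1) // 2, p) == 1 else -1

def frobenius_trace(a, b, p):
    """a_p = p + 1 - #E(F_p) for E: y^2 = x^3 + a x + b over F_p.

    Since #E(F_p) = p + 1 + sum_x legendre(x^3 + a x + b, p),
    the trace is a_p = -sum_x legendre(x^3 + a x + b, p).
    Naive O(p log p) point counting; fine for generating small-p data.
    """
    return -sum(legendre(x * x * x + a * x + b, p) for x in range(p))

# Example: E: y^2 = x^3 - x over F_5 has 8 points, so a_5 = -2,
# and the parity label is a_5 mod 2 = 0.
ap = frobenius_trace(-1, 0, 5)   # -> -2
parity = ap % 2                   # -> 0
```

Any value produced this way respects the Hasse bound $|a_p| \le 2\sqrt{p}$, which is one of the classical constraints the interpretability analysis checks the learned models against.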