🤖 AI Summary
The DNA foundation model community lacks systematic, benchmark-driven comparisons between CNNs and Transformer/SSM-based architectures.
Method: We propose ConvNova, a lightweight and efficient CNN architecture that combines three design elements—dilated convolutions, gated convolutions, and a dual-branch gating mechanism—while using neither self-attention nor state-space modeling.
Contribution/Results: ConvNova outperforms state-of-the-art methods on more than half of the evaluated benchmarks: on histone-related tasks it exceeds the second-best method by 5.8% on average, while reducing parameter count by 37% and accelerating inference by 2.1×. Its predictions also surface findings that may relate to underlying biological characteristics. These results demonstrate that carefully engineered CNNs remain highly competitive for DNA sequence modeling, offering an efficient and interpretable alternative for DNA foundation models.
📝 Abstract
In recent years, a variety of methods based on Transformer and state space model (SSM) architectures have been proposed, advancing DNA foundation models. However, these recent approaches have not been systematically compared against classical convolutional neural networks (CNNs) on foundation model benchmarks. This raises the question: are CNNs truly surpassed by Transformer- and SSM-based architectures? In this paper, we develop a simple but well-designed CNN-based method termed ConvNova. ConvNova incorporates three effective designs: 1) dilated convolutions, 2) gated convolutions, and 3) a dual-branch framework for the gating mechanism. Through extensive empirical experiments, we demonstrate that ConvNova significantly outperforms recent methods on more than half of the tasks across several foundation model benchmarks. For example, on histone-related tasks, ConvNova exceeds the second-best method by an average of 5.8%, while generally using fewer parameters and computing faster. In addition, our experiments reveal findings that may relate to biological characteristics. This indicates that CNNs remain strong competitors to Transformers and SSMs. We anticipate that this work will spark renewed interest in CNN-based methods for DNA foundation models.
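To make the three named designs concrete, here is a minimal NumPy sketch of a gated dilated convolution block with dual-branch gating: one convolution branch produces values while a parallel branch produces sigmoid gates that modulate them elementwise. The kernel sizes, padding scheme, and gating placement are illustrative assumptions for exposition, not ConvNova's actual implementation.

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """1-D dilated cross-correlation with 'same' zero-padding.

    Kernel taps are spaced `dilation` apart, so a size-k kernel covers a
    receptive field of (k - 1) * dilation + 1 input positions.
    """
    k = len(w)
    pad = (k - 1) * dilation // 2
    xp = np.pad(x, (pad, pad))
    return np.array([
        sum(w[j] * xp[t + j * dilation] for j in range(k))
        for t in range(len(x))
    ])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_dilated_block(x, w_value, w_gate, dilation=2):
    """Dual-branch gating: the value branch and the gate branch each run
    their own dilated convolution over the same input; the sigmoid of the
    gate branch multiplicatively gates the value branch."""
    value = dilated_conv1d(x, w_value, dilation)
    gate = sigmoid(dilated_conv1d(x, w_gate, dilation))
    return value * gate
```

With an identity value kernel and an all-zero gate kernel, the gate branch outputs sigmoid(0) = 0.5 everywhere, so the block simply halves the input; nonzero gate weights let the network learn position-dependent suppression or passage of features, which is the point of the gating mechanism.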