Hyena Operator for Fast Sequential Recommendation

📅 2026-03-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes HyenaRec, a novel sequential recommendation model that addresses the high computational complexity of existing attention-based approaches, and the resulting efficiency–accuracy trade-off, when handling long user histories. By introducing convolutional kernels parameterized with Legendre orthogonal polynomials into recommender systems for the first time, HyenaRec combines a gating mechanism with Hyena operators to build a hybrid architecture with linear complexity. This design captures both short- and long-term user behavioral patterns, improving representational capacity and inference efficiency on sparse, long sequences and outperforming state-of-the-art baselines across multiple real-world datasets. Notably, HyenaRec achieves up to a 6× training speedup while maintaining high recommendation accuracy, particularly in long-sequence scenarios.

📝 Abstract
Sequential recommendation models, particularly those based on attention, achieve strong accuracy but incur quadratic complexity, making long user histories prohibitively expensive. Sub-quadratic operators such as Hyena provide efficient alternatives in language modeling, but their potential in recommendation remains underexplored. We argue that Hyena faces challenges in recommendation due to limited representation capacity on sparse, long user sequences. To address these challenges, we propose HyenaRec, a novel sequential recommender that integrates polynomial-based kernel parameterization with gated convolutions. Specifically, we design convolutional kernels using Legendre orthogonal polynomials, which provide a smooth and compact basis for modeling long-term temporal dependencies. A complementary gating mechanism captures fine-grained short-term behavioral bursts, yielding a hybrid architecture that balances global temporal evolution with localized user interests under sparse feedback. This construction enhances expressiveness while scaling linearly with sequence length. Extensive experiments on multiple real-world datasets demonstrate that HyenaRec consistently outperforms attention-based, recurrent, and other baselines in ranking accuracy. Moreover, it trains significantly faster (up to a 6× speedup), with particularly pronounced advantages in long-sequence scenarios, where efficiency is maintained without sacrificing accuracy. These results highlight polynomial-based kernel parameterization as a principled and scalable alternative to attention for sequential recommendation.
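The abstract's core mechanism, a gated long convolution whose kernel is a weighted sum of Legendre polynomials, can be sketched as follows. This is an illustrative reconstruction under assumptions, not the authors' implementation: the function names, the sigmoid gate, and the coefficient values are placeholders, and a single 1-D channel stands in for the full multi-channel model.

```python
import numpy as np

def legendre_kernel(coeffs, length):
    """Build a long-convolution kernel as a weighted sum of Legendre
    polynomials P_m evaluated on positions rescaled to [-1, 1]."""
    t = np.linspace(-1.0, 1.0, length)
    # legval computes sum_m coeffs[m] * P_m(t)
    return np.polynomial.legendre.legval(t, coeffs)

def fft_causal_conv(x, k):
    """Causal convolution via FFT: O(L log L), versus the O(L^2) cost
    of materializing pairwise attention over the sequence."""
    L = len(x)
    n = 2 * L  # zero-pad to avoid circular wrap-around
    y = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(k, n), n)
    return y[:L]

def gated_hyena_block(x, coeffs, gate_scale):
    """One gated long-convolution block: the Legendre-parameterized
    kernel models long-range temporal structure, while an element-wise
    sigmoid gate modulates short-term, position-local signal."""
    k = legendre_kernel(coeffs, len(x))
    gate = 1.0 / (1.0 + np.exp(-gate_scale * x))  # element-wise gate
    return gate * fft_causal_conv(x, k)

# toy usage on a random stand-in for an interaction-embedding channel
rng = np.random.default_rng(0)
x = rng.standard_normal(64)
y = gated_hyena_block(x, coeffs=np.array([1.0, 0.5, 0.25]), gate_scale=0.5)
print(y.shape)  # (64,)
```

Because the kernel is defined by a handful of polynomial coefficients rather than one weight per position, its parameter count is independent of sequence length, which is what makes the smooth, compact basis attractive for long histories.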
Problem

Research questions and friction points this paper is trying to address.

sequential recommendation, quadratic complexity, long user sequences, sparse feedback, representation capacity
Innovation

Methods, ideas, or system contributions that make the work stand out.

HyenaRec, polynomial-based kernel, gated convolution, sequential recommendation, sub-quadratic complexity
Jiahao Liu
Huazhong University of Science and Technology (HUST)
image processing, super-resolution microscopy
Lin Li
School of Mathematics and Statistics, Chongqing Technology and Business University
Nonlinear Analysis, Partial Differential Equations, Variational Methods, Critical Point Theory, PDEs
Zhiyuan Li
Wuhan University of Technology
Kaixi Hu
Wuhan Textile University
Kaize Shi
University of Southern Queensland
Jingling Yuan
Hubei Key Laboratory of Transportation Internet of Things, Wuhan University of Technology