🤖 AI Summary
Modeling long, cross-channel user behavior sequences and supporting multi-product collaborative recommendation in financial services pose challenges including temporally heterogeneous interactions, strong inter-product dependencies, and trade-offs among multiple business objectives. Method: We propose the first unified Transformer-based sequential recommendation framework tailored to financial scenarios. It jointly models implicit and explicit behavioral signals, enables cross-product feature sharing and multi-task joint training, and incorporates a lightweight fine-tuning mechanism to dynamically balance competing business goals. Contribution/Results: Compared with production-grade tree-based baselines, our framework achieves statistically significant improvements in click-through rate (CTR) and conversion rate (CVR) across diverse financial products, in both offline evaluations and online A/B tests. These results validate the effectiveness and scalability of Transformer architectures in noisy, low-frequency, sparse financial recommendation settings, advancing industry practice from traditional tree models toward sequence-aware deep modeling.
📝 Abstract
Transformer-based architectures are widely adopted in sequential recommendation systems, yet their application in Financial Services (FS) presents distinct practical and modeling challenges for real-time recommendation. These include: a) long-range user interactions (implicit and explicit) spanning both digital and physical channels, generating temporally heterogeneous context; and b) multiple interrelated products that require coordinated models to support varied ad placements and personalized feeds while balancing competing business goals. We propose FinTRec, a transformer-based framework that addresses these challenges and their associated operational objectives in FS. While tree-based models have traditionally been preferred in FS for their explainability and alignment with regulatory requirements, our study demonstrates that FinTRec offers a viable and effective shift toward transformer-based architectures. Through historical simulation and live A/B test correlations, we show that FinTRec consistently outperforms the production-grade tree-based baseline. The unified architecture, when fine-tuned for product adaptation, enables cross-product signal sharing, reduces training cost and technical debt, and improves offline performance across all products. To our knowledge, this is the first comprehensive study of unified sequential recommendation modeling in FS that addresses both technical and business considerations.
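The abstract's core modeling idea, contextualizing each event in a user's behavior sequence against all other events via self-attention, can be sketched in a few lines. This is an illustrative toy, not the FinTRec implementation (which the paper does not detail here): a single attention head in plain Python, with queries, keys, and values taken directly from the event embeddings rather than from learned projections.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(seq):
    """Single-head scaled dot-product self-attention over a user
    behavior sequence, where each event is a d-dimensional embedding.
    A real sequential recommender would add learned Q/K/V projections,
    positional or temporal encodings, and stacked layers."""
    d = len(seq[0])
    out = []
    for q in seq:
        # Similarity of this event to every event in the sequence.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in seq]
        weights = softmax(scores)
        # Output is a convex combination of all event embeddings.
        out.append([sum(w * v[j] for w, v in zip(weights, seq))
                    for j in range(d)])
    return out

# Toy sequence of three interaction events (e.g. page view, click,
# branch visit), each as a hypothetical 2-d embedding.
events = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
contextualized = self_attention(events)
```

Each output row is a temporally contextualized representation of one event; in a recommendation head, such representations would feed task-specific layers (e.g. CTR and CVR predictors) for multi-task training.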