Weighted Tensor Decompositions for Context-aware Collaborative Filtering

📅 2025-03-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the challenge of modeling user interests under dynamic contextual factors (e.g., time, weather, mood), this paper proposes a high-order tensor modeling framework that jointly encodes user–item interactions and heterogeneous contextual signals into third- or higher-order tensors. We design weighted tensor decomposition methods—extensions of CP, Tucker, and Tensor-Train decompositions—optimized under a weighted squared loss with structured regularization (low-rank, sparsity, and ℓ₂). Our key contributions are: (i) a systematic taxonomy of structure-regularized decomposition variants, filling a methodological gap; and (ii) a multi-dimensional evaluation framework balancing computational complexity, scalability, and modeling expressiveness. Experiments on multiple public datasets demonstrate significant improvements in recommendation accuracy and contextual adaptability. Moreover, the framework enables adaptive selection of the optimal decomposition strategy based on data scale, sparsity, and real-time requirements.
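The first step the summary describes — folding user–item interactions and a contextual signal into a third-order tensor — can be sketched as follows. This is a minimal illustration with invented names (`build_context_tensor`, index-encoded records), not the paper's actual code; it assumes users, items, and contexts are already mapped to integer indices.

```python
import numpy as np

def build_context_tensor(interactions, n_users, n_items, n_contexts):
    """Encode (user, item, context, rating) records as a 3rd-order tensor.

    Returns the rating tensor T and a 0/1 weight tensor W marking observed
    entries; unobserved cells keep weight 0, so a weighted loss ignores them.
    """
    T = np.zeros((n_users, n_items, n_contexts))
    W = np.zeros_like(T)
    for u, i, c, r in interactions:
        T[u, i, c] = r
        W[u, i, c] = 1.0
    return T, W
```

Higher-order contexts (e.g. time *and* weather) would add further modes in the same way, one axis per contextual attribute.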

📝 Abstract
Over recent years it has become well accepted that user interest is not static or immutable. A variety of contextual factors, such as the time of day, the weather, or the user's mood, influence the user's current interests. Modelling approaches need to take these factors into account if they are to find the most relevant content to recommend in a given situation. A popular method for context-aware recommendation is to encode context attributes as extra dimensions of the classic user–item interaction matrix, effectively turning it into a tensor, and then to apply appropriate tensor decomposition methods to infer the missing values. However, unlike matrix factorization, where all decompositions are essentially products of matrices, tensors admit many more decomposition options obtained by combining vector, matrix, and tensor products. We study the most successful decomposition methods that use a weighted square loss and categorize them based on their tensor structure and regularization strategy. Additionally, we extend the pool of methods by filling in the missing combinations. In this paper we provide an overview of the properties of the different decomposition methods, such as their complexity, scalability, and modelling capacity. These properties are then contrasted with the performance achieved in offline experiments to gain more insight into which method to choose in a specific situation and under specific constraints.
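The weighted-square-loss setup the abstract describes can be illustrated with a minimal CP (CANDECOMP/PARAFAC) decomposition fitted by gradient descent. This is a toy sketch under assumed names and hyperparameters — the paper covers weighted CP, Tucker, and Tensor-Train variants with structured regularization, none of which this snippet reproduces exactly; only observed entries (weight > 0) contribute to the loss, with an ℓ₂ penalty on the factors.

```python
import numpy as np

def weighted_cp(T, W, rank=8, lam=0.1, lr=0.002, epochs=1000, seed=0):
    """Weighted CP decomposition of a 3rd-order tensor T (user x item x context).

    Minimizes  sum_ijk W_ijk * (T_ijk - sum_r U_ir V_jr C_kr)^2
               + lam * (||U||^2 + ||V||^2 + ||C||^2)
    by plain gradient descent. W is a 0/1 (or confidence) weight tensor,
    so unobserved entries do not influence the fit.
    """
    rng = np.random.default_rng(seed)
    nu, ni, nc = T.shape
    U = 0.1 * rng.standard_normal((nu, rank))
    V = 0.1 * rng.standard_normal((ni, rank))
    C = 0.1 * rng.standard_normal((nc, rank))
    for _ in range(epochs):
        # CP reconstruction: T_hat[i,j,k] = sum_r U[i,r] * V[j,r] * C[k,r]
        T_hat = np.einsum('ir,jr,kr->ijk', U, V, C)
        E = W * (T_hat - T)  # weighted residual; zero where unobserved
        gU = 2 * np.einsum('ijk,jr,kr->ir', E, V, C) + 2 * lam * U
        gV = 2 * np.einsum('ijk,ir,kr->jr', E, U, C) + 2 * lam * V
        gC = 2 * np.einsum('ijk,ir,jr->kr', E, U, V) + 2 * lam * C
        U -= lr * gU
        V -= lr * gV
        C -= lr * gC
    return U, V, C
```

A recommendation in context `k` for user `i` then scores item `j` via the reconstructed entry `sum_r U[i,r] * V[j,r] * C[k,r]`. Tucker and Tensor-Train variants replace the rank-one sum with a core tensor or a chain of third-order cores, trading parameter count against modelling capacity, which is the design space the paper maps out.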
Problem

Research questions and friction points this paper is trying to address.

Modeling dynamic user interests influenced by contextual factors.
Exploring tensor decomposition methods for context-aware recommendations.
Evaluating decomposition methods' complexity, scalability, and performance.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Weighted tensor decompositions for context-aware recommendations.
Extends tensor decomposition methods with new combinations.
Analyzes complexity, scalability, and modeling capacity.
🔎 Similar Papers
No similar papers found.