🤖 AI Summary
This work addresses the challenge of accurately predicting the next app launch when user-specific behavioral history is sparse or absent (e.g., cold-start settings) and intent shifts rapidly within a session. It proposes a user-profile-free, multi-hop intent-aware session graph learning framework that constructs multi-hop session graphs to capture both local and global app transition dependencies, and integrates spatiotemporal context with recent interactions to model evolving intent. A hop-level attention mechanism, combined with lightweight graph propagation, captures high-order transition relationships without relying on historical user data, yielding both interpretability and computational efficiency. Experiments on two real-world datasets show that the method significantly outperforms existing approaches under both standard and cold-start evaluation settings while keeping computational overhead low.
📝 Abstract
Predicting the next mobile app a user will launch is essential for proactive mobile services. Yet accurate prediction remains challenging in real-world settings, where user intent can shift rapidly within short sessions and user-specific historical profiles are often sparse or unavailable, especially under cold-start conditions. Existing approaches mainly model app usage as sequential behavior or local session transitions, limiting their ability to capture higher-order structural dependencies and evolving session intent. To address this issue, we propose MISApp, a profile-free framework for next app prediction based on multi-hop session graph learning. MISApp constructs multi-hop session graphs to capture transition dependencies at different structural ranges, learns session representations through lightweight graph propagation, incorporates temporal and spatial context to characterize session conditions, and captures intent evolution from recent interactions. Experiments on two real-world app usage datasets show that MISApp consistently outperforms competitive baselines under both standard and cold-start settings, while maintaining a favorable balance between predictive accuracy and practical efficiency. Further analyses show that the learned hop-level attention weights align well with structural relevance, offering interpretable evidence for the effectiveness of the proposed multi-hop modeling strategy.
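The core ideas of the abstract, building a graph per hop distance from one session and fusing the per-hop propagated representations with hop-level attention, can be illustrated with a minimal numpy sketch. This is not the MISApp implementation: the hop-graph construction, the norm-based attention scores (a stand-in for learned scores), and all names below are illustrative assumptions.

```python
import numpy as np

def multi_hop_session_graphs(session, num_apps, max_hop=3):
    """Build hop-k adjacency matrices from one app-launch session.

    A session is a list of app indices; the 1-hop graph links
    consecutive launches, and the k-hop graph links apps that are
    k steps apart (hypothetical construction, for illustration).
    """
    hops = []
    for k in range(1, max_hop + 1):
        A = np.zeros((num_apps, num_apps))
        for i in range(len(session) - k):
            A[session[i], session[i + k]] = 1.0
        # Row-normalize so one propagation step averages neighbor embeddings.
        row = A.sum(axis=1, keepdims=True)
        hops.append(np.divide(A, row, out=np.zeros_like(A), where=row > 0))
    return hops

def hop_attention_readout(hops, E):
    """Propagate app embeddings over each hop graph, then fuse the
    per-hop results with softmax hop-level attention. In the real
    model the scores would be learned; here a simple mean-activation
    proxy stands in so the sketch stays self-contained."""
    H = [A @ E for A in hops]                 # one lightweight message pass per hop
    scores = np.array([h.mean() for h in H])  # stand-in for learned hop scores
    w = np.exp(scores) / np.exp(scores).sum() # hop-level attention weights
    fused = sum(wk * hk for wk, hk in zip(w, H))
    return fused, w

session = [0, 2, 1, 2, 3]                         # toy session over 4 apps
E = np.random.default_rng(0).normal(size=(4, 8))  # random app embeddings
fused, weights = hop_attention_readout(multi_hop_session_graphs(session, 4), E)
print(fused.shape, weights.round(3))
```

The attention weights expose how much each hop range contributes to the session representation, which is the kind of interpretable evidence the abstract's final sentence refers to.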