AdSight: Scalable and Accurate Quantification of User Attention in Multi-Slot Sponsored Search

📅 2025-04-30
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the challenge of accurately quantifying user attention on multi-slot search engine results pages (SERPs), this paper proposes the first Transformer-based sequence-to-sequence attention prediction framework specifically designed for multi-slot SERPs. The method decouples mouse trajectory modeling from slot-level semantic awareness, jointly embedding trajectory dynamics and slot-specific features, and simultaneously optimizes two complementary tasks: regression of gaze duration/fixation count and binary classification of slot attention. This dual-objective design significantly enhances cross-layout generalization. Evaluated on a large-scale real-world SERP dataset, the model reduces mean absolute error (MAE) in gaze time prediction by 32% and improves F1-score for slot attention detection by 19%. To our knowledge, this is the first work achieving high-accuracy, scalable attention modeling across multiple SERP slots, providing an interpretable and production-ready foundation for ad slot optimization and revenue strategy.

📝 Abstract
Modern Search Engine Results Pages (SERPs) present complex layouts where multiple elements compete for visibility. Attention modelling is crucial for optimising web design and computational advertising, whereas attention metrics can inform ad placement and revenue strategies. We introduce AdSight, a method leveraging mouse cursor trajectories to quantify in a scalable and accurate manner user attention in multi-slot environments like SERPs. AdSight uses a novel Transformer-based sequence-to-sequence architecture where the encoder processes cursor trajectory embeddings, and the decoder incorporates slot-specific features, enabling robust attention prediction across various SERP layouts. We evaluate our approach on two Machine Learning tasks: (1) *regression*, to predict fixation times and counts; and (2) *classification*, to determine whether certain slot types were noticed. Our findings demonstrate the model's ability to predict attention with unprecedented precision, offering actionable insights for researchers and practitioners.
Problem

Research questions and friction points this paper is trying to address.

Quantify user attention in multi-slot search results
Predict fixation times and counts accurately
Determine noticed slot types in SERP layouts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformer-based sequence-to-sequence architecture for attention prediction
Mouse cursor trajectories to quantify user attention
Scalable and accurate attention modeling in SERPs
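The encoder–decoder split described above (encoder over cursor-trajectory embeddings, decoder over slot-specific features, with dual regression/classification objectives) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; all class names, dimensions, and feature layouts are assumptions.

```python
import torch
import torch.nn as nn

class AdSightSketch(nn.Module):
    """Hypothetical sketch of the described design: a Transformer encoder
    over cursor-trajectory embeddings, a decoder over slot-feature
    embeddings, and two heads (fixation regression + attention classification)."""

    def __init__(self, traj_dim=4, slot_dim=8, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.traj_proj = nn.Linear(traj_dim, d_model)  # embed cursor points, e.g. (x, y, dx, dy)
        self.slot_proj = nn.Linear(slot_dim, d_model)  # embed slot-specific features
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.reg_head = nn.Linear(d_model, 2)  # fixation time, fixation count
        self.cls_head = nn.Linear(d_model, 1)  # slot-noticed logit

    def forward(self, trajectory, slots):
        src = self.traj_proj(trajectory)   # (B, T, d_model) cursor sequence
        tgt = self.slot_proj(slots)        # (B, S, d_model) one query per slot
        h = self.transformer(src, tgt)     # (B, S, d_model) slot-level representations
        return self.reg_head(h), self.cls_head(h).squeeze(-1)

model = AdSightSketch()
traj = torch.randn(2, 50, 4)   # batch of 2 sessions, 50 cursor samples each
slots = torch.randn(2, 6, 8)   # 6 SERP slots per page
reg, logits = model(traj, slots)

# dual-objective loss: MSE on fixation stats + BCE on slot attention
target_reg = torch.rand(2, 6, 2)
target_cls = torch.randint(0, 2, (2, 6)).float()
loss = (nn.functional.mse_loss(reg, target_reg)
        + nn.functional.binary_cross_entropy_with_logits(logits, target_cls))
```

Jointly minimizing the summed loss is one plausible reading of the paper's dual-objective training; the relative weighting of the two terms is a tunable hyperparameter not specified in the summary.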