Retrieval of Temporal Event Sequences from Textual Descriptions

📅 2024-10-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses text-driven temporal event sequence retrieval (TESR): locating temporally ordered event sequences from natural-language queries, with applications in e-commerce behavior analysis, social media monitoring, and criminal investigation. To advance the task, the authors introduce TESRBench, a comprehensive benchmark of diverse real-world datasets paired with synthesized and human-reviewed textual descriptions. Building on this benchmark, they propose TPP-Embedding, a model based on the TPP-LLM framework that integrates large language models (LLMs) with temporal point processes (TPPs) to encode both event texts and times. Sequence-level pooled embeddings and a text-temporal contrastive loss align event sequences with their descriptions in a shared embedding space. Across TESRBench datasets, TPP-Embedding outperforms baseline models on the retrieval task.

📝 Abstract
Retrieving temporal event sequences from textual descriptions is crucial for applications such as analyzing e-commerce behavior, monitoring social media activities, and tracking criminal incidents. To advance this task, we introduce TESRBench, a comprehensive benchmark for temporal event sequence retrieval (TESR) from textual descriptions. TESRBench includes diverse real-world datasets with synthesized and reviewed textual descriptions, providing a strong foundation for evaluating retrieval performance and addressing challenges in this domain. Building on this benchmark, we propose TPP-Embedding, a novel model for embedding and retrieving event sequences. The model leverages the TPP-LLM framework, integrating large language models (LLMs) with temporal point processes (TPPs) to encode both event texts and times. By pooling representations and applying a contrastive loss, it unifies temporal dynamics and event semantics in a shared embedding space, aligning sequence-level embeddings of event sequences and their descriptions. TPP-Embedding demonstrates superior performance over baseline models across TESRBench datasets, establishing it as a powerful solution for the temporal event sequence retrieval task.
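The abstract describes pooling sequence representations and applying a contrastive loss so that an event sequence and its textual description land near each other in a shared embedding space. The following is a minimal, hedged sketch of that general recipe (mean pooling, cosine similarity, and an InfoNCE-style loss) in plain NumPy; all function names are illustrative and the actual TPP-Embedding model uses learned LLM/TPP encoders rather than the placeholder vectors shown here.

```python
import numpy as np

def mean_pool(event_embeddings):
    # Pool per-event embeddings (n_events x dim) into one
    # sequence-level vector, as a stand-in for the paper's pooling step.
    return event_embeddings.mean(axis=0)

def cosine_sim_matrix(a, b):
    # Pairwise cosine similarity between two sets of row vectors.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def contrastive_loss(seq_emb, desc_emb, temperature=0.07):
    # InfoNCE-style loss: sequence i should match description i
    # (the diagonal) against all other descriptions in the batch.
    logits = cosine_sim_matrix(seq_emb, desc_emb) / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

def retrieve(query_emb, seq_embs):
    # Rank stored sequence embeddings by similarity to a query
    # (description) embedding; returns indices, best match first.
    sims = cosine_sim_matrix(query_emb[None, :], seq_embs)[0]
    return np.argsort(-sims)
```

At inference time, retrieval reduces to embedding the text query once and ranking precomputed sequence embeddings by cosine similarity, which is what makes this formulation practical for the benchmark's retrieval evaluation.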
Problem

Research questions and friction points this paper is trying to address.

Temporal Event Extraction
Online Shopping Habits
Social Media Dynamics
Innovation

Methods, ideas, or system contributions that make the work stand out.

TESRBench
TPP-Embedding
Temporal Event Processing
Zefang Liu
Georgia Institute of Technology, Atlanta, GA 30332, USA
Yinzhu Quan
Georgia Institute of Technology