A Plug-and-Play Bregman ADMM Module for Inferring Event Branches in Temporal Point Processes

📅 2025-01-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenges of modeling event causality and ensuring interpretability in temporal point processes (TPPs). We propose a differentiable Bregman ADMM (BADMM) module that formulates event transition matrix estimation as a constrained optimization problem with joint sparsity and low-rank regularization. Crucially, we pioneer the unrolling of Bregman ADMM into a differentiable, plug-and-play neural component, enabling seamless integration into both expectation-maximization (EM) frameworks and self-attention mechanisms. Our method synergistically combines sparse group Lasso and subspace clustering to infer structured event branching patterns. Evaluated on synthetic and real-world datasets, it significantly improves TPP performance while producing interpretable responsibility matrices and sparse–low-rank attention maps. These outputs enable precise identification of isolated events and critical triggering events, achieving a principled balance between modeling accuracy and causal interpretability.

📝 Abstract
An event sequence generated by a temporal point process is often associated with a hidden and structured event branching process that captures the triggering relations between its historical and current events. In this study, we design a new plug-and-play module based on the Bregman ADMM (BADMM) algorithm, which infers event branches associated with event sequences in the maximum likelihood estimation framework of temporal point processes (TPPs). Specifically, we formulate the inference of event branches as an optimization problem for the event transition matrix under sparse and low-rank constraints, which is embedded in existing TPP models or their learning paradigms. We implement this optimization problem in two ways, based on subspace clustering and sparse group Lasso, respectively, and solve it using the Bregman ADMM algorithm, whose unrolling leads to the proposed BADMM module. When learning a classic TPP (e.g., the Hawkes process) by the expectation-maximization algorithm, the BADMM module helps derive structured responsibility matrices in the E-step. Similarly, the BADMM module helps derive low-rank and sparse attention maps for neural TPPs with self-attention layers. The structured responsibility matrices and attention maps, which work as learned event transition matrices, indicate event branches, e.g., identifying isolated events and the key events that trigger many subsequent events. Experiments on both synthetic and real-world data show that plugging our BADMM module into existing TPP models and learning paradigms can improve model performance and provide interpretable structured event branches. The code is available at https://github.com/qingmeiwangdaily/BADMM_TPP.
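The core optimization the abstract describes, estimating an event transition matrix under joint sparse and low-rank constraints via ADMM-style variable splitting, can be illustrated with a minimal numerical sketch. This is not the authors' unrolled BADMM module: it uses plain Euclidean ADMM (not Bregman divergences), omits the TPP-specific constraints such as nonnegativity or row-normalization of responsibility matrices, and all function names and hyperparameters below are illustrative.

```python
import numpy as np

def soft_threshold(X, tau):
    # Proximal operator of the l1 norm: entrywise shrinkage toward zero (sparsity).
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_threshold(X, tau):
    # Proximal operator of the nuclear norm: shrink singular values (low rank).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def sparse_lowrank_admm(A, lam_s=0.1, lam_r=0.1, rho=1.0, n_iter=50):
    """Toy ADMM for: min_T 0.5||T - A||_F^2 + lam_s||S||_1 + lam_r||L||_*
    subject to T = S and T = L, in scaled dual form."""
    S = np.zeros_like(A)
    L = np.zeros_like(A)
    U1 = np.zeros_like(A)  # scaled dual variable for T = S
    U2 = np.zeros_like(A)  # scaled dual variable for T = L
    for _ in range(n_iter):
        # T-update: closed-form average of the data term and both splits.
        T = (A + rho * (S - U1) + rho * (L - U2)) / (1.0 + 2.0 * rho)
        S = soft_threshold(T + U1, lam_s / rho)   # sparse split
        L = svd_threshold(T + U2, lam_r / rho)    # low-rank split
        U1 += T - S                               # dual ascent steps
        U2 += T - L
    return T
```

Unrolling, as in the paper, would treat a fixed number of such iterations as differentiable layers so that `lam_s`, `lam_r`, and `rho` can be learned end-to-end alongside the TPP model.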
Problem

Research questions and friction points this paper is trying to address.

Causal Inference
Event Sequences
Temporal Patterns
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bregman ADMM
Event Causal Triggering
Sparse Group Lasso
Qingmei Wang
Gaoling School of Artificial Intelligence, Renmin University of China
Yuxin Wu
Gaoling School of Artificial Intelligence, Renmin University of China
Yujie Long
School of Cyber Science and Engineering, Wuhan University
Jing Huang
School of Cyber Science and Engineering, Wuhan University
Fengyuan Ran
Wuhan University
Bing Su
Gaoling School of Artificial Intelligence, Renmin University of China, Beijing Key Laboratory of Big Data Management and Analysis Methods
Hongteng Xu
Gaoling School of Artificial Intelligence, Renmin University of China, Beijing Key Laboratory of Big Data Management and Analysis Methods