Neuro-Symbolic Predictive Process Monitoring

📅 2025-08-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the critical issue of logical inconsistency in suffix prediction for Business Process Management (BPM), this paper proposes a neuro-symbolic integration method that, for the first time, embeds Linear Temporal Logic over finite traces (LTLf) into an autoregressive sequence prediction framework via a differentiable logic loss. Combining soft approximations of LTLf semantics with the Gumbel-Softmax reparameterization, the authors design two differentiable logic loss variants, local and global, that are jointly optimized with the standard prediction loss in an end-to-end trainable architecture. Evaluated on three real-world BPM datasets, the approach significantly improves both predictive accuracy and adherence to temporal constraints over purely data-driven baselines, and it remains robust under observational noise and realistic process variations. This work establishes a paradigm for compliance-sensitive BPM applications that balances prediction fidelity with formal interpretability.
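The Gumbel-Softmax reparameterization mentioned above is what lets the logic loss backpropagate through discrete activity choices. The paper's exact formulation is not reproduced here; the following is a minimal, dependency-free sketch of the trick itself, with the logits and temperature chosen purely for illustration.

```python
import math
import random

def gumbel_softmax(logits, tau=1.0, rng=random):
    """Draw a 'soft' one-hot sample from a categorical distribution
    given unnormalized logits (the Gumbel-Softmax trick).

    As tau -> 0 the sample approaches a hard one-hot vector; larger
    tau yields smoother, more uniform samples.
    """
    # Perturb each logit with Gumbel(0, 1) noise: -log(-log(U)), U ~ Uniform(0, 1).
    # The tiny epsilon guards against log(0).
    noisy = [l - math.log(-math.log(rng.random() + 1e-20)) for l in logits]
    # Temperature-scaled softmax over the perturbed logits
    exps = [math.exp(n / tau) for n in noisy]
    total = sum(exps)
    return [e / total for e in exps]

# Example: a relaxed sample over three activities
probs = gumbel_softmax([2.0, 0.5, -1.0], tau=0.5)
```

Because the output is a differentiable function of the logits (the randomness enters only through the additive Gumbel noise), gradients from a downstream loss can flow back into the sequence predictor.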

📝 Abstract
This paper addresses the problem of suffix prediction in Business Process Management (BPM) by proposing a Neuro-Symbolic Predictive Process Monitoring (PPM) approach that integrates data-driven learning with temporal logic-based prior knowledge. While recent approaches leverage deep learning models for suffix prediction, they often fail to satisfy even basic logical constraints due to the absence of explicit integration of domain knowledge during training. We propose a novel method to incorporate Linear Temporal Logic over finite traces (LTLf) into the training process of autoregressive sequence predictors. Our approach introduces a differentiable logical loss function, defined using a soft approximation of LTLf semantics and the Gumbel-Softmax trick, which can be combined with standard predictive losses. This ensures the model learns to generate suffixes that are both accurate and logically consistent. Experimental evaluation on three real-world datasets shows that our method improves suffix prediction accuracy and compliance with temporal constraints. We also introduce two variants of the logic loss (local and global) and demonstrate their effectiveness under noisy and realistic settings. While developed in the context of BPM, our framework is applicable to any symbolic sequence generation task and contributes toward advancing Neuro-Symbolic AI.
Problem

Research questions and friction points this paper is trying to address.

Predicting process suffixes with logical constraints
Integrating temporal logic into neural network training
Ensuring generated sequences are accurate and consistent
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neuro-Symbolic approach integrating deep learning with temporal logic
Differentiable logical loss function using soft LTLf approximation
Local and global logic loss variants ensuring logical consistency
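To make the "differentiable logical loss" contribution concrete, here is a hypothetical sketch of a soft semantics for the single LTLf formula F(a) ("eventually activity a occurs"), evaluated over a predicted suffix of per-step activity distributions. The paper defines local and global variants over general LTLf formulas; this toy version only illustrates the underlying idea of relaxing a temporal operator into an operation on probabilities.

```python
# Each suffix step is a probability distribution over activities,
# e.g. the relaxed samples produced by an autoregressive predictor.

def soft_eventually(suffix_probs, activity_idx):
    """Soft truth value of 'eventually activity a occurs': the maximum
    probability assigned to `a` at any step (max acts as a soft
    disjunction over time points)."""
    return max(step[activity_idx] for step in suffix_probs)

def logic_loss(suffix_probs, activity_idx):
    """Penalty that vanishes when the constraint is softly satisfied,
    to be added to the standard prediction loss during training."""
    return 1.0 - soft_eventually(suffix_probs, activity_idx)

# Predicted distributions over 3 activities for a 3-step suffix
suffix = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4]]
loss = logic_loss(suffix, activity_idx=1)  # constraint: eventually activity 1
```

A total training objective in this style would be `prediction_loss + lam * logic_loss`, with `lam` weighting constraint adherence against accuracy; the weighting scheme here is an assumption, not the paper's.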
Axel Mezini
Faculty of Engineering, Free University of Bozen-Bolzano, NOI Techpark - via Bruno Buozzi, 1, Bolzano, 39100, Italy
Elena Umili
Department of Computer, Control and Management Engineering, Sapienza, Università di Roma, Via Ariosto, 25, Roma, 00185, Italy
Ivan Donadello
Free University of Bozen-Bolzano
Neuro-Symbolic Integration · Process Mining
Fabrizio Maria Maggi
Faculty of Engineering, Free University of Bozen-Bolzano, NOI Techpark - via Bruno Buozzi, 1, Bolzano, 39100, Italy
Matteo Mancanelli
Department of Computer, Control and Management Engineering, Sapienza, Università di Roma, Via Ariosto, 25, Roma, 00185, Italy
Fabio Patrizi
Associate Professor, DIAG - Sapienza University of Rome
Artificial Intelligence · Verification & Synthesis · Service Oriented Computing