A Simple Approximate Bayesian Inference Neural Surrogate for Stochastic Petri Net Models

๐Ÿ“… 2025-07-14
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Parameter inference for stochastic Petri nets (SPNs) with external covariates remains challenging under partial observability, event missingness, and measurement noise, especially when the likelihood function is intractable. Method: We propose an approximate Bayesian inference framework based on a lightweight 1D convolutional residual network, the first to serve as a posterior surrogate for SPNs. Integrated with Monte Carlo Dropout, it enables calibrated uncertainty quantification without explicit likelihood evaluation. The network is trained end-to-end on Gillespie-simulated trajectories, incorporating noise augmentation and explicit event-missingness modeling. Contribution/Results: Our method directly infers covariate-dependent rate function coefficients from noisy, sparse token trajectories. In synthetic experiments with 20% missing events, it achieves a coefficient estimation RMSE of 0.108 and significantly outperforms conventional Bayesian methods in inference speedโ€”enabling real-time, high-accuracy parameter estimation for complex discrete-event systems.

Technology Category

Application Category

๐Ÿ“ Abstract
Stochastic Petri Nets (SPNs) are an increasingly popular tool of choice for modeling discrete-event dynamics in areas such as epidemiology and systems biology, yet their parameter estimation remains challenging in general and in particular when transition rates depend on external covariates and explicit likelihoods are unavailable. We introduce a neural-surrogate (neural-network--based approximation of the posterior distribution) framework that predicts the coefficients of known covariate-dependent rate functions directly from noisy, partially observed token trajectories. Our model employs a lightweight 1D Convolutional Residual Network trained end-to-end on Gillespie-simulated SPN realizations, learning to invert system dynamics under realistic conditions of event dropout. During inference, Monte Carlo dropout provides calibrated uncertainty bounds together with point estimates. On synthetic SPNs with 20% missing events, our surrogate recovers rate-function coefficients with an RMSE = 0.108 and substantially runs faster than traditional Bayesian approaches. These results demonstrate that data-driven, likelihood-free surrogates can enable accurate, robust, and real-time parameter recovery in complex, partially observed discrete-event systems.
Problem

Research questions and friction points this paper is trying to address.

Estimating parameters in Stochastic Petri Nets with covariate-dependent rates
Handling noisy, partially observed data in discrete-event systems
Providing fast, accurate Bayesian inference without explicit likelihoods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural-network surrogate for posterior approximation
1D Convolutional Residual Network training
Monte Carlo dropout for uncertainty calibration
๐Ÿ”Ž Similar Papers
No similar papers found.
B
Bright Kwaku Manu
School of Computing and Augmented Intelligence, Arizona State University, Tempe, 85281
T
Trevor Reckell
School of Mathematical and Statistical Sciences, Arizona State University, Tempe, 85281
Beckett Sterner
Beckett Sterner
Arizona State University
PhilosophyBiodiversityData Science
Petar Jevtic
Petar Jevtic
Associate Professor, School of Mathematical and Statistical Sciences, Arizona State University
Actuarial ScienceLongevity RiskMathematical Finance