Adaptive Grid-Based Thompson Sampling for Efficient Trajectory Discovery

📅 2025-10-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional Bayesian optimization methods infer parameters from summary statistics (e.g., means or quantiles) of stochastic simulation outputs, discarding trajectory-level structural information. Method: We propose a trajectory-level Bayesian optimization framework that models input parameters and random seeds jointly through a Gaussian process surrogate, introduces a trajectory-level likelihood function, leverages common random numbers (CRN) to make outputs comparable across parameter settings, and employs an adaptive-grid Thompson sampling strategy that combines likelihood-based filtering with Metropolis–Hastings densification for dynamic refinement of the input space. Contribution/Results: Evaluated on a compartmental model and an agent-based epidemiological model, the approach substantially improves the efficiency of identifying observation-consistent trajectories. Compared with standard parameter-level inference, it achieves higher sampling efficiency and faster convergence while preserving temporal dynamics and stochastic structure.
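The summary's core modeling idea, a GP surrogate that takes both the parameters and the random seed as inputs so that runs sharing a seed (CRN) are more strongly correlated, can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the kernel form, the cross-seed correlation parameter `rho`, and the function names `crn_kernel` and `gp_posterior_mean` are all hypothetical.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # squared-exponential kernel over the continuous parameter inputs
    d = a[:, None, :] - b[None, :, :]
    return np.exp(-0.5 * np.sum(d ** 2, axis=-1) / ls ** 2)

def crn_kernel(X1, s1, X2, s2, ls=1.0, rho=0.3):
    # illustrative CRN-aware kernel: correlation is boosted (factor 1
    # instead of rho) when two evaluations share the same random seed
    same_seed = (s1[:, None] == s2[None, :]).astype(float)
    return rbf(X1, X2, ls) * (rho + (1.0 - rho) * same_seed)

def gp_posterior_mean(X, s, y, Xq, sq, noise=1e-4):
    # standard GP regression mean, conditioning on (parameter, seed) pairs
    K = crn_kernel(X, s, X, s) + noise * np.eye(len(y))
    Ks = crn_kernel(Xq, sq, X, s)
    return Ks @ np.linalg.solve(K, y)
```

For `rho` in [0, 1] the seed factor is a valid (compound-symmetry) kernel, so the product with the RBF term remains positive semi-definite.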

📝 Abstract
Bayesian optimization (BO) is a powerful framework for estimating parameters of computationally expensive simulation models, particularly in settings where the likelihood is intractable and evaluations are costly. In stochastic models, every simulation is run with a specific parameter set and an implicit or explicit random seed, and each parameter–seed combination generates an individual realization, or trajectory, sampled from an underlying random process. Existing BO approaches typically rely on summary statistics over the realizations, such as means, medians, or quantiles, potentially limiting their effectiveness when trajectory-level information is desired. We propose a trajectory-oriented Bayesian optimization method that uses a Gaussian process (GP) surrogate over both input parameters and random seeds, enabling direct inference at the trajectory level. Using a common random number (CRN) approach, we define a surrogate-based likelihood over trajectories and introduce an adaptive Thompson Sampling algorithm that refines a fixed-size input grid through likelihood-based filtering and Metropolis–Hastings-based densification. This approach concentrates computation on statistically promising regions of the input space while balancing exploration and exploitation. We apply the method to stochastic epidemic models, a simple compartmental model and a more computationally demanding agent-based model, demonstrating improved sampling efficiency and faster identification of data-consistent trajectories relative to parameter-only inference.
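The abstract's adaptive-grid loop, score a fixed-size grid under a surrogate likelihood, drop low-scoring points, then refill the grid with Metropolis–Hastings proposals around the survivors, might look roughly like the sketch below. The function name `adaptive_grid_ts`, the keep fraction, and the Gaussian random-walk proposal are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptive_grid_ts(loglik, grid, n_iters=20, keep_frac=0.5, prop_sd=0.1):
    """Sketch of an adaptive-grid refinement loop (illustrative only):
    filter the grid by likelihood, then densify near survivors via MH."""
    n = len(grid)
    for _ in range(n_iters):
        scores = np.array([loglik(x) for x in grid])
        # likelihood-based filtering: keep the top fraction of grid points
        keep = np.argsort(scores)[-int(keep_frac * n):]
        survivors = grid[keep]
        # MH densification: propose near survivors, accept via the MH ratio
        new_pts = []
        while len(new_pts) < n - len(survivors):
            x = survivors[rng.integers(len(survivors))]
            x_prop = x + rng.normal(0.0, prop_sd, size=x.shape)
            if np.log(rng.uniform()) < loglik(x_prop) - loglik(x):
                new_pts.append(x_prop)
        grid = np.vstack([survivors, new_pts])
    return grid
```

With a symmetric Gaussian proposal the MH acceptance ratio reduces to the likelihood ratio, so the refill step targets high-likelihood regions while the fixed grid size keeps the per-iteration cost constant.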
Problem

Research questions and friction points this paper is trying to address.

How can stochastic simulation models be calibrated at the trajectory level rather than through summary statistics?
How can data-consistent parameter–seed combinations be identified efficiently?
How should exploration and exploitation be balanced while adaptively refining the input space?
Innovation

Methods, ideas, or system contributions that make the work stand out.

GP surrogate over both input parameters and random seeds
Adaptive Thompson Sampling with likelihood-based grid filtering
Metropolis–Hastings densification for dynamic input-grid refinement
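The Thompson Sampling ingredient listed above amounts to drawing one realization from the surrogate posterior over the candidate grid and acting greedily on that draw. A minimal sketch follows; the function name and its inputs (a posterior mean vector and covariance matrix over the grid) are assumptions for exposition.

```python
import numpy as np

def thompson_draw(mu, cov, rng):
    # one Thompson Sampling step: sample a posterior realization over the
    # candidate grid and return the index where the sample is maximal
    sample = rng.multivariate_normal(mu, cov)
    return int(np.argmax(sample)), sample
```

Because the draw is a single correlated sample rather than the posterior mean, points with high uncertainty occasionally win, which is what gives Thompson Sampling its exploration behavior.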