EnTransformer: A Deep Generative Transformer for Multivariate Probabilistic Forecasting

📅 2026-03-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of accurately modeling complex joint distributions in multivariate time series probabilistic forecasting by proposing EnTransformer, the first nonparametric approach that integrates the engression stochastic learning paradigm with the Transformer architecture. By injecting random noise and optimizing an energy score objective, EnTransformer directly learns the conditional predictive distribution without imposing parametric assumptions on its form, while preserving the Transformer’s capacity to capture long-range dependencies and cross-variable interactions. Evaluated on standard benchmarks—including Electricity, Traffic, and Solar—EnTransformer substantially outperforms existing methods, producing well-calibrated and coherent multivariate probabilistic forecast trajectories.

📝 Abstract
Reliable uncertainty quantification is critical in multivariate time series forecasting problems arising in domains such as energy systems and transportation networks, among many others. Although Transformer-based architectures have recently achieved strong performance for sequence modeling, most probabilistic forecasting approaches rely on restrictive parametric likelihoods or quantile-based objectives. They can struggle to capture complex joint predictive distributions across multiple correlated time series. This work proposes EnTransformer, a deep generative forecasting framework that integrates engression, a stochastic learning paradigm for modeling conditional distributions, with the expressive sequence modeling capabilities of Transformers. The proposed approach injects stochastic noise into the model representation and optimizes an energy-based scoring objective to directly learn the conditional predictive distribution without imposing parametric assumptions. This design enables EnTransformer to generate coherent multivariate forecast trajectories while preserving Transformers' capacity to effectively model long-range temporal dependencies and cross-series interactions. We evaluate our proposed EnTransformer on several widely used benchmarks for multivariate probabilistic forecasting, including Electricity, Traffic, Solar, Taxi, KDD-cup, and Wikipedia datasets. Experimental results demonstrate that EnTransformer produces well-calibrated probabilistic forecasts and consistently outperforms the benchmark models.
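The abstract's "energy-based scoring objective" refers to the energy score, a proper scoring rule that can be estimated from samples drawn by a generative model. As a rough illustration of the idea (not the paper's implementation; the function name and toy sampling setup below are invented for this sketch), a sample-based estimate compares model draws to the observed target and to each other:

```python
import numpy as np

def energy_score(samples, y):
    """Sample-based estimate of the energy score for a multivariate target.

    samples: (m, d) array of draws from the predictive distribution
    y:       (d,) observed target vector
    Lower is better; minimizing this over training data encourages the
    sampler to match the conditional distribution of y.
    """
    m = samples.shape[0]
    # Term 1: mean distance from the samples to the observation.
    term1 = np.mean(np.linalg.norm(samples - y, axis=1))
    # Term 2: mean pairwise distance among the samples themselves
    # (a simple biased estimate that includes the zero diagonal).
    diffs = samples[:, None, :] - samples[None, :, :]
    term2 = np.mean(np.linalg.norm(diffs, axis=2))
    return term1 - 0.5 * term2

# Toy check: a sampler concentrated near the truth scores better (lower)
# than one biased away from it.
rng = np.random.default_rng(0)
y = np.zeros(3)
sharp = rng.normal(0.0, 0.1, size=(64, 3))  # concentrated near y
wide = rng.normal(2.0, 0.1, size=(64, 3))   # biased away from y
assert energy_score(sharp, y) < energy_score(wide, y)
```

In an engression-style model, the samples would come from a network that takes the conditioning history plus injected random noise, so the same forward pass can be run many times to produce forecast trajectories.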
Problem

Research questions and friction points this paper is trying to address.

multivariate probabilistic forecasting
uncertainty quantification
joint predictive distributions
time series forecasting
conditional distribution modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

EnTransformer
multivariate probabilistic forecasting
engression
energy-based scoring
non-parametric conditional distribution
Rajdeep Pathak
SAFIR, Sorbonne University Abu Dhabi, UAE
Rahul Goswami
Indian Institute of Technology, Guwahati 781039, India
Madhurima Panja
SAFIR, Sorbonne University Abu Dhabi, UAE
Palash Ghosh
Indian Institute of Technology, Guwahati 781039, India
Tanujit Chakraborty
Associate Professor of Statistics and Data Science at Sorbonne University
Machine Learning, Time Series Forecasting, Spatial Statistics, Health Data Science