🤖 AI Summary
To address the limitations of existing methods in modeling multimodal and asymmetric distributions—which lead to insufficient prediction diversity in pedestrian trajectory forecasting—this paper proposes the Mixed Gaussian Flow (MGF) model. MGF is the first approach to construct a multimodal, asymmetric mixture-of-Gaussians prior in an unsupervised manner via trajectory pattern analysis, without requiring additional annotations; it replaces the conventional unimodal Gaussian prior, thereby overcoming the expressive bottleneck of normalizing flow models in prior distribution modeling. By integrating latent-space clustering, differentiable prior learning, and mixture Gaussian modeling, MGF simultaneously improves predictive accuracy (ADE/FDE) and diversity (MR, FOS) on the UCY/ETH and SDD benchmarks, achieving state-of-the-art performance.
📝 Abstract
When predicting future trajectories, a normalizing flow with a standard Gaussian prior suffers from weak diversity. The ineffectiveness stems from a conflict: the distribution of likely future outcomes is asymmetric and multi-modal, while the standard Gaussian prior and the supervision losses are symmetric and unimodal. Instead, we propose constructing a mixed Gaussian prior for a normalizing flow model for trajectory prediction. The prior is constructed by analyzing the trajectory patterns in the training samples without requiring extra annotations, and it shows better expressiveness by being multi-modal and asymmetric. Besides diversity, it also provides better controllability for probabilistic trajectory generation. We name our method Mixed Gaussian Flow (MGF). It achieves state-of-the-art performance in the evaluation of both trajectory alignment and diversity on the popular UCY/ETH and SDD datasets. Code is available at https://github.com/mulplue/MGF.
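To make the core idea concrete, here is a minimal sketch (not the authors' code) of sampling latent codes from a mixture-of-Gaussians prior rather than a single standard Gaussian. The component means, scales, and weights below are hypothetical; in MGF they are derived by clustering trajectory patterns in the training data, and a flow model then maps the sampled latents to trajectories.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical K=3 components in a 2-D latent space.
# Each component can sit at a different location (asymmetric, multi-modal),
# unlike a single zero-mean standard Gaussian.
means = np.array([[-2.0, 0.0], [0.0, 2.0], [2.0, 0.0]])
stds = np.array([0.5, 0.5, 0.5])      # per-component isotropic std
weights = np.array([0.5, 0.3, 0.2])   # mixture weights, sum to 1

def sample_mixture_prior(n):
    """Draw n latents z ~ sum_k w_k * N(mu_k, sigma_k^2 I)."""
    ks = rng.choice(len(weights), size=n, p=weights)  # pick a component
    return means[ks] + stds[ks, None] * rng.standard_normal((n, 2))

z = sample_mixture_prior(1000)
# A trained flow f would then decode these latents into trajectories;
# sampling mostly from one component steers generation toward one mode,
# which is the source of the controllability mentioned in the abstract.
print(z.shape)  # (1000, 2)
```

Because each mode of the prior can be tied to a distinct trajectory pattern, drawing samples from different components yields diverse predictions by construction, instead of relying on a unimodal prior to cover all modes.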