Social Ways: Learning Multi-Modal Distributions of Pedestrian Trajectories With GANs

Javad Amirian, Jean-Bernard Hayet, Julien Pettre; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2019

Abstract


This paper proposes a novel approach for predicting the motion of pedestrians interacting with others. It uses a Generative Adversarial Network (GAN) to sample plausible predictions for any agent in the scene. Since GANs are prone to mode collapse and mode dropping, we show that the recently proposed Info-GAN brings dramatic improvements in multi-modal pedestrian trajectory prediction by avoiding these issues. Unlike some previous works, we also leave out the L2 loss when training the generator: although it speeds up convergence, it causes severe mode collapse. Through experiments on real and synthetic data, we show that the proposed method generates more diverse samples and better preserves the modes of the predictive distribution. In particular, to support this claim, we have designed a toy dataset of trajectories that can be used to assess how well different methods preserve the modes of the predictive distribution.
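The generator objective the abstract describes (an adversarial term plus an InfoGAN-style mutual-information term, with the L2 trajectory loss deliberately omitted) can be sketched as below. This is a minimal illustrative sketch, not the paper's exact formulation: the function names, the categorical latent code, and the weighting `lam` are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def generator_loss(d_fake, q_logits, c_sampled, lam=1.0):
    """Hypothetical InfoGAN-style generator objective (sketch).

    d_fake    : discriminator probabilities on generated trajectories
    q_logits  : Q-network logits recovering the latent code from the output
    c_sampled : one-hot latent codes that were fed to the generator
    Note: no L2 term between predicted and ground-truth trajectories,
    matching the choice described in the abstract.
    """
    # Non-saturating adversarial term: -log D(G(z, c))
    adv = -np.mean(np.log(d_fake + 1e-8))
    # Variational lower bound on I(c; G(z, c)): cross-entropy between the
    # sampled code and Q's posterior over codes (minimized, weighted by lam)
    log_q = q_logits - np.log(np.sum(np.exp(q_logits), axis=1, keepdims=True))
    info = -np.mean(np.sum(c_sampled * log_q, axis=1))
    return adv + lam * info

# Toy batch: 4 generated samples, 3-way categorical latent code
d_fake = rng.uniform(0.1, 0.9, size=4)
q_logits = rng.normal(size=(4, 3))
c = np.eye(3)[rng.integers(0, 3, size=4)]
loss = generator_loss(d_fake, q_logits, c)
```

Minimizing the mutual-information term encourages distinct latent codes to map to distinct trajectory modes, which is the mechanism the abstract credits for avoiding mode collapse.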

Related Material


[pdf]
[bibtex]
@InProceedings{Amirian_2019_CVPR_Workshops,
author = {Amirian, Javad and Hayet, Jean-Bernard and Pettre, Julien},
title = {Social Ways: Learning Multi-Modal Distributions of Pedestrian Trajectories With GANs},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2019}
}