Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models

Giannis Daras, Augustus Odena, Han Zhang, Alexandros G. Dimakis; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020, pp. 14531-14539

Abstract


We introduce a new local sparse attention layer that preserves two-dimensional geometry and locality. We show that by simply replacing the dense attention layer of SAGAN with our construction, we obtain significant improvements in FID, Inception score, and visual quality. The FID score improves from 18.65 to 15.94 on ImageNet, with all other parameters kept the same. The sparse attention patterns we propose for the new layer are designed using a novel information-theoretic criterion based on information flow graphs. We also present a novel way to invert Generative Adversarial Networks with attention. Our method uses the attention layer of the discriminator to construct a new loss function. This allows us to visualize the newly introduced attention heads and show that they indeed capture interesting aspects of the two-dimensional geometry of real images.
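To make the core idea concrete, the sketch below illustrates a sparse attention mask that respects two-dimensional locality: each query position on an H x W feature grid only attends to keys within a local 2D window. This is a minimal illustration only, not the paper's actual attention patterns (which are derived from the information-flow-graph criterion); the function names and the window radius are hypothetical.

```python
import numpy as np

def local_2d_attention_mask(height, width, radius=2):
    """Boolean mask of shape (H*W, H*W): grid position p may attend to
    grid position q only if their Chebyshev distance is <= radius."""
    coords = np.array([(r, c) for r in range(height) for c in range(width)])
    dr = np.abs(coords[:, None, 0] - coords[None, :, 0])
    dc = np.abs(coords[:, None, 1] - coords[None, :, 1])
    return (dr <= radius) & (dc <= radius)

def masked_attention(q, k, v, mask):
    """Scaled dot-product attention with non-local positions masked out.
    q, k, v: (N, d) arrays, where N = H*W flattened grid positions."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)  # block attention outside the 2D window
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Example: 8x8 feature map, 16-dim features, 5x5 local neighborhoods.
H, W, d = 8, 8, 16
rng = np.random.default_rng(0)
x = rng.standard_normal((H * W, d))
mask = local_2d_attention_mask(H, W, radius=2)
out = masked_attention(x, x, x, mask)
print(out.shape)  # (64, 16)
```

In contrast, a dense attention layer (as in SAGAN) corresponds to an all-True mask, and one-dimensional sparse patterns applied to the flattened sequence do not respect distances on the original 2D grid.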

Related Material


@InProceedings{Daras_2020_CVPR,
author = {Daras, Giannis and Odena, Augustus and Zhang, Han and Dimakis, Alexandros G.},
title = {Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2020}
}