Distribution-Induced Bidirectional Generative Adversarial Network for Graph Representation Learning

Shuai Zheng, Zhenfeng Zhu, Xingxing Zhang, Zhizhe Liu, Jian Cheng, Yao Zhao; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020, pp. 7224-7233

Abstract


Graph representation learning aims to encode the nodes of a graph into low-dimensional vectors that serve as input to many computer vision tasks. However, most existing algorithms ignore the inherent data distribution and the presence of noise, which can aggravate over-fitting and degrade testing accuracy. In this paper, we propose a Distribution-induced Bidirectional Generative Adversarial Network (DBGAN) for graph representation learning. Instead of the widely used Gaussian assumption, the prior distribution of the latent representation in DBGAN is estimated in a structure-aware way that implicitly bridges the graph and content spaces through prototype learning, yielding discriminative and robust representations for all nodes. Furthermore, to improve generalization while preserving representation ability, sample-level and distribution-level consistency are balanced via a bidirectional adversarial learning framework. Extensive experiments demonstrate that DBGAN achieves a more favorable trade-off between representation quality and robustness than currently available alternatives across various tasks, while remaining dimension-efficient.
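
To make the bidirectional adversarial setup concrete, the sketch below shows a minimal BiGAN/ALI-style training step for node embeddings in PyTorch. It is an illustration under simplifying assumptions, not the authors' DBGAN implementation: the structure-aware, prototype-based prior is replaced by a plain Gaussian prior, node feature vectors stand in for the graph/content inputs, and all module and variable names (Encoder, Generator, Discriminator, train_step) are ours.

# Illustrative sketch only; assumes PyTorch. Gaussian prior stands in for
# DBGAN's structure-aware prior; names are hypothetical.
import torch
import torch.nn as nn

class Encoder(nn.Module):          # inference direction: x -> z
    def __init__(self, in_dim, z_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, z_dim))
    def forward(self, x):
        return self.net(x)

class Generator(nn.Module):        # generation direction: z -> x_hat
    def __init__(self, z_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, out_dim))
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):    # scores joint pairs (x, z)
    def __init__(self, in_dim, z_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim + z_dim, 64), nn.ReLU(), nn.Linear(64, 1))
    def forward(self, x, z):
        return self.net(torch.cat([x, z], dim=-1))

def train_step(x, E, G, D, opt_eg, opt_d, z_dim):
    bce = nn.BCEWithLogitsLoss()
    z_prior = torch.randn(x.size(0), z_dim)   # Gaussian prior stand-in

    # Discriminator: separate encoded pairs (x, E(x)) from generated pairs (G(z), z)
    d_real = D(x, E(x).detach())
    d_fake = D(G(z_prior).detach(), z_prior)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Encoder and generator: fool the discriminator in both directions,
    # pushing the two joint distributions toward each other
    d_real = D(x, E(x))
    d_fake = D(G(z_prior), z_prior)
    loss_eg = bce(d_real, torch.zeros_like(d_real)) + bce(d_fake, torch.ones_like(d_fake))
    opt_eg.zero_grad(); loss_eg.backward(); opt_eg.step()
    return loss_d.item(), loss_eg.item()

# Example setup (hypothetical dimensions):
#   E = Encoder(in_dim=1433, z_dim=32); G = Generator(32, 1433); D = Discriminator(1433, 32)
#   opt_eg = torch.optim.Adam(list(E.parameters()) + list(G.parameters()), lr=1e-3)
#   opt_d  = torch.optim.Adam(D.parameters(), lr=1e-3)

Matching the joint pairs (x, E(x)) and (G(z), z), rather than only generated samples, is what enforces both sample-level and distribution-level consistency in a bidirectional framework of this kind.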

Related Material


@InProceedings{Zheng_2020_CVPR,
author = {Zheng, Shuai and Zhu, Zhenfeng and Zhang, Xingxing and Liu, Zhizhe and Cheng, Jian and Zhao, Yao},
title = {Distribution-Induced Bidirectional Generative Adversarial Network for Graph Representation Learning},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2020},
pages = {7224-7233}
}