Learning Laplacians in Chebyshev Graph Convolutional Networks

Hichem Sahbi; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2021, pp. 2064-2075

Abstract


Spectral graph convolutional networks (GCNs) are deep models that extend neural networks to arbitrary irregular domains. These networks project graph signals using the eigen-decomposition of their Laplacians, perform filtering in the spectral domain, and then back-project the filtered signals onto the input graph domain. However, the success of these operations depends strongly on the relevance of the Laplacians, which are mostly handcrafted, and this makes GCNs clearly sub-optimal. In this paper, we introduce a novel spectral GCN that learns not only the usual convolutional parameters but also the Laplacian operators. The latter are designed "end-to-end" as part of a recursive Chebyshev decomposition, with the particularity of conveying both the differential and the non-differential properties of the learned representations -- with increasing order and discriminative power -- without overparametrizing the trained GCNs. Extensive experiments, conducted on the challenging task of skeleton-based action recognition, show the generalization ability of our proposed Laplacian design and its outperformance of different baselines (built upon handcrafted and other learned Laplacians) as well as the related work.
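
For readers unfamiliar with the mechanism the abstract summarizes, the listing below is a minimal PyTorch sketch of the underlying idea: a Chebyshev graph convolution whose Laplacian is a trainable parameter rather than a handcrafted one. The class name, the adjacency parametrization, and the rescaling are illustrative assumptions and do not reproduce the paper's exact design.

    # Minimal sketch (not the paper's exact parametrization): a Chebyshev
    # graph convolution of order K whose Laplacian is learned end-to-end.
    import torch
    import torch.nn as nn

    class LearnedLaplacianCheb(nn.Module):  # hypothetical name
        def __init__(self, num_nodes, in_dim, out_dim, K=3):
            super().__init__()
            self.K = K
            # Learnable adjacency: gradients flow into the graph structure
            self.adj = nn.Parameter(torch.randn(num_nodes, num_nodes) * 0.01)
            # One filter weight matrix per Chebyshev order
            self.theta = nn.Parameter(torch.randn(K, in_dim, out_dim) * 0.01)

        def laplacian(self):
            # Symmetric, non-negative adjacency from the free parameter
            A = torch.relu(self.adj + self.adj.t())
            d_inv_sqrt = torch.rsqrt(A.sum(dim=1).clamp(min=1e-6))
            L = torch.eye(A.size(0), device=A.device) \
                - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
            # Rescale eigenvalues to roughly [-1, 1], assuming lambda_max ~= 2
            return L - torch.eye(A.size(0), device=A.device)

        def forward(self, x):                 # x: (num_nodes, in_dim)
            L = self.laplacian()
            Tx_prev, Tx = x, L @ x            # T_0(L)x = x, T_1(L)x = Lx
            out = Tx_prev @ self.theta[0]
            if self.K > 1:
                out = out + Tx @ self.theta[1]
            for k in range(2, self.K):
                # Chebyshev recursion: T_k = 2 L T_{k-1} - T_{k-2}
                Tx_prev, Tx = Tx, 2 * (L @ Tx) - Tx_prev
                out = out + Tx @ self.theta[k]
            return out

For instance, a skeleton with 25 joints and 3D coordinates could be processed with layer = LearnedLaplacianCheb(num_nodes=25, in_dim=3, out_dim=64, K=4) followed by layer(torch.randn(25, 3)); training such a layer updates the adjacency (and hence the Laplacian) jointly with the filter weights.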

Related Material


@InProceedings{Sahbi_2021_ICCV,
  author    = {Sahbi, Hichem},
  title     = {Learning Laplacians in Chebyshev Graph Convolutional Networks},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
  month     = {October},
  year      = {2021},
  pages     = {2064-2075}
}