Convolutional Neural Networks With Alternately Updated Clique

Yibo Yang, Zhisheng Zhong, Tiancheng Shen, Zhouchen Lin; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018, pp. 2413-2422

Abstract


Improving information flow in deep networks helps ease training difficulties and use parameters more efficiently. Here we propose a new convolutional neural network architecture with an alternately updated clique (CliqueNet). In contrast to prior networks, there are both forward and backward connections between any two layers in the same block. The layers are constructed as a loop and are updated alternately. CliqueNet has some unique properties. Each layer is both the input and the output of every other layer in the same block, so that information flow among layers is maximized. During propagation, the newly updated layers are concatenated to re-update the previously updated layers, and parameters are reused multiple times. This recurrent feedback structure brings higher-level visual information back to refine low-level filters and achieves spatial attention. We analyze the features generated at different stages and observe that using the refined features leads to better results. We adopt a multi-scale feature strategy that effectively avoids the progressive growth of parameters. Experiments on image recognition datasets including CIFAR-10, CIFAR-100, SVHN, and ImageNet show that our proposed models achieve state-of-the-art performance with fewer parameters.
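The alternate update described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it replaces convolutions with plain matrix multiplies, and the layer count, feature dimension, and helper names (`clique_block`, `W_in`, `n_loops`) are illustrative assumptions. It shows the two ideas the abstract names: one shared weight per ordered layer pair that is reused across stages, and a second stage in which each layer is re-updated from the newest versions of all the others.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def clique_block(x0, n_layers=4, dim=8, n_loops=2, rng=None):
    """Toy sketch of a clique block with alternately updated layers.

    Each ordered pair (i, j), i != j, owns one weight matrix W[i][j]
    that is reused in every propagation stage (parameter reuse).
    Stage I initializes the layers DenseNet-style; later stages
    re-update each layer from the latest versions of the others.
    """
    rng = rng or np.random.default_rng(0)
    # One weight per ordered pair of layers (i -> j), plus a weight
    # from the block input x0 to each layer.
    W_in = [rng.standard_normal((dim, x0.size)) * 0.1 for _ in range(n_layers)]
    W = [[rng.standard_normal((dim, dim)) * 0.1 if i != j else None
          for j in range(n_layers)] for i in range(n_layers)]

    # Stage I: layer j sees the input and the already-updated layers i < j.
    X = []
    for j in range(n_layers):
        pre = W_in[j] @ x0 + sum(W[i][j] @ X[i] for i in range(j))
        X.append(relu(pre))

    # Stage II (and beyond): alternately re-update each layer from the
    # newest versions of all other layers, reusing the same weights, so
    # higher-level features feed back into earlier layers.
    for _ in range(n_loops - 1):
        for j in range(n_layers):
            pre = sum(W[i][j] @ X[i] for i in range(n_layers) if i != j)
            X[j] = relu(pre)

    # Block feature: concatenation of the refined layers.
    return np.concatenate(X)

out = clique_block(np.ones(16), n_layers=4, dim=8, n_loops=2)
```

With these toy sizes the block returns a vector of length `n_layers * dim`; in the paper the analogous concatenated, refined features form the block output used by the multi-scale strategy.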

Related Material


[bibtex]
@InProceedings{Yang_2018_CVPR,
author = {Yang, Yibo and Zhong, Zhisheng and Shen, Tiancheng and Lin, Zhouchen},
title = {Convolutional Neural Networks With Alternately Updated Clique},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2018}
}