PR Product: A Substitute for Inner Product in Neural Networks

Zhennan Wang, Wenbin Zou, Chen Xu; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2019, pp. 6013-6022

Abstract


In this paper, we analyze the inner product of the weight vector w and the data vector x in neural networks from the perspective of vector orthogonal decomposition and prove that the direction gradient of w decreases as the angle between them approaches 0 or π. We propose the Projection and Rejection Product (PR Product), which makes the direction gradient of w independent of the angle and consistently larger than that of the standard inner product, while keeping the forward propagation identical. As a reliable substitute for the standard inner product, the PR Product can be applied to many existing deep learning modules, so we develop PR Product versions of the fully connected, convolutional, and LSTM layers. In static image classification, experiments on the CIFAR10 and CIFAR100 datasets demonstrate that the PR Product robustly enhances the ability of various state-of-the-art classification networks. On the task of image captioning, even without any bells and whistles, our PR Product version of the captioning model competes with or outperforms state-of-the-art models on the MS COCO dataset. Code has been made available at: https://github.com/wzn0828/PR_Product.
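
To make the abstract's claim concrete: since w·x = ‖w‖‖x‖cos θ, the gradient of the output with respect to the angle θ is proportional to sin θ, which vanishes as θ approaches 0 or π; the PR Product keeps the forward value ‖w‖‖x‖cos θ but detaches the angle-dependent factors so the θ-gradient has constant magnitude (|sin θ|·cos θ + cos θ·(1 − |sin θ|) equals cos θ in the forward pass). Below is a minimal PyTorch sketch of a PR version of a fully connected layer along these lines. It is an illustrative reconstruction based on the abstract's description, not the authors' reference code (see the linked repository for that); the class name PRLinear and the eps clamp value are assumptions made for a drop-in nn.Linear replacement.

```python
import torch
import torch.nn as nn


class PRLinear(nn.Linear):
    """Fully connected layer with the inner product replaced by the PR Product.

    The forward output equals the standard w.x (plus bias), but angle-dependent
    factors are detached so the direction gradient of w no longer vanishes as
    the angle between w and x approaches 0 or pi.
    """

    def forward(self, x):
        eps = 1e-12  # assumed small constant to avoid division by zero
        w_len = self.weight.pow(2).sum(dim=1).clamp(min=eps).sqrt()      # (out_features,)
        x_len = x.pow(2).sum(dim=1, keepdim=True).clamp(min=eps).sqrt()  # (batch, 1)
        wx_len = x_len * w_len.unsqueeze(0)                              # (batch, out_features)

        # cos(theta) between each data vector and each weight vector
        cos_theta = (x.matmul(self.weight.t()) / wx_len).clamp(-1.0, 1.0)
        abs_sin_theta = (1.0 - cos_theta.pow(2)).sqrt()

        # PR Product: the forward value is wx_len * cos_theta, identical to w.x,
        # since |sin|*cos + cos*(1 - |sin|) = cos; the detached factors change
        # only the backward pass, making d(out)/d(theta) constant in magnitude.
        out = wx_len * (abs_sin_theta.detach() * cos_theta
                        + cos_theta.detach() * (1.0 - abs_sin_theta))
        if self.bias is not None:
            out = out + self.bias
        return out


# Usage: drop-in replacement for nn.Linear, e.g. as a classifier head.
layer = PRLinear(128, 10)
logits = layer(torch.randn(4, 128))
assert logits.shape == (4, 10)
```

The same substitution pattern carries over to the convolutional and LSTM variants mentioned in the abstract: compute the standard linear response, recover cos θ from the vector norms, and recombine it with detached |sin θ| and cos θ factors so that only the gradient path changes.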

Related Material


BibTeX:
@InProceedings{Wang_2019_ICCV,
author = {Wang, Zhennan and Zou, Wenbin and Xu, Chen},
title = {PR Product: A Substitute for Inner Product in Neural Networks},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
month = {October},
year = {2019},
pages = {6013-6022}
}