Direct Feedback Alignment Based Convolutional Neural Network Training for Low-Power Online Learning Processor

Donghyeon Han, Hoi-jun Yoo; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2019

Abstract


Many algorithms have been proposed as substitutes for backpropagation (BP) in deep neural network (DNN) training. However, they have not become popular because their training accuracy and computational efficiency are worse than BP's. One of them is direct feedback alignment (DFA), which shows low training performance, especially for convolutional neural networks (CNNs). In this paper, we overcome this limitation of DFA by combining it with conventional BP during CNN training. To improve training stability, we also suggest a feedback weight initialization method based on an analysis of the patterns of the fixed random matrices used in DFA. Finally, we propose a new training algorithm, binary direct feedback alignment (BDFA), which minimizes computational cost while maintaining training accuracy comparable to DFA. In our experiments, we use the CIFAR-10 and CIFAR-100 datasets to simulate CNN learning from scratch, and we apply BDFA to an online-learning-based object tracking application to examine training in a small-dataset environment. Our proposed algorithms outperform conventional BP in both training tasks, especially when the dataset is small. Furthermore, we evaluate the efficiency gains with a real chip implementation: a DNN training accelerator using BDFA consumes 35.3% less power than hardware optimized for BP.
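To make the feedback mechanisms concrete, the sketch below contrasts a DFA-style update with a BDFA-style update on a toy two-layer network. It is a minimal illustration under stated assumptions, not the paper's implementation: the layer sizes, learning rate, tanh activation, and squared-error loss are all choices made here for brevity. What it demonstrates is the core idea named in the abstract: both variants replace the transposed forward weights in the backward pass with a fixed feedback matrix, and BDFA constrains that matrix to binary values so the error projection needs only sign-controlled additions rather than multiplications.

import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: x -> h -> y_hat. Sizes and hyperparameters are
# illustrative assumptions, not values from the paper.
n_in, n_hid, n_out = 8, 16, 4
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))

# DFA: a fixed random matrix B replaces W2.T in the backward pass.
B_dfa = rng.normal(0.0, 0.1, (n_hid, n_out))
# BDFA: B is constrained to binary values {-1, +1}, so projecting the
# output error onto the hidden layer needs no multipliers in hardware.
B_bdfa = np.where(rng.random((n_hid, n_out)) < 0.5, -1.0, 1.0)

def train_step(x, y, B, lr=0.01):
    global W1, W2
    # Forward pass: tanh hidden layer, linear output, squared-error loss.
    h = np.tanh(W1 @ x)
    y_hat = W2 @ h
    e = y_hat - y
    # DFA/BDFA backward pass: the output error is sent straight to the
    # hidden layer through the fixed matrix B instead of through W2.T.
    delta1 = (B @ e) * (1.0 - h ** 2)  # tanh derivative
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(delta1, x)
    return 0.5 * float(e @ e)

x = rng.normal(size=n_in)
y = rng.normal(size=n_out)
loss = train_step(x, y, B_bdfa)  # swap in B_dfa for the DFA variant

The hybrid scheme the abstract describes combines updates of this kind with conventional BP during CNN training; how the two are partitioned across the network is detailed in the paper itself, not in this sketch.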

Related Material


[bibtex]
@InProceedings{Han_2019_ICCV,
author = {Han, Donghyeon and Yoo, Hoi-jun},
title = {Direct Feedback Alignment Based Convolutional Neural Network Training for Low-Power Online Learning Processor},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2019}
}