Differentiable Kernel Evolution
Yu Liu, Jihao Liu, Ailing Zeng, Xiaogang Wang; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2019, pp. 1834-1843
Abstract
This paper proposes a differentiable kernel evolution (DKE) algorithm to find better layer operators for convolutional neural networks. Unlike most neural architecture search (NAS) methods, we consider the search space at a more fundamental level: the kernel space, which encodes how basic multiply-accumulate (MAC) operations are assembled into a convolutional kernel. We first derive a strict form of the generalized convolutional operator from a set of necessary constraints and construct a continuous search space for its extra degree of freedom, namely the connection of each MAC. We then propose a novel unsupervised greedy evolution algorithm, gradient agreement guided search (GAGS), to learn the optimal location of each MAC in this spatially continuous search space. We apply DKE to multiple tasks, including object classification, face/object detection, and large-scale fine-grained recognition, with various backbone architectures. Beyond the consistent performance gains, we find that DKE can further act as an auto-dilated operator, which makes it easy to boost the performance of miniaturized neural networks across multiple tasks.
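The abstract does not spell out the operator's exact form, so the following is only a minimal NumPy sketch of the general idea it describes: a convolutional kernel whose MACs read their inputs from continuous, adjustable spatial offsets (via bilinear interpolation) rather than a fixed integer grid. The names generalized_conv2d, bilinear_sample, and offsets are hypothetical illustrations, not the authors' implementation, and the search procedure (GAGS) is not shown.

```python
import numpy as np

def bilinear_sample(feat, y, x):
    """Bilinearly sample a single-channel map feat (H, W) at a continuous (y, x).
    Locations outside the map are treated as zero padding."""
    H, W = feat.shape
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = y0 + 1, x0 + 1
    wy, wx = y - y0, x - x0

    def val(yy, xx):
        return feat[yy, xx] if 0 <= yy < H and 0 <= xx < W else 0.0

    return ((1 - wy) * (1 - wx) * val(y0, x0) + (1 - wy) * wx * val(y0, x1)
            + wy * (1 - wx) * val(y1, x0) + wy * wx * val(y1, x1))

def generalized_conv2d(feat, weights, offsets):
    """Generalized convolution on a single-channel map: each of the K MACs
    has a weight and a continuous (dy, dx) tap location instead of being
    pinned to a fixed 3x3 grid.  weights: (K,), offsets: (K, 2)."""
    H, W = feat.shape
    out = np.zeros_like(feat)
    for i in range(H):
        for j in range(W):
            acc = 0.0
            for w, (dy, dx) in zip(weights, offsets):
                acc += w * bilinear_sample(feat, i + dy, j + dx)  # one MAC
            out[i, j] = acc
    return out

# Example: taps start on the regular 3x3 grid; a search procedure could then
# evolve the offsets toward larger, dilated-like receptive fields.
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 8))
offsets = np.array([(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)], dtype=float)
weights = rng.standard_normal(9) * 0.1
print(generalized_conv2d(feat, weights, offsets).shape)  # (8, 8)
```

Pushing the learned offsets outward recovers dilated-convolution-like behavior, which is consistent with the abstract's observation that DKE can act as an auto-dilated operator.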
Related Material
[pdf]
[bibtex]
@InProceedings{Liu_2019_ICCV,
author = {Liu, Yu and Liu, Jihao and Zeng, Ailing and Wang, Xiaogang},
title = {Differentiable Kernel Evolution},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
month = {October},
year = {2019}
}