AutoSpace: Neural Architecture Search With Less Human Interference

Daquan Zhou, Xiaojie Jin, Xiaochen Lian, Linjie Yang, Yujing Xue, Qibin Hou, Jiashi Feng; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 337-346


Current neural architecture search (NAS) algorithms still require expert knowledge and effort to design a search space for network construction. In this paper, we consider automating the search space design to minimize human interference, which however faces two challenges: the explosive complexity of the exploration space and the expensive computation cost of evaluating the quality of different search spaces. To address these challenges, we propose a novel differentiable evolutionary framework named AutoSpace, which evolves the search space to an optimal one with the following novel techniques: a differentiable fitness scoring function to efficiently evaluate the performance of cells, and a reference architecture to speed up the evolution procedure and avoid falling into sub-optimal solutions. The framework is generic and compatible with additional computational constraints, making it feasible to learn specialized search spaces that fit different computational budgets. With the learned search space, the performance of recent NAS algorithms can be improved significantly compared with using manually designed spaces. Remarkably, the models generated from the new search space achieve 77.8% top-1 accuracy on ImageNet under the mobile setting (MAdds<=500M), outperforming the previous SOTA EfficientNet-B0 by 0.7%.
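The abstract describes evolving a search space with a differentiable fitness score over candidate cells. Below is a minimal, hypothetical sketch of that idea (not the authors' code): each candidate cell carries a learnable logit, a softmax over logits yields a differentiable fitness, and one evolution step keeps the highest-scoring cells and refills the population with mutated copies. The cell encoding and mutation rule here are illustrative assumptions.

```python
# Hypothetical sketch of AutoSpace-style search-space evolution (not the
# authors' implementation): softmax over learnable logits serves as a
# differentiable fitness; an evolution step keeps the top cells and
# mutates copies of them to refill the population.
import math
import random

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def evolve_space(cells, logits, keep=2, rng=random.Random(0)):
    """Keep the `keep` best cells by softmax fitness, refill by mutation."""
    fitness = softmax(logits)
    ranked = sorted(zip(cells, fitness), key=lambda p: -p[1])
    survivors = [c for c, _ in ranked[:keep]]
    # Mutate survivors (here: perturb a hypothetical kernel-size gene).
    children = [{**c, "kernel": rng.choice([3, 5, 7])} for c in survivors]
    return survivors + children

cells = [{"op": "mbconv", "kernel": 3},
         {"op": "mbconv", "kernel": 5},
         {"op": "conv", "kernel": 3},
         {"op": "conv", "kernel": 7}]
logits = [2.0, 0.5, 1.0, -1.0]  # higher logit -> higher fitness

new_space = evolve_space(cells, logits)
print(len(new_space))      # population size preserved: 4
print(new_space[0]["op"])  # highest-fitness cell survives: mbconv
```

In the paper's full framework, the fitness logits would be trained jointly with the supernet weights, and a reference architecture constrains the evolution; this sketch only illustrates the evolve-by-differentiable-fitness loop.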

Related Material

[pdf] [supp] [arXiv]
@InProceedings{Zhou_2021_ICCV,
    author    = {Zhou, Daquan and Jin, Xiaojie and Lian, Xiaochen and Yang, Linjie and Xue, Yujing and Hou, Qibin and Feng, Jiashi},
    title     = {AutoSpace: Neural Architecture Search With Less Human Interference},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {337-346}
}