Progressive Automatic Design of Search Space for One-Shot Neural Architecture Search

Xin Xia, Xuefeng Xiao, Xing Wang, Min Zheng; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2022, pp. 2455-2464

Abstract


Neural Architecture Search (NAS) has attracted growing interest. To reduce the search cost, recent work has explored weight sharing across models and made major progress in One-Shot NAS. However, it has been observed that a model with higher one-shot accuracy does not necessarily perform better when trained stand-alone. To address this issue, in this paper, we propose Progressive Automatic Design of search space, named PAD-NAS. Unlike previous approaches in which all layers of the supernet share the same operation search space, we formulate a progressive search strategy based on operation pruning and build a layer-wise operation search space. In this way, PAD-NAS can automatically design the operations for each layer and achieve a trade-off between search space quality and model diversity. During the search, we also take hardware platform constraints into consideration for efficient neural network model deployment. Extensive experiments on ImageNet show that our method achieves state-of-the-art performance.
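The core idea of a layer-wise search space built by progressive operation pruning can be sketched as follows. This is an illustrative toy sketch only, not the authors' implementation: the names (`CANDIDATE_OPS`, `score_op`, `prune_step`) and the pruning schedule are hypothetical, and the random scores stand in for one-shot supernet accuracy estimates.

```python
import random

# Hypothetical sketch of progressive, per-layer operation pruning.
# Scores are random placeholders for one-shot accuracy estimates.
CANDIDATE_OPS = ["mbconv_3x3", "mbconv_5x5", "mbconv_7x7", "identity"]

def score_op(layer: int, op: str) -> float:
    """Stand-in for estimating an operation's quality at a given layer
    (e.g., mean one-shot accuracy of sampled paths that use this op)."""
    rng = random.Random(hash((layer, op)) % (2**32))
    return rng.random()

def prune_step(search_space, keep):
    """Keep only the top-`keep` scoring operations in each layer,
    so each layer ends up with its own (layer-wise) operation set."""
    pruned = []
    for layer, ops in enumerate(search_space):
        ranked = sorted(ops, key=lambda op: score_op(layer, op), reverse=True)
        pruned.append(ranked[:keep])
    return pruned

# Start with the same operation set shared by every layer ...
space = [list(CANDIDATE_OPS) for _ in range(4)]
# ... then progressively shrink each layer's set over search stages.
for keep in (3, 2):
    space = prune_step(space, keep)

for i, ops in enumerate(space):
    print(f"layer {i}: {ops}")
```

With real one-shot accuracy signals in place of the random scores, each layer retains only the operations that perform well there, trading off search space quality against model diversity as described above.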

Related Material


[pdf] [arXiv]
[bibtex]
@InProceedings{Xia_2022_WACV,
    author    = {Xia, Xin and Xiao, Xuefeng and Wang, Xing and Zheng, Min},
    title     = {Progressive Automatic Design of Search Space for One-Shot Neural Architecture Search},
    booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
    month     = {January},
    year      = {2022},
    pages     = {2455-2464}
}