Direct Differentiable Augmentation Search

Aoming Liu, Zehao Huang, Zhiwu Huang, Naiyan Wang; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 12219-12228

Abstract


Data augmentation has been an indispensable tool for improving the performance of deep neural networks; however, augmentation policies hardly transfer across different tasks and datasets. Consequently, a recent trend is to adopt AutoML techniques to learn a proper augmentation policy without extensive hand-crafted tuning. In this paper, we propose an efficient differentiable search algorithm called Direct Differentiable Augmentation Search (DDAS). It exploits meta-learning with a one-step gradient update and a continuous relaxation of the expected training loss for efficient search. DDAS achieves efficient augmentation search without approximations such as Gumbel-Softmax or second-order gradient approximation. To further reduce the adverse effect of improper augmentations, we organize the search space into a two-level hierarchy, in which we first decide whether to apply augmentation, and then determine the specific augmentation policy. On standard image classification benchmarks, DDAS achieves a state-of-the-art trade-off between performance and efficiency while reducing the search cost dramatically, e.g., 0.15 GPU hours for CIFAR-10. In addition, we also use DDAS to search augmentation for the object detection task and achieve performance comparable to AutoAugment while being 1000x faster. Code will be released at https://github.com/zxcvfd13502/DDAS_code.
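
To make the search idea in the abstract concrete, the sketch below gives a minimal, first-order reading of a one-step differentiable augmentation search loop. It is not the authors' released implementation: all names (AUGMENTATIONS, policy_logits, search_step, the toy model, learning rates, and data) are illustrative assumptions, the two-level hierarchy is simplified to an identity op that stands in for "do not augment", and PyTorch >= 2.0 is assumed for torch.func.functional_call.

# Minimal sketch of a one-step, continuously relaxed augmentation search
# (illustrative only; see the repository linked below for the actual DDAS code).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

torch.manual_seed(0)

# Hypothetical candidate pool; the identity op stands in for "do not augment",
# a simplification of the paper's two-level (apply / which-op) hierarchy.
AUGMENTATIONS = [
    lambda x: x,                               # identity: do not augment
    lambda x: torch.flip(x, dims=[-1]),        # horizontal flip
    lambda x: x + 0.1 * torch.randn_like(x),   # additive Gaussian noise
]

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 10))
policy_logits = torch.zeros(len(AUGMENTATIONS), requires_grad=True)

def flat_grad(loss, params):
    """Gradient of `loss` w.r.t. `params`, flattened into one vector."""
    grads = torch.autograd.grad(loss, params, retain_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

def search_step(x_tr, y_tr, x_val, y_val, w_lr=0.1, p_lr=0.5):
    params = list(model.parameters())
    probs = F.softmax(policy_logits, dim=0)

    # Continuous relaxation: the expected training loss is the probability-
    # weighted sum of per-augmentation losses, hence linear in the probabilities.
    op_losses = [F.cross_entropy(model(aug(x_tr)), y_tr) for aug in AUGMENTATIONS]
    expected_loss = sum(p * l for p, l in zip(probs, op_losses))

    # One-step gradient update of the model weights on the relaxed loss,
    # done functionally so the stored parameters stay untouched.
    g_w = torch.autograd.grad(expected_loss, params, retain_graph=True)
    updated = {name: (w - w_lr * g).detach().requires_grad_(True)
               for (name, w), g in zip(model.named_parameters(), g_w)}

    # Validation loss and its gradient at the updated weights.
    val_loss = F.cross_entropy(functional_call(model, updated, (x_val,)), y_val)
    g_val = torch.cat([g.reshape(-1) for g in
                       torch.autograd.grad(val_loss, list(updated.values()))])

    # Linearity of the relaxed loss gives d(updated weights)/d(prob_k)
    # = -w_lr * grad_w L_k, so this meta-gradient reduces to dot products of
    # first-order gradients: no Hessian and no Gumbel-Softmax sampling here.
    meta_grad = torch.stack([-w_lr * (g_val @ flat_grad(l, params))
                             for l in op_losses])

    # Chain through the softmax back to the logits and take a gradient step.
    probs.backward(meta_grad)
    with torch.no_grad():
        policy_logits -= p_lr * policy_logits.grad
        policy_logits.grad = None

# Toy data, only to make the sketch executable end to end.
x_tr, y_tr = torch.randn(16, 3, 8, 8), torch.randint(0, 10, (16,))
x_val, y_val = torch.randn(16, 3, 8, 8), torch.randint(0, 10, (16,))
search_step(x_tr, y_tr, x_val, y_val)
print(F.softmax(policy_logits, dim=0))  # updated augmentation distribution

Because the relaxed loss is linear in the augmentation probabilities, the one-step meta-gradient in this sketch needs only dot products of first-order gradients, which illustrates the kind of shortcut the abstract alludes to when it says no Gumbel-Softmax or second-order approximation is required; the repository linked in the abstract is the authoritative reference for the actual method.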

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Liu_2021_ICCV,
    author    = {Liu, Aoming and Huang, Zehao and Huang, Zhiwu and Wang, Naiyan},
    title     = {Direct Differentiable Augmentation Search},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {12219-12228}
}