Lightweight Monocular Depth With a Novel Neural Architecture Search Method

Lam Huynh, Phong Nguyen, Jiří Matas, Esa Rahtu, Janne Heikkilä; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2022, pp. 3643-3653

Abstract


This paper presents a novel neural architecture search method, called LiDNAS, for generating lightweight monocular depth estimation models. Unlike previous neural architecture search (NAS) approaches, where finding optimized networks is computationally highly demanding, the introduced Assisted Tabu Search leads to efficient architecture exploration. Moreover, we construct the search space on a pre-defined backbone network to balance layer diversity and search space size. The LiDNAS method outperforms the state-of-the-art NAS approach proposed for disparity and depth estimation in terms of both search efficiency and output model performance. The LiDNAS optimized models achieve results superior to the compact depth estimation state of the art on NYU-Depth-v2, KITTI, and ScanNet, while being 7% to 500% more compact in size, i.e., in the number of model parameters.
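The abstract frames architecture exploration as a tabu-search problem over a constrained search space built on a fixed backbone. The paper's Assisted Tabu Search includes additional guidance not detailed here, so the sketch below only illustrates plain tabu search over a toy per-layer operation space; the operation set, slot count, and the random fitness proxy are all hypothetical placeholders, not the authors' actual search space or evaluation procedure.

```python
import random
from collections import deque

# Hypothetical search space: each of NUM_SLOTS layer slots picks one operation.
OPS = ["conv3x3", "conv5x5", "dwconv3x3", "dwconv5x5", "skip"]
NUM_SLOTS = 8

def evaluate(arch):
    """Placeholder fitness. A real NAS system would train or proxy-evaluate the
    candidate depth network and trade accuracy off against parameter count;
    here a random score keeps the loop runnable."""
    return random.random()

def neighbors(arch):
    """All architectures differing from `arch` in exactly one slot."""
    result = []
    for i in range(NUM_SLOTS):
        for op in OPS:
            if op != arch[i]:
                cand = list(arch)
                cand[i] = op
                result.append(tuple(cand))
    return result

def tabu_search(iterations=50, tabu_size=20, sample_size=10):
    current = tuple(random.choice(OPS) for _ in range(NUM_SLOTS))
    best, best_score = current, evaluate(current)
    tabu = deque(maxlen=tabu_size)  # recently visited architectures are forbidden
    for _ in range(iterations):
        candidates = [a for a in neighbors(current) if a not in tabu]
        if not candidates:
            break
        # Move to the best sampled non-tabu neighbour, even if it is worse
        # than the current one, which helps escape local optima.
        sampled = random.sample(candidates, k=min(sample_size, len(candidates)))
        score, current = max((evaluate(a), a) for a in sampled)
        tabu.append(current)
        if score > best_score:
            best, best_score = current, score
    return best, best_score

if __name__ == "__main__":
    arch, score = tabu_search()
    print("best architecture:", arch, "score:", round(score, 3))
```

The tabu list is what distinguishes this from plain hill climbing: recently visited architectures cannot be revisited for `tabu_size` steps, so the search keeps moving even when no neighbor improves on the current candidate.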

Related Material


[bibtex]
@InProceedings{Huynh_2022_WACV,
    author    = {Huynh, Lam and Nguyen, Phong and Matas, Ji\v{r}{\'\i} and Rahtu, Esa and Heikkil\"a, Janne},
    title     = {Lightweight Monocular Depth With a Novel Neural Architecture Search Method},
    booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
    month     = {January},
    year      = {2022},
    pages     = {3643-3653}
}