U2RLE: Uncertainty-Guided 2-Stage Room Layout Estimation

Pooya Fayyazsanavi, Zhiqiang Wan, Will Hutchcroft, Ivaylo Boyadzhiev, Yuguang Li, Jana Kosecka, Sing Bing Kang; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2023, pp. 3562-3570

Abstract


While existing deep-learning-based room layout estimation techniques achieve good overall accuracy, they are less effective for distant floor-wall boundaries. To tackle this problem, we propose a novel uncertainty-guided approach to layout boundary estimation, introducing a new two-stage CNN architecture termed U2RLE. The first stage predicts both the floor-wall boundary and its uncertainty; boundaries with high positional uncertainty are then refined by a second stage trained with a different, distance-aware loss. Finally, the outputs of the two stages are merged to produce the room layout. Experiments on the ZInD and Structure3D datasets show that U2RLE improves over the current state of the art, handling both near and far walls better. In particular, U2RLE outperforms existing techniques on the most distant walls.
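As a rough illustration of the two-stage pipeline described above, the following PyTorch-style sketch shows how an uncertainty-guided estimator of this kind might be wired together. The column-wise input format, the head designs, the uncertainty threshold, and all layer choices are illustrative assumptions; they are not the paper's architecture, and the distance-aware loss used for the second stage is not reproduced here.

import torch
import torch.nn as nn

class TwoStageLayoutSketch(nn.Module):
    """Minimal sketch of an uncertainty-guided two-stage boundary estimator.

    Stage 1 predicts a per-column floor-wall boundary position and its
    positional uncertainty. Columns whose uncertainty exceeds a threshold
    are replaced by the Stage 2 prediction, and the two are merged into the
    final layout boundary. All shapes and hyperparameters are assumptions.
    """

    def __init__(self, feat_dim: int = 256, uncertainty_threshold: float = 0.1):
        super().__init__()
        self.uncertainty_threshold = uncertainty_threshold
        # Shared column-wise feature extractor (placeholder 1D conv stack).
        self.encoder = nn.Sequential(
            nn.Conv1d(3, feat_dim, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(feat_dim, feat_dim, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Stage 1 head: boundary position and log-variance per image column.
        self.stage1_head = nn.Conv1d(feat_dim, 2, kernel_size=1)
        # Stage 2 head: refined boundary for high-uncertainty columns
        # (trained with a distance-aware loss in the paper; omitted here).
        self.stage2_head = nn.Conv1d(feat_dim, 1, kernel_size=1)

    def forward(self, panorama_columns: torch.Tensor):
        # panorama_columns: (B, 3, W) column-pooled panorama features (assumed input form).
        feats = self.encoder(panorama_columns)

        # Stage 1: boundary estimate and predictive uncertainty per column.
        stage1_out = self.stage1_head(feats)          # (B, 2, W)
        boundary1 = stage1_out[:, 0]                  # (B, W) boundary position
        uncertainty = torch.exp(stage1_out[:, 1])     # positive variance-like score

        # Stage 2: refine, then keep the refinement only where Stage 1 is uncertain,
        # which in practice tends to correspond to distant walls.
        boundary2 = self.stage2_head(feats)[:, 0]     # (B, W)
        use_refined = uncertainty > self.uncertainty_threshold
        merged = torch.where(use_refined, boundary2, boundary1)
        return merged, boundary1, boundary2, uncertainty

Gating the refinement on the predicted uncertainty keeps the second stage focused on the columns where the first stage is least reliable, which is the motivation the abstract gives for the two-stage design.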

Related Material


[bibtex]
@InProceedings{Fayyazsanavi_2023_CVPR,
    author    = {Fayyazsanavi, Pooya and Wan, Zhiqiang and Hutchcroft, Will and Boyadzhiev, Ivaylo and Li, Yuguang and Kosecka, Jana and Kang, Sing Bing},
    title     = {U2RLE: Uncertainty-Guided 2-Stage Room Layout Estimation},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2023},
    pages     = {3562-3570}
}