MuGE: Multiple Granularity Edge Detection
Abstract
Edge segmentation is well known to be subjective due to personalized annotation styles and preferred granularity. However, most existing deterministic edge detection methods produce only a single edge map for one input image. We argue that, given the subjectivity and ambiguity of edges, generating multiple edge maps is more reasonable than generating a single one. Thus motivated, in this paper we propose multiple granularity edge detection, called MuGE, which can produce a wide range of edge maps, from approximate object contours to fine texture edges. Specifically, we first design an edge granularity network to estimate the edge granularity from an individual edge annotation. Subsequently, to guide the generation of diversified edge maps, we integrate this edge granularity into the multi-scale feature maps in the spatial domain. Meanwhile, we decompose the feature maps into low-frequency and high-frequency parts, and further fuse the encoded edge granularity into the high-frequency part to achieve more precise control over the details of the produced edge maps. Compared to previous methods, MuGE not only generates multiple edge maps at different controllable granularities but also achieves competitive performance on the BSDS500 and Multicue benchmark datasets.
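To make the fusion idea concrete, below is a minimal PyTorch-style sketch of how a scalar granularity code could modulate the high-frequency part of a multi-scale feature map. All names, shapes, and the average-pooling frequency split (GranularityFusion, embed_dim, the sigmoid gate) are illustrative assumptions for this sketch, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GranularityFusion(nn.Module):
    # Illustrative sketch: split features into a low-frequency (local average) and a
    # high-frequency (residual detail) part, then scale the detail part with a gate
    # predicted from the scalar granularity code.
    def __init__(self, channels: int, embed_dim: int = 64):
        super().__init__()
        self.embed = nn.Sequential(
            nn.Linear(1, embed_dim), nn.ReLU(inplace=True),
            nn.Linear(embed_dim, channels),
        )
        self.out = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, feat: torch.Tensor, granularity: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, H, W); granularity: (B, 1) with values in [0, 1]
        low = F.avg_pool2d(feat, kernel_size=3, stride=1, padding=1)   # smooth part
        high = feat - low                                              # detail part
        gate = torch.sigmoid(self.embed(granularity))                  # (B, C)
        high = high * gate.unsqueeze(-1).unsqueeze(-1)                 # keep more or fewer details
        return self.out(low + high)

# Sweeping the granularity code from 0 to 1 would then steer a decoder built on such
# fused features from coarse object contours toward fine texture edges.
fusion = GranularityFusion(channels=32)
feats = torch.randn(2, 32, 64, 64)
for g in (0.1, 0.9):
    print(g, fusion(feats, torch.full((2, 1), float(g))).shape)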
Related Material
[pdf]
[supp]
[bibtex]
@InProceedings{Zhou_2024_CVPR,
    author    = {Zhou, Caixia and Huang, Yaping and Pu, Mengyang and Guan, Qingji and Deng, Ruoxi and Ling, Haibin},
    title     = {MuGE: Multiple Granularity Edge Detection},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {25952-25962}
}