The Translucent Patch: A Physical and Universal Attack on Object Detectors

Alon Zolfi, Moshe Kravchik, Yuval Elovici, Asaf Shabtai; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2021, pp. 15232-15241

Abstract

Physical adversarial attacks against object detectors have seen increasing success in recent years. However, these attacks require direct access to the object of interest in order to apply a physical patch, and to hide multiple objects, an adversarial patch must be applied to each one. In this paper, we propose a contactless, translucent physical patch containing a carefully constructed pattern, which is placed on the camera's lens and fools state-of-the-art object detectors. The primary goal of our patch is to hide all instances of a selected target class. In addition, the optimization method used to construct the patch aims to ensure that the detection of the other (untargeted) classes remains unharmed. Accordingly, in our experiments, conducted on state-of-the-art object detection models used in autonomous driving, we study the effect of the patch on the detection of both the selected target class and the other classes. We show that our patch prevented the detection of 42.27% of all stop sign instances while maintaining high detection rates (nearly 80%) for the other classes.
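The abstract describes a dual-objective optimization: suppress the detector's confidence for the target class while keeping the confidences of the untargeted classes close to their values on clean images. The following is a minimal PyTorch sketch of that idea, under loose assumptions: the alpha-blending model of the translucent patch, the loss weighting lam, and the stand-in detector head are all hypothetical illustrations, not the authors' implementation.

    # Hypothetical sketch of the dual-objective patch optimization: alpha-blend
    # a learnable "lens" pattern over each image, then minimize the detector's
    # confidence for the target class while penalizing drift in the other
    # classes' confidences relative to the clean images.
    import torch
    import torch.nn.functional as F

    def apply_translucent_patch(images, patch_logits, alpha=0.4):
        # Map the unconstrained parameters to [0, 1] and resize to the frame.
        patch = torch.sigmoid(patch_logits)
        patch = F.interpolate(patch, size=images.shape[-2:], mode="bilinear",
                              align_corners=False)
        # Simple alpha compositing as a stand-in for the lens's optical effect.
        return (1 - alpha) * images + alpha * patch

    def dual_objective_loss(target_conf, other_conf, clean_other_conf, lam=1.0):
        # Push target-class confidences down; keep the other classes close to
        # their clean-image confidences.
        return target_conf.mean() + lam * F.mse_loss(other_conf, clean_other_conf)

    # --- toy usage with a stand-in detector head (10 per-class confidences) ---
    detector = torch.nn.Sequential(torch.nn.Flatten(),
                                   torch.nn.Linear(3 * 64 * 64, 10),
                                   torch.nn.Sigmoid())
    for p in detector.parameters():
        p.requires_grad_(False)  # only the patch is optimized

    images = torch.rand(8, 3, 64, 64)
    target_class = 3
    patch_logits = torch.zeros(1, 3, 16, 16, requires_grad=True)
    opt = torch.optim.Adam([patch_logits], lr=0.05)

    with torch.no_grad():
        clean = detector(images)  # reference confidences on clean images

    for step in range(100):
        scores = detector(apply_translucent_patch(images, patch_logits))
        mask = torch.arange(10) != target_class
        loss = dual_objective_loss(scores[:, target_class],
                                   scores[:, mask], clean[:, mask])
        opt.zero_grad()
        loss.backward()
        opt.step()

The second loss term is what distinguishes this attack from an indiscriminate denial of service: it explicitly regularizes the patch so that untargeted detections remain intact, matching the abstract's stated goal.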

Related Material

[pdf] [arXiv]
@InProceedings{Zolfi_2021_CVPR,
    author    = {Zolfi, Alon and Kravchik, Moshe and Elovici, Yuval and Shabtai, Asaf},
    title     = {The Translucent Patch: A Physical and Universal Attack on Object Detectors},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2021},
    pages     = {15232-15241}
}