Exploring Adversarial Robustness of Vision Transformers in the Spectral Perspective

Gihyun Kim, Juyeop Kim, Jong-Seok Lee; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2024, pp. 3976-3985

Abstract


The Vision Transformer has emerged as a powerful tool for image classification tasks, surpassing the performance of convolutional neural networks (CNNs). Recently, many researchers have attempted to understand the robustness of Transformers against adversarial attacks. However, previous studies have focused solely on perturbations in the spatial domain. This paper proposes an additional perspective that explores the adversarial robustness of Transformers against frequency-selective perturbations in the spectral domain. To facilitate comparison between these two domains, an attack framework is formulated as a flexible tool for implementing attacks on images in both the spatial and spectral domains. The experiments reveal that Transformers rely more on phase and low-frequency information, which can render them more vulnerable to frequency-selective attacks than CNNs. This work offers new insights into the properties and adversarial robustness of Transformers.
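To make the idea of a frequency-selective perturbation concrete, the sketch below perturbs only the low-frequency band of an image's 2D spectrum and transforms back to the spatial domain. This is a hypothetical illustration using NumPy's FFT, not the paper's attack framework (which optimizes perturbations adversarially against a target model); the function name and parameters are assumptions for the example.

```python
import numpy as np

def spectral_perturb(image, eps=0.05, low_freq_ratio=0.25, rng=None):
    """Randomly perturb the low-frequency coefficients of a grayscale
    image (values in [0, 1]) and return the spatial-domain result.

    Illustrative only: an actual adversarial attack would optimize the
    perturbation against a classifier's loss rather than sample noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = image.shape

    # Shift the spectrum so low frequencies sit at the center.
    spectrum = np.fft.fftshift(np.fft.fft2(image))

    # Boolean mask selecting a centered low-frequency square.
    mask = np.zeros((h, w), dtype=bool)
    ch, cw = h // 2, w // 2
    rh = max(1, int(h * low_freq_ratio / 2))
    rw = max(1, int(w * low_freq_ratio / 2))
    mask[ch - rh:ch + rh, cw - rw:cw + rw] = True

    # Add complex noise only where the mask is True, scaled to the
    # spectrum's average magnitude so eps is roughly scale-invariant.
    noise = rng.standard_normal((h, w)) + 1j * rng.standard_normal((h, w))
    spectrum = spectrum + eps * np.abs(spectrum).mean() * noise * mask

    # Invert the shift and the FFT; discard the tiny imaginary residue.
    perturbed = np.fft.ifft2(np.fft.ifftshift(spectrum)).real
    return np.clip(perturbed, 0.0, 1.0)
```

Restricting the mask to the spectrum's center confines the change to low frequencies; a phase-only variant would instead modify `np.angle(spectrum)` while keeping magnitudes fixed.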

Related Material


@InProceedings{Kim_2024_WACV,
  author    = {Kim, Gihyun and Kim, Juyeop and Lee, Jong-Seok},
  title     = {Exploring Adversarial Robustness of Vision Transformers in the Spectral Perspective},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  month     = {January},
  year      = {2024},
  pages     = {3976-3985}
}