1-Lipschitz Layers Compared: Memory, Speed, and Certifiable Robustness

Bernd Prach, Fabio Brau, Giorgio Buttazzo, Christoph H. Lampert; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 24574-24583

Abstract


The robustness of neural networks against input perturbations with bounded magnitude represents a serious concern in the deployment of deep learning models in safety-critical systems. Recently, the scientific community has focused on enhancing certifiable robustness guarantees by crafting 1-Lipschitz neural networks that leverage Lipschitz-bounded dense and convolutional layers. Different methods have been proposed in the literature to achieve this goal; however, comparing the performance of such methods is not straightforward, since different metrics (e.g., training time, memory usage, accuracy, certifiable robustness) can be relevant for different applications. This work therefore provides a thorough comparison between different methods, covering theoretical aspects such as computational complexity and memory requirements, as well as empirical measurements of time per epoch, required memory, accuracy, and certifiable robust accuracy. The paper also provides guidelines and recommendations to help the user select the methods that work best depending on the available resources. We provide code at github.com/berndprach/1LipschitzLayersCompared
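As a concrete illustration of the Lipschitz-bounded layers the paper compares (a generic sketch, not code from the paper's repository): a dense layer x ↦ Wx is 1-Lipschitz in the ℓ₂ norm exactly when the spectral norm of W is at most 1, and one common way to enforce this is to estimate the spectral norm via power iteration and rescale the weights:

```python
import numpy as np

def spectral_norm(W, n_iters=100):
    """Estimate the largest singular value of W, i.e. the l2 Lipschitz
    constant of the linear map x -> W @ x, via power iteration."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(W.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    return float(u @ W @ v)

# Dividing the weights by their spectral norm yields a 1-Lipschitz layer.
W = np.array([[3.0, 0.0],
              [0.0, 1.0]])
W_1lip = W / spectral_norm(W)
```

This spectral-normalization idea underlies several of the compared methods; the convolutional case requires operator-norm estimates adapted to the convolution structure.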

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{Prach_2024_CVPR, author = {Prach, Bernd and Brau, Fabio and Buttazzo, Giorgio and Lampert, Christoph H.}, title = {1-Lipschitz Layers Compared: Memory, Speed, and Certifiable Robustness}, booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}, month = {June}, year = {2024}, pages = {24574-24583} }