Fast-NTK: Parameter-Efficient Unlearning for Large-Scale Models

Guihong Li, Hsiang Hsu, Chun-Fu Chen, Radu Marculescu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2024, pp. 227-234

Abstract


The rapid growth of machine learning has spurred legislative initiatives such as "the Right to be Forgotten," allowing users to request the removal of their data. In response, machine unlearning proposes the selective removal of unwanted data without the need for retraining from scratch. While the Neural-Tangent-Kernel (NTK) based unlearning method excels in performance, it suffers from significant computational complexity, especially for large-scale models and datasets. To improve this situation, our work introduces "Fast-NTK," a novel NTK-based unlearning algorithm that significantly reduces the computational complexity by incorporating parameter-efficient fine-tuning methods, such as fine-tuning batch normalization layers in a CNN or visual prompts in a vision transformer. Our experimental results demonstrate scalability to much larger neural networks and datasets (e.g., 88M parameters and 5k images), surpassing the limitations of previous full-model NTK-based approaches designed for smaller cases (e.g., 8M parameters and 500 images). Notably, our approach maintains performance comparable to the traditional method of retraining on the retain set alone. Fast-NTK can thus enable practical and scalable NTK-based unlearning in deep neural networks.
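
The key idea of restricting the NTK computation to a small trainable parameter subset can be illustrated with a short sketch. This is not the authors' code: the toy CNN, batch size, and kernel construction below are illustrative assumptions. Only the batch-normalization affine parameters are left trainable, and the empirical NTK is formed from per-sample gradients with respect to that small subset, keeping the kernel matrix tractable.

# Minimal sketch (illustrative, not the paper's implementation): empirical NTK
# over the BatchNorm affine parameters of a toy CNN.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10),
)
model.eval()  # use running BN statistics for stable per-sample gradients

# Freeze everything except the BatchNorm affine parameters (weight and bias).
bn_params = []
for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        bn_params += [m.weight, m.bias]
for p in model.parameters():
    p.requires_grad = False
for p in bn_params:
    p.requires_grad = True

def per_sample_grad(x):
    """Flattened gradient of the summed logits w.r.t. the BN parameters."""
    out = model(x.unsqueeze(0)).sum()
    grads = torch.autograd.grad(out, bn_params)
    return torch.cat([g.reshape(-1) for g in grads])

# Empirical NTK over a small batch: K[i, j] = <grad_i, grad_j>.
x = torch.randn(8, 3, 32, 32)           # hypothetical batch of 8 images
J = torch.stack([per_sample_grad(xi) for xi in x])  # (8, n_bn_params)
K = J @ J.T                              # (8, 8) kernel matrix
print(K.shape, sum(p.numel() for p in bn_params), "trainable params")

Because the Jacobian here spans only the BN parameters (32 values in this toy model) rather than the full parameter vector, the NTK matrix can be formed and inverted at scales where a full-model kernel would be prohibitive.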

Related Material


[bibtex]
@InProceedings{Li_2024_CVPR,
    author    = {Li, Guihong and Hsu, Hsiang and Chen, Chun-Fu and Marculescu, Radu},
    title     = {Fast-NTK: Parameter-Efficient Unlearning for Large-Scale Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2024},
    pages     = {227-234}
}