@InProceedings{Sayyed_2026_WACV,
  author    = {Sayyed, A.Q.M. Sazzad and Bastian, Nathaniel D. and De Lucia, Michael and Swami, Ananthram and Restuccia, Francesco},
  title     = {CLUE: Bringing Machine Unlearning to Mobile Devices},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  month     = {March},
  year      = {2026},
  pages     = {3750-3759}
}
CLUE: Bringing Machine Unlearning to Mobile Devices
Abstract
Class-level machine unlearning has been proposed to address security and privacy issues of deep neural networks (DNNs). However, existing approaches either exhibit low performance or have excessive computation/storage requirements. This makes them inapplicable in mobile computing scenarios, where computation and memory are severely constrained yet unlearning must be performed frequently and effectively. This limitation mainly stems from the use of a retain dataset, i.e., a sub-dataset containing the knowledge that the DNN should maintain after the unlearning. In this paper, we propose CLUE, an unlearning algorithm that does not require a retain dataset. Our key idea is to treat inputs coming from the forget class as out-of-distribution data and to use knowledge distillation to impose this constraint on the updated DNN. We have experimentally evaluated CLUE on ResNet-20, ViT-Base, and ViT-Large DNNs trained on the CIFAR10, CIFAR100, and VGGFace2 datasets. We have also implemented CLUE on a Raspberry Pi and compared the power consumption and latency of CLUE with respect to several existing baselines. We show that CLUE reduces power consumption by 68% and latency by 90% while improving the unlearning performance by up to 4.74%.
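The key idea above — distilling from the original model while treating forget-class inputs as out-of-distribution — can be sketched as a target-construction step plus a distillation loss. This is a minimal, hypothetical NumPy illustration of that idea, not the paper's actual implementation: the function names (`unlearning_targets`, `kd_loss`), the temperature value, and the choice of a uniform distribution as the out-of-distribution target are all assumptions made for the sketch.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-softened softmax, numerically stabilized.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def unlearning_targets(teacher_logits, is_forget, T=2.0):
    """Build distillation targets from the frozen original (teacher) model.

    Retain-class inputs keep the teacher's softened distribution, so no
    retain dataset is needed to preserve that knowledge. Forget-class
    inputs get a uniform target, modeling them as out-of-distribution
    (one plausible choice; the paper may use a different OOD target).
    """
    targets = softmax(teacher_logits, T)
    n_classes = teacher_logits.shape[-1]
    targets[is_forget] = 1.0 / n_classes  # maximally uncertain target
    return targets

def kd_loss(student_logits, targets, T=2.0):
    """Cross-entropy between the student's softened predictions and the targets."""
    log_p = np.log(softmax(student_logits, T) + 1e-12)
    return -np.mean((targets * log_p).sum(axis=-1))
```

In use, the student (the model being updated) would be trained to minimize `kd_loss` over a mix of forget-class and other inputs, pushing its forget-class outputs toward uncertainty while matching the teacher elsewhere.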
