@InProceedings{Ta_2025_CVPR,
  author    = {Ta, Huu Binh and Nguyen, Duc and Tran, Quyen and Tran, Toan and Pham, Tung},
  title     = {Low-Rank Adaptation in Multilinear Operator Networks for Security-Preserving Incremental Learning},
  booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
  month     = {June},
  year      = {2025},
  pages     = {24341-24350}
}
Low-Rank Adaptation in Multilinear Operator Networks for Security-Preserving Incremental Learning
Abstract
In security-sensitive fields, data should be encrypted to protect against unauthorized access and to maintain confidentiality throughout processing. However, traditional networks such as ViTs and CNNs return different results when processing original data versus its encrypted form, so they require the data to be decrypted first, which poses a security risk by exposing sensitive information. One solution is to use polynomial networks, including the state-of-the-art Multilinear Operator Networks, which return the same outputs on real data and on its encrypted form under Leveled Fully Homomorphic Encryption. Nevertheless, these models are susceptible to catastrophic forgetting in incremental learning settings. This paper therefore presents a new low-rank adaptation method combined with the Gradient Projection Memory mechanism to mitigate this issue. Our proposal is compatible with Leveled Fully Homomorphic Encryption while achieving a marked improvement in performance over existing models.
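The two ingredients named in the abstract can be illustrated with a minimal sketch. This is not the paper's method, only a toy NumPy illustration of the general ideas: a low-rank update `A @ B` added to a frozen weight `W` (as in LoRA-style adaptation), and a Gradient Projection Memory-style step that projects a gradient onto the subspace orthogonal to an (assumed, hypothetical) basis `M` of directions important to past tasks. All variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2
W = rng.standard_normal((d, d))   # frozen pretrained weight
A = np.zeros((d, r))              # low-rank factor A (initialized to zero,
B = rng.standard_normal((r, d))   #  so the initial update A @ B is zero)

def effective_weight(W, A, B):
    # Forward pass uses W + A @ B; only A and B would be trained.
    return W + A @ B

def project_gradient(g, M):
    # GPM-style projection: remove the component of g lying in the
    # subspace spanned by the orthonormal columns of M, so the update
    # does not interfere with directions important to past tasks.
    return g - M @ (M.T @ g)

# Toy memory basis: orthonormal columns from a QR decomposition.
M = np.linalg.qr(rng.standard_normal((d, 3)))[0]
g = rng.standard_normal((d, 1))
g_proj = project_gradient(g, M)

# The projected gradient has no component along the memory directions.
print(np.allclose(M.T @ g_proj, 0))  # True
```

In an incremental-learning setting, the idea would be to apply such a projection to the gradients of the low-rank factors at each step, so that new tasks are learned in the residual subspace while the frozen base weights remain untouched.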