@InProceedings{Yun_2025_ICCV,
  author    = {Yun, Junhyeog and Hong, Minui and Kim, Gunhee},
  title     = {FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2025},
  pages     = {2161-2171}
}
FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields
Abstract
Neural fields provide a memory-efficient representation of data that can effectively handle diverse modalities and large-scale signals. However, learning to fit neural fields often requires large amounts of training data and computation, which are often infeasible on resource-constrained edge devices. One approach to tackle this limitation is Federated Meta-Learning (FML), but traditional FML approaches suffer from privacy leakage. To address these issues, we introduce a novel FML approach called FedMeNF. FedMeNF utilizes a new privacy-preserving loss function that regulates privacy leakage in the local meta-optimization. This enables the local meta-learner to optimize quickly and efficiently without retaining the client's private data. Our experiments demonstrate that FedMeNF achieves fast optimization and robust reconstruction performance, even with few-shot or non-IID data across diverse data modalities, while preserving client data privacy.
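The workflow the abstract describes — a meta-learned initialization that lets a client fit a neural field to its local signal in a few regularized inner-loop steps — can be sketched as follows. This is a minimal illustrative toy, not the paper's method: the random-Fourier-feature field, the Reptile-style meta-update, and the proximal term `lam * ||w - w0||^2` (a stand-in for FedMeNF's privacy-preserving loss, whose actual form is not given in the abstract) are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x, B):
    # Random Fourier features: a simple linear-in-parameters "neural field"
    # mapping 1D coordinates to a sinusoidal basis (illustrative stand-in).
    proj = x[:, None] * B[None, :]
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=1)

def inner_adapt(w0, X, y, lam=0.1, lr=0.05, steps=10):
    """Client-side inner loop: adapt the meta-init w0 to one local signal.

    The proximal term lam * ||w - w0||^2 is a hypothetical regularizer
    standing in for the paper's privacy-preserving loss; it keeps the
    adapted parameters close to the shared initialization.
    """
    w = w0.copy()
    for _ in range(steps):
        pred = X @ w
        grad = 2 * X.T @ (pred - y) / len(y) + 2 * lam * (w - w0)
        w -= lr * grad
    return w

# Toy federated setup: each "client" holds samples of a phase-shifted sine.
B = rng.normal(scale=3.0, size=8)
coords = np.linspace(-1.0, 1.0, 32)
X = features(coords, B)
w0 = np.zeros(X.shape[1])

# Server meta-update: Reptile-style averaging of client adaptation deltas
# (an assumption; the actual FedMeNF aggregation rule may differ).
for _ in range(20):
    deltas = []
    for phase in (0.0, 0.5, 1.0):
        y = np.sin(2 * np.pi * coords + phase)
        deltas.append(inner_adapt(w0, X, y) - w0)
    w0 += 0.5 * np.mean(deltas, axis=0)

# After meta-training, a few inner steps suffice to fit an unseen signal.
y_new = np.sin(2 * np.pi * coords + 0.25)
w_fast = inner_adapt(w0, X, y_new, steps=5)
mse = np.mean((X @ w_fast - y_new) ** 2)
```

Only parameter deltas (never the raw coordinate/value pairs) leave the inner loop here, which mirrors the federated setting in spirit; the privacy guarantee itself comes from the paper's loss, not from this proximal sketch.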
