FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields
Neural fields provide a memory-efficient representation of data that can effectively handle diverse modalities and large-scale data. However, learning neural fields typically requires large amounts of training data and computation, which can be prohibitive on resource-constrained edge devices. One approach to tackling this limitation is Federated Meta-Learning (FML), but traditional FML approaches suffer from privacy leakage. To address these issues, we introduce a novel FML approach called FedMeNF. FedMeNF utilizes a new privacy-preserving loss function that regulates privacy leakage in local meta-optimization. This enables the local meta-learner to optimize quickly and efficiently without retaining the client's private data. Our experiments demonstrate that FedMeNF achieves fast optimization and robust reconstruction performance, even with few-shot or non-IID data across diverse data modalities, while preserving client data privacy.
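The abstract describes the generic federated meta-learning setup: each client runs a few local adaptation steps, and the server aggregates the results into a shared meta-initialization. The sketch below illustrates that loop in miniature with a Reptile-style update on toy 1-D regression tasks standing in for per-client neural fields. It is a generic illustration only; the clients, tasks, and step sizes are invented for exposition, and it does not include FedMeNF's privacy-preserving loss, which is not specified in this abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical clients: each holds a tiny 1-D regression "field" y = w_c * x,
# standing in for a per-client neural field fitting task.
client_w = [1.0, 2.0, 3.0]

def local_adapt(theta, w, steps=10, lr=0.1):
    """Inner loop: a few local SGD steps on one client's private data."""
    x = rng.uniform(-1, 1, size=32)
    y = w * x
    for _ in range(steps):
        grad = np.mean(2 * (theta * x - y) * x)  # d/dtheta of MSE
        theta -= lr * grad
    return theta

theta = 0.0  # shared meta-initialization
for _ in range(50):  # federated communication rounds
    # Each client adapts locally; only adapted parameters leave the client.
    updates = [local_adapt(theta, w) for w in client_w]
    # Reptile-style server step: move the meta-init toward the adapted params.
    theta += 0.5 * (np.mean(updates) - theta)
```

With these toy tasks, the meta-initialization converges toward a point from which every client can adapt in a few steps (here, near the mean of the client targets). In practice the raw adapted parameters can still leak information about client data, which is the gap the paper's privacy-preserving loss is designed to close.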
Junhyeog Yun, Minui Hong, Gunhee Kim
Computing Technology, Computer Science
Junhyeog Yun, Minui Hong, Gunhee Kim. FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields [EB/OL]. (2025-08-08) [2025-08-24]. https://arxiv.org/abs/2508.06301.