On Using Secure Aggregation in Differentially Private Federated Learning with Multiple Local Steps
Federated learning is a distributed learning setting whose main aim is to train machine learning models without sharing raw data, exchanging only what is required for learning. To guarantee training data privacy and high-utility models, differential privacy and secure aggregation techniques are often combined with federated learning. However, with fine-grained protection granularities, such as the common sample-level protection, existing techniques generally require the parties to communicate after every local optimization step if they want to fully benefit from secure aggregation in terms of the resulting formal privacy guarantees. In this paper, we show how a simple new analysis allows the parties to perform multiple local optimization steps while still benefiting from secure aggregation. We show that our analysis enables higher-utility models with guaranteed privacy protection under a limited number of communication rounds.
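To make the setting concrete, below is a minimal sketch (not the paper's algorithm) of one round of differentially private federated averaging in which each client takes several local DP-SGD steps with gradient clipping and local Gaussian noise, and the server only ever sees the sum of the noisy updates, with a plain sum standing in for the secure aggregation protocol. All names and parameter values (`local_steps`, `clip_norm`, `noise_std`, the toy least-squares objective) are illustrative assumptions, not from the paper.

```python
# Hypothetical sketch of DP federated averaging with multiple local steps.
# A plain sum stands in for secure aggregation; all hyperparameters are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def local_update(model, data, local_steps, lr, clip_norm, noise_std):
    """Run several local DP-SGD steps; return the noisy model delta."""
    w = model.copy()
    x, y = data
    for _ in range(local_steps):
        grad = 2 * x.T @ (x @ w - y) / len(y)          # least-squares gradient
        grad *= min(1.0, clip_norm / np.linalg.norm(grad))  # clip: bound sensitivity
        grad += rng.normal(0.0, noise_std, grad.shape)      # local Gaussian noise
        w -= lr * grad
    return w - model

def secure_sum(updates):
    """Placeholder for secure aggregation: the server sees only the sum."""
    return np.sum(updates, axis=0)

# Toy federated round over a few clients.
d, n_clients = 5, 4
model = np.zeros(d)
clients = [(rng.normal(size=(20, d)), rng.normal(size=20))
           for _ in range(n_clients)]

deltas = [local_update(model, c, local_steps=3, lr=0.1,
                       clip_norm=1.0, noise_std=0.5) for c in clients]
model += secure_sum(deltas) / n_clients
```

With a single local step, the server-side sum directly corresponds to one noisy gradient aggregate; the point of the paper's analysis is to retain formal privacy benefits from aggregation even when `local_steps > 1`, reducing the number of communication rounds.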
Mikko A. Heikkilä
Subject: Computing technology, computer technology
Mikko A. Heikkilä. On Using Secure Aggregation in Differentially Private Federated Learning with Multiple Local Steps [EB/OL]. (2024-07-27) [2025-04-26]. https://arxiv.org/abs/2407.19286