TY - JOUR
T1 - Compressed Private Aggregation for Scalable and Robust Federated Learning over Massive Networks
AU - Lang, Natalie
AU - Shlezinger, Nir
AU - D'Oliveira, Rafael G.L.
AU - El Rouayheb, Salim
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025/1/1
Y1 - 2025/1/1
AB - Federated learning (FL) is an emerging paradigm that allows a central server to train machine learning models using remote users' data. Despite its growing popularity, FL faces several challenges: preserving the privacy of local datasets, sensitivity to poisoning attacks by malicious users, and communication overhead, especially in large-scale networks. These limitations are often individually mitigated by local differential privacy (LDP) mechanisms, robust aggregation, compression, and user selection techniques, which typically come at the cost of accuracy. In this work, we present compressed private aggregation (CPA), which allows massive deployments to communicate at extremely low bit rates while simultaneously achieving privacy, anonymity, and resilience to malicious users. CPA randomizes a codebook that compresses the data into a few bits using nested lattice quantizers, thereby ensuring anonymity and robustness, and applies a subsequent perturbation to satisfy LDP. CPA-aided FL is proven to converge at the same asymptotic rate as FL without privacy, compression, or robustness considerations, while satisfying both anonymity and LDP requirements. These analytical properties are empirically confirmed in a numerical study, where we demonstrate the performance gains of CPA compared with separate mechanisms for compression and privacy, as well as its ability to mitigate the harmful effects of malicious users.
KW - Federated learning
KW - anonymity
KW - compression
KW - local differential privacy
UR - http://www.scopus.com/inward/record.url?scp=105004375094&partnerID=8YFLogxK
U2 - 10.1109/TMC.2025.3564390
DO - 10.1109/TMC.2025.3564390
M3 - Article
AN - SCOPUS:105004375094
SN - 1536-1233
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
ER -