Compressed Private Aggregation for Scalable and Robust Federated Learning over Massive Networks

Natalie Lang, Nir Shlezinger, Rafael G.L. D'Oliveira, Salim El Rouayheb

Research output: Contribution to journal › Article › peer-review

Abstract

Federated learning (FL) is an emerging paradigm that allows a central server to train machine learning models using remote users' data. Despite its growing popularity, FL faces challenges in preserving the privacy of local datasets, in its sensitivity to poisoning attacks by malicious users, and in its communication overhead, especially in large-scale networks. These limitations are often individually mitigated by local differential privacy (LDP) mechanisms, robust aggregation, compression, and user selection techniques, which typically come at the cost of accuracy. In this work, we present compressed private aggregation (CPA), allowing massive deployments to simultaneously communicate at extremely low bit rates while achieving privacy, anonymity, and resilience to malicious users. CPA randomizes a codebook for compressing the data into a few bits using nested lattice quantizers, ensuring anonymity and robustness, followed by a perturbation step that guarantees LDP. CPA-aided FL is proven to converge at the same asymptotic rate as FL without privacy, compression, and robustness considerations, while satisfying both anonymity and LDP requirements. These analytical properties are empirically confirmed in a numerical study, where we demonstrate the performance gains of CPA compared with separate mechanisms for compression and privacy, as well as its robustness in mitigating the harmful effects of malicious users.
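To make the pipeline described in the abstract concrete, below is a minimal illustrative sketch (not the paper's exact CPA scheme) of the two ingredients it names: a randomized codebook realized here as a dithered scalar quantizer (a simple one-dimensional stand-in for the nested lattice quantizers used in CPA), followed by a randomized-response perturbation of the quantized indices to provide epsilon-LDP. The function name `cpa_sketch` and all parameters are hypothetical, chosen only for this example.

```python
import numpy as np

def cpa_sketch(x, num_bits=2, dynamic_range=1.0, epsilon=1.0, rng=None):
    """Illustrative sketch of compress-then-perturb aggregation.

    Steps (a simplified stand-in for CPA, not the paper's scheme):
      1. Quantize each coordinate of x in [-dynamic_range, dynamic_range]
         with a randomly dithered uniform quantizer ("randomized codebook").
      2. Apply k-ary randomized response to the indices for epsilon-LDP.
      3. Decode on the server side using the shared dither.
    """
    rng = np.random.default_rng() if rng is None else rng
    levels = 2 ** num_bits                    # codebook size k
    step = 2 * dynamic_range / levels         # quantizer resolution

    # Step 1: shared random dither shifts the quantization lattice.
    dither = rng.uniform(-step / 2, step / 2, size=x.shape)
    shifted = x + dynamic_range + dither      # map into [0, 2R] (approx.)
    idx = np.clip(np.round(shifted / step), 0, levels - 1).astype(int)

    # Step 2: k-ary randomized response over the index alphabet.
    # Keep the true index w.p. e^eps / (e^eps + k - 1); otherwise
    # replace it with a uniformly chosen *different* index.
    p_keep = np.exp(epsilon) / (np.exp(epsilon) + levels - 1)
    keep = rng.random(x.shape) < p_keep
    other = (idx + rng.integers(1, levels, size=x.shape)) % levels
    private_idx = np.where(keep, idx, other)

    # Step 3: server decodes with the shared dither subtracted.
    decoded = private_idx * step - dynamic_range - dither
    return private_idx, decoded
```

Each user thus transmits only `num_bits` bits per coordinate, and the perturbation in step 2 is what makes the report locally differentially private; averaging many such decoded reports at the server is the aggregation step. The anonymity and robustness properties of the actual nested-lattice construction are not captured by this one-dimensional sketch.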

Original language: English
Journal: IEEE Transactions on Mobile Computing
State: Accepted/In press - 1 Jan 2025

Keywords

  • Federated learning
  • anonymity
  • compression
  • local differential privacy

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
  • Electrical and Electronic Engineering
