CPA: Compressed Private Aggregation for Scalable Federated Learning Over Massive Networks

Natalie Lang, Elad Sofer, Nir Shlezinger, Rafael G.L. D'Oliveira, Salim El Rouayheb

Research output: Contribution to journal › Conference article › peer-review



Federated learning (FL) allows a central server to train a model using remote users' data. FL faces challenges in preserving the privacy of local datasets and in its communication overhead, which becomes dominant in large-scale networks. These limitations are often mitigated individually by local differential privacy (LDP) mechanisms, compression, and user-selection techniques, which typically come at the cost of accuracy. In this work we present compressed private aggregation (CPA), which allows massive deployments to simultaneously communicate at extremely low bit rates while achieving privacy, anonymity, and resilience to malicious users. CPA randomizes a codebook for compressing the data into a few bits, ensuring anonymity and robustness, with a subsequent perturbation to satisfy LDP. We provide both a theoretical analysis and a numerical study, demonstrating the performance gains of CPA compared with separate mechanisms for compression and privacy.
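The abstract describes CPA's two-stage mechanism: compress each value to a few bits via a (randomized) codebook, then perturb the compressed report so it satisfies LDP. A minimal illustrative sketch of that pattern, using nearest-codeword quantization followed by k-ary randomized response, is given below. The function name, codebook, and parameters are assumptions for illustration, not the authors' exact scheme:

```python
import numpy as np

def cpa_encode(x, codebook, eps, rng):
    """Illustrative CPA-style encoder (a sketch, not the paper's exact method):
    map each coordinate to its nearest codeword index, then perturb the index
    with k-ary randomized response so the report satisfies eps-LDP."""
    k = len(codebook)
    # Nearest-codeword quantization: a few bits per coordinate (log2(k) bits).
    idx = np.abs(x[:, None] - codebook[None, :]).argmin(axis=1)
    # k-ary randomized response: keep the true index with probability p,
    # otherwise report one of the k-1 other indices uniformly at random.
    p = np.exp(eps) / (np.exp(eps) + k - 1)
    keep = rng.random(len(idx)) < p
    other = rng.integers(0, k - 1, size=len(idx))
    other = other + (other >= idx)  # uniform over the k-1 indices != idx
    return np.where(keep, idx, other)

rng = np.random.default_rng(0)
codebook = np.linspace(-1.0, 1.0, 4)  # 2-bit codebook; in CPA it would be randomized
x = rng.uniform(-1, 1, size=8)        # toy stand-in for a model-update vector
reports = cpa_encode(x, codebook, eps=4.0, rng=rng)
```

Each user would transmit only the perturbed indices (2 bits per coordinate here); the server aggregates the corresponding codewords across users, which is how such schemes combine compression with LDP in a single step.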


Keywords

  • FL
  • LDP
  • anonymity
  • compression

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering


