Abstract
Federated learning (FL) allows a central server to train a model using data held by remote users. FL faces challenges in preserving the privacy of the local datasets and in its communication overhead, which becomes dominant in large-scale networks. These limitations are often mitigated individually by local differential privacy (LDP) mechanisms, compression, and user-selection techniques, each of which typically comes at the cost of accuracy. In this work we present compressed private aggregation (CPA), which allows massive deployments to simultaneously communicate at extremely low bit rates while achieving privacy, anonymity, and resilience to malicious users. CPA randomizes a codebook for compressing the data into a few bits, ensuring anonymity and robustness, followed by a perturbation that guarantees LDP. We provide both a theoretical analysis and a numerical study, demonstrating the performance gains of CPA compared with separate mechanisms for compression and privacy.
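The abstract does not spell out the encoder, but the two-stage idea (randomized-codebook quantization, then an LDP perturbation) can be illustrated with a minimal sketch. The Python snippet below is illustrative only: the function name `cpa_encode`, the nearest-codeword quantizer, the shared-seed codebook permutation, and the use of K-ary randomized response for the index perturbation are assumptions for the sake of the example, not the paper's actual mechanism.

```python
import numpy as np

def cpa_encode(x, codebook, eps, rng):
    """Sketch of a CPA-style encoder (assumed details, not the paper's spec).

    Compression: map the scalar update x to its nearest codeword index,
    so only a few bits are transmitted.
    Privacy: perturb the index with K-ary randomized response, which
    satisfies eps-LDP for the reported index.
    """
    K = len(codebook)
    idx = int(np.argmin(np.abs(codebook - x)))     # nearest-codeword quantization
    p_keep = np.exp(eps) / (np.exp(eps) + K - 1)   # K-ary randomized response
    if rng.random() < p_keep:
        return idx
    other = rng.integers(K - 1)                    # uniform over the other indices
    return int(other + (other >= idx))             # skip the true index

# Toy usage: users draw the same permuted codebook from a shared seed, so the
# index-to-value mapping is decoupled from any fixed ordering (a stand-in for
# the anonymity provided by the randomized codebook).
rng = np.random.default_rng(0)
codebook = rng.permutation(np.linspace(-1.0, 1.0, 8))   # 3-bit codebook
updates = [0.3, -0.7, 0.05]
indices = [cpa_encode(u, codebook, eps=2.0, rng=rng) for u in updates]
server_estimate = np.mean(codebook[indices])             # server-side aggregation
print(indices, server_estimate)
```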
| Field | Value |
|---|---|
| Original language | English |
| Journal | Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing |
| DOIs | |
| State | Published - 1 Jan 2023 |
| Event | 48th IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2023, Rhodes Island, Greece. Duration: 4 Jun 2023 → 10 Jun 2023 |
Keywords
- FL
- LDP
- anonymity
- compression
ASJC Scopus subject areas
- Software
- Signal Processing
- Electrical and Electronic Engineering