Accelerating Federated Learning with Quick Distributed Mean Estimation

Ran Ben-Basat, Shay Vargaftik, Amit Portnoy, Gil Einziger, Yaniv Ben-Itzhak, Michael Mitzenmacher

Research output: Contribution to journal › Conference article › peer-review

Abstract

Distributed Mean Estimation (DME), in which n clients communicate vectors to a parameter server that estimates their average, is a fundamental building block in communication-efficient federated learning. In this paper, we improve on previous DME techniques that achieve the optimal O(1/n) Normalized Mean Squared Error (NMSE) guarantee by asymptotically improving the complexity of encoding, decoding, or both. To achieve this, we formalize the problem in a novel way that allows us to use off-the-shelf mathematical solvers to design the quantization. Using various datasets and training tasks, we demonstrate that our scheme, QUIC-FL, achieves state-of-the-art accuracy with faster encoding and decoding times than other DME methods.
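To make the DME setting concrete, the following is a minimal sketch using plain unbiased stochastic quantization — an illustrative baseline, not the paper's QUIC-FL algorithm. Each client encodes its vector into a few bits per coordinate, the server decodes and averages, and because each client's quantization error is independent and zero-mean, the NMSE of the averaged estimate shrinks roughly as 1/n. The function names (`encode`, `decode`, `dme_nmse`) and the 4-bit grid are assumptions for the sketch.

```python
import numpy as np

def encode(x, bits=4, rng=None):
    """Unbiased stochastic quantization of a vector onto 2**bits grid points.

    Each coordinate is rounded to one of its two neighboring grid points
    with probability proportional to proximity, so the decoded value is an
    unbiased estimate of x. (Illustrative sketch, not the paper's scheme.)
    """
    rng = rng or np.random.default_rng()
    lo, hi = float(x.min()), float(x.max())
    levels = 2 ** bits - 1
    if hi == lo:  # degenerate constant vector
        return np.zeros_like(x, dtype=np.int64), lo, hi
    scaled = (x - lo) / (hi - lo) * levels
    floor = np.floor(scaled)
    # Round up with probability equal to the fractional part (stochastic rounding).
    q = floor + (rng.random(x.shape) < (scaled - floor))
    return q.astype(np.int64), lo, hi

def decode(q, lo, hi, bits=4):
    """Map quantization indices back to real values on the [lo, hi] grid."""
    levels = 2 ** bits - 1
    return np.full(q.shape, lo) if hi == lo else lo + (hi - lo) * q / levels

def dme_nmse(n, d=1000, seed=0):
    """Simulate n clients holding Gaussian vectors; return the NMSE of the
    server's estimated mean: ||estimate - true_mean||^2 / E[||x_i||^2]."""
    rng = np.random.default_rng(seed)
    vectors = rng.normal(size=(n, d))
    estimate = np.mean([decode(*encode(x, rng=rng)) for x in vectors], axis=0)
    err = np.sum((estimate - vectors.mean(axis=0)) ** 2)
    return err / np.mean(np.sum(vectors ** 2, axis=1))
```

Running `dme_nmse` with increasing n shows the error of the averaged estimate decreasing as more independent quantization errors cancel out, which is the effect the O(1/n) NMSE guarantee formalizes.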

Original language: English
Pages (from-to): 3410-3442
Number of pages: 33
Journal: Proceedings of Machine Learning Research
Volume: 235
State: Published - 1 Jan 2024
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
