TY - JOUR
T1 - EDEN: Communication-Efficient and Robust Distributed Mean Estimation for Federated Learning
T2 - 39th International Conference on Machine Learning, ICML 2022
AU - Vargaftik, Shay
AU - Basat, Ran Ben
AU - Portnoy, Amit
AU - Mendelson, Gal
AU - Ben-Itzhak, Yaniv
AU - Mitzenmacher, Michael
N1 - Funding Information:
We thank the anonymous reviewers and Moshe Gabel for their insightful feedback and suggestions. Michael Mitzenmacher was supported in part by NSF grants CCF-2101140, CNS-2107078, and DMS-2023528. Amit Portnoy was supported in part by the Cyber Security Research Center at Ben-Gurion University of the Negev.
Publisher Copyright:
Copyright © 2022 by the author(s)
PY - 2022/1/1
Y1 - 2022/1/1
N2 - Distributed Mean Estimation (DME) is a central building block in federated learning, where clients send local gradients to a parameter server for averaging and updating the model. Due to communication constraints, clients often use lossy compression techniques to compress the gradients, resulting in estimation inaccuracies. DME is more challenging when clients have diverse network conditions, such as constrained communication budgets and packet losses. In such settings, DME techniques often incur a significant increase in the estimation error, leading to degraded learning performance. In this work, we propose a robust DME technique named EDEN that naturally handles heterogeneous communication budgets and packet losses. We derive appealing theoretical guarantees for EDEN and evaluate it empirically. Our results demonstrate that EDEN consistently improves over state-of-the-art DME techniques.
UR - http://www.scopus.com/inward/record.url?scp=85163113597&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85163113597
SN - 2640-3498
VL - 162
SP - 21984
EP - 22014
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
Y2 - 17 July 2022 through 23 July 2022
ER -