TY - GEN
T1 - A Sequential Gradient-Based Multiple Access for Distributed Learning over Fading Channels
AU - Sery, Tomer
AU - Cohen, Kobi
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/9/1
Y1 - 2019/9/1
AB - A distributed learning problem over a multiple access channel (MAC) in a large wireless network is considered. The objective function is a sum of the nodes' local loss functions. The inference decision is made at the network edge, based on data received from distributed nodes that transmit over a noisy fading MAC. We develop a novel Gradient-Based Multiple Access (GBMA) algorithm to solve the distributed learning problem over the MAC. Specifically, the nodes transmit an analog function of the local gradient using common shaping waveforms, and the network edge receives a superposition of these analog signals, which represents a noisy, distorted gradient used to update the estimate. We analyze the performance of GBMA theoretically and prove that it can approach the convergence rate of the centralized gradient descent (GD) algorithm in large networks, for both convex and strongly convex loss functions with Lipschitz gradient.
UR - http://www.scopus.com/inward/record.url?scp=85077799183&partnerID=8YFLogxK
DO - 10.1109/ALLERTON.2019.8919883
M3 - Conference contribution
AN - SCOPUS:85077799183
T3 - 2019 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019
SP - 303
EP - 307
BT - 2019 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019
PB - Institute of Electrical and Electronics Engineers
T2 - 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019
Y2 - 24 September 2019 through 27 September 2019
ER -