Abstract
Distributed optimization arises in a broad range of applications. A widely used distributed optimizer is the distributed alternating direction method of multipliers (D-ADMM), which enables agents to jointly minimize a shared objective by iteratively combining local computations with message exchanges. However, D-ADMM often requires a large number of possibly costly communication rounds to converge, limiting its applicability in communication-constrained networks. In this work, we propose unfolded D-ADMM, which facilitates the application of D-ADMM with limited communications via the emerging deep unfolding methodology. We run the conventional D-ADMM algorithm for a fixed number of communication rounds, while leveraging data to tune the hyperparameters of each iteration. By doing so, we learn to optimize with limited communications while preserving the interpretability and flexibility of the original D-ADMM algorithm. Our numerical results demonstrate that the proposed approach dramatically reduces the number of communications used by D-ADMM without compromising its performance.
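The abstract does not spell out the algorithmic details, but the core idea can be sketched: keep the D-ADMM recursion fixed at a small number of communication rounds and treat a per-iteration hyperparameter (here the ADMM penalty parameter) as a trainable weight. Below is a minimal, hypothetical PyTorch sketch for decentralized consensus least squares over a graph of agents; the `UnfoldedDADMM` class name, the quadratic objective, and the choice to learn only the penalty parameter are illustrative assumptions, not the paper's exact construction.

```python
# Hypothetical sketch of unfolded D-ADMM (not the paper's implementation):
# agents jointly minimize sum_i 0.5*||A_i x - b_i||^2 with consensus
# constraints; each unfolded iteration has its own learnable penalty rho[k].
import torch
import torch.nn as nn

class UnfoldedDADMM(nn.Module):
    def __init__(self, n_iters, dim, adjacency):
        super().__init__()
        self.adj = adjacency                    # (N, N) 0/1 tensor, no self loops
        self.deg = adjacency.sum(dim=1)         # node degrees |N(i)|
        # One penalty parameter per unfolded iteration, learned in log-space
        # so that rho = exp(log_rho) stays positive.
        self.log_rho = nn.Parameter(torch.zeros(n_iters))
        self.n_iters, self.dim = n_iters, dim

    def forward(self, A, b):
        # A: (N, m, dim) local sensing matrices, b: (N, m) local measurements.
        N = A.shape[0]
        x = torch.zeros(N, self.dim)            # local primal variables
        lam = torch.zeros(N, self.dim)          # local dual variables
        AtA = A.transpose(1, 2) @ A
        Atb = (A.transpose(1, 2) @ b.unsqueeze(-1)).squeeze(-1)
        eye = torch.eye(self.dim)
        for k in range(self.n_iters):           # one communication round per pass
            rho = self.log_rho[k].exp()
            neigh_sum = self.adj @ x            # sum of neighbors' x (the exchange)
            # Closed-form primal update for the local quadratic subproblem.
            rhs = Atb - lam + rho * (self.deg.unsqueeze(1) * x + neigh_sum)
            lhs = AtA + 2 * rho * self.deg.view(N, 1, 1) * eye
            x = torch.linalg.solve(lhs, rhs.unsqueeze(-1)).squeeze(-1)
            # Dual ascent on the local consensus residuals.
            lam = lam + rho * (self.deg.unsqueeze(1) * x - self.adj @ x)
        return x

# Training loop sketch: fit the per-iteration penalties so that T rounds
# already yield accurate recovery (synthetic data, for illustration only).
torch.manual_seed(0)
N, m, d, T = 8, 5, 3, 10
adj = torch.triu((torch.rand(N, N) < 0.5).float(), 1)
adj = adj + adj.T                               # symmetric adjacency
model = UnfoldedDADMM(T, d, adj)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    A = torch.randn(N, m, d)
    x_true = torch.randn(d)
    b = A @ x_true + 0.01 * torch.randn(N, m)
    loss = ((model(A, b) - x_true) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

Because the unfolded recursion is differentiable end to end, the penalties are trained by simple backpropagation through all T rounds, while inference still executes the familiar, interpretable D-ADMM updates.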
Original language | English
---|---
Journal | Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing
State | Published - 1 Jan 2023
Event | 48th IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2023, Rhodes Island, Greece. Duration: 4 Jun 2023 → 10 Jun 2023
Keywords
- Distributed optimization
- deep unfolding
ASJC Scopus subject areas
- Software
- Signal Processing
- Electrical and Electronic Engineering