Decentralized optimization over noisy, rate-constrained networks: How we agree by talking about how we disagree

Rajarshi Saha, Stefano Rini, Milind Rao, Andrea Goldsmith

Research output: Conference article, peer-reviewed

6 citations (Scopus)

Abstract

In decentralized optimization, multiple nodes in a network collaborate to minimize the sum of their local loss functions. The information exchange between nodes required for this task is often limited by network connectivity. We consider a generalization of this setting, in which communication is further hindered by (i) a finite data-rate constraint on the signal transmitted by any node, and (ii) an additive noise corrupting the signal received by any node. We develop a novel algorithm for this scenario: Decentralized Lazy Mirror Descent with Differential Exchanges (DLMD-DiffEx), which guarantees convergence of the local estimates to the optimal solution. A salient feature of DLMD-DiffEx is the introduction of additional proxy variables that are maintained by the nodes to account for the disagreement in their estimates due to channel noise and data-rate constraints. We investigate the performance of DLMD-DiffEx both from a theoretical perspective and through numerical evaluations.
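The underlying problem is the standard decentralized objective: the n nodes jointly minimize f(x) = f_1(x) + ... + f_n(x), where f_i is the local loss known only to node i. The Python sketch below is a simplified illustration of the communication pattern the abstract describes: each node transmits a quantized difference between its current estimate and a proxy of what the network already believes about that estimate, and receivers observe the transmission through additive noise. It is not the authors' DLMD-DiffEx implementation; it substitutes plain gradient steps for mirror descent, uses a uniform quantizer as a stand-in for the rate constraint, and keeps a single shared copy of each proxy for brevity. All names and parameter values (quantize, Q_STEP, NOISE_STD, the ring topology) are illustrative assumptions.

import numpy as np

# Illustrative simulation (not the paper's DLMD-DiffEx): decentralized
# gradient descent where nodes exchange quantized differences over a
# noisy network. All constants below are assumptions for the demo.

rng = np.random.default_rng(0)

n_nodes, dim, n_rounds = 5, 2, 500
NOISE_STD = 0.01   # std of additive channel noise (assumed)
Q_STEP = 0.05      # uniform quantizer step, standing in for the rate limit

# Local losses f_i(x) = 0.5 * ||x - b_i||^2, so the minimizer of
# sum_i f_i is the average of the b_i.
b = rng.normal(size=(n_nodes, dim))
x_star = b.mean(axis=0)

def grad(i, x):
    return x - b[i]

def quantize(v):
    # Uniform quantizer: models sending finitely many bits per entry.
    return Q_STEP * np.round(v / Q_STEP)

# Ring topology with doubly stochastic mixing weights.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

x = np.zeros((n_nodes, dim))       # local estimates
proxy = np.zeros((n_nodes, dim))   # proxies tracking each node's estimate
                                   # (one shared copy, for brevity)

for t in range(1, n_rounds + 1):
    # Differential exchange: broadcast the quantized gap between the
    # current estimate and its proxy; receivers add channel noise.
    diff = quantize(x - proxy)
    received = diff + NOISE_STD * rng.normal(size=diff.shape)
    proxy = proxy + received       # proxies now (noisily) track the estimates

    # Mix the proxies over the network, then take a local gradient step.
    step = 1.0 / np.sqrt(t)
    mixed = W @ proxy
    for i in range(n_nodes):
        x[i] = mixed[i] - step * grad(i, x[i])

print("distance of averaged estimate from optimum:",
      np.linalg.norm(x.mean(axis=0) - x_star))

In this simplified sketch, the local estimates settle into a neighborhood of the optimum whose size is governed by the quantizer resolution and the channel noise; the paper's algorithm is designed so that, despite these impairments, the local estimates converge to the optimal solution.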

Original language: English
Pages (from-to): 5055-5059
Number of pages: 5
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2021-June
DOIs
Publication status: Published - June 2021
Event: 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada
Duration: 6 Jun 2021 - 11 Jun 2021
