TY - GEN
T1 - ReNets
T2 - Statically-Optimal Demand-Aware Networks
AU - Avin, Chen
AU - Schmid, Stefan
N1 - DBLP License: DBLP's bibliographic metadata records provided through http://dblp.org/ are distributed under a Creative Commons CC0 1.0 Universal Public Domain Dedication. Although the bibliographic metadata records are provided consistent with CC0 1.0 Dedication, the content described by the metadata records is not. Content may be subject to copyright, rights of privacy, rights of publicity and other restrictions.
PY - 2021
Y1 - 2021
N2 - This paper studies the design of self-adjusting datacenter networks whose physical topology dynamically adapts to the workload, in an online and demand-aware manner. We propose ReNet, a self-adjusting network which does not require any predictions about future demands and amortizes reconfigurations: it performs as well as a hypothetical static algorithm with perfect knowledge of the future demand. In particular, we show that for arbitrary sparse communication demands, ReNets achieve static optimality, a fundamental property of learning algorithms, and that route lengths in ReNets are proportional to existing lower bounds, which are known to relate to an entropy metric of the demand. ReNets provide additional desirable properties such as compact and local routing and flat addressing, thereby ensuring scalability and further reducing the overhead of reconfiguration. To achieve these properties, ReNets combine multiple self-adjusting tree topologies which are optimized toward individual sources, called ego-trees in this paper.
U2 - 10.1137/1.9781611976489.3
DO - 10.1137/1.9781611976489.3
M3 - Conference contribution
SP - 25
EP - 39
BT - SIAM Symposium on Algorithmic Principles of Computer Systems (APOCS)
PB - Society for Industrial and Applied Mathematics Publications
ER -