TY - GEN

T1 - Capacity and Construction of Recoverable Systems

AU - Elishco, Ohad

AU - Barg, Alexander

N1 - Funding Information:
Ohad Elishco is with Ben-Gurion University of the Negev, Israel. Email: ohadeli@bgu.ac.il. Alexander Barg is with Dept. of ECE and ISR, University of Maryland, College Park, MD 20742, USA and with IITP, Russian Academy of Sciences, 127051 Moscow, Russia. Email: abarg@umd.edu. This research was supported in part by NSF grant CCF1814487.
Publisher Copyright:
© 2021 IEEE.

PY - 2021/7/12

Y1 - 2021/7/12

N2 - Motivated by the established notion of storage codes, we consider sets of infinite sequences over a finite alphabet such that every k-tuple of consecutive entries is uniquely recoverable from its l-neighborhood in the sequence. In the first part of the paper we address the problem of finding the maximum growth rate of the set as well as constructions of explicit families (based on constrained coding) that approach the optimal rate. In the second part we consider a modification of the problem wherein the entries in the sequence are viewed as random variables over a finite alphabet, and the recovery condition requires that the Shannon entropy of the k-tuple conditioned on its l-neighborhood be bounded above by some \epsilon > 0. We study properties of measures on infinite sequences that maximize the metric entropy under the recoverability condition. Drawing on tools from ergodic theory, we prove some properties of entropy-maximizing measures. We also suggest a procedure of constructing an \epsilon-recoverable measure from a corresponding deterministic system, and prove that for small \epsilon the constructed measure is a maximizer of the metric entropy.

AB - Motivated by the established notion of storage codes, we consider sets of infinite sequences over a finite alphabet such that every k-tuple of consecutive entries is uniquely recoverable from its l-neighborhood in the sequence. In the first part of the paper we address the problem of finding the maximum growth rate of the set as well as constructions of explicit families (based on constrained coding) that approach the optimal rate. In the second part we consider a modification of the problem wherein the entries in the sequence are viewed as random variables over a finite alphabet, and the recovery condition requires that the Shannon entropy of the k-tuple conditioned on its l-neighborhood be bounded above by some \epsilon > 0. We study properties of measures on infinite sequences that maximize the metric entropy under the recoverability condition. Drawing on tools from ergodic theory, we prove some properties of entropy-maximizing measures. We also suggest a procedure of constructing an \epsilon-recoverable measure from a corresponding deterministic system, and prove that for small \epsilon the constructed measure is a maximizer of the metric entropy.

UR - http://www.scopus.com/inward/record.url?scp=85115091683&partnerID=8YFLogxK

U2 - 10.1109/ISIT45174.2021.9518011

DO - 10.1109/ISIT45174.2021.9518011

M3 - Conference contribution

AN - SCOPUS:85115091683

T3 - IEEE International Symposium on Information Theory - Proceedings

SP - 3273

EP - 3278

BT - 2021 IEEE International Symposium on Information Theory, ISIT 2021 - Proceedings

PB - Institute of Electrical and Electronics Engineers

T2 - 2021 IEEE International Symposium on Information Theory, ISIT 2021

Y2 - 12 July 2021 through 20 July 2021

ER -