Dimension-free empirical entropy estimation.

Doron Cohen, Aryeh Kontorovich, Aaron Koolyk, Geoffrey Wolfer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


We seek an entropy estimator for discrete distributions with fully empirical accuracy bounds. As stated, this goal is infeasible without some prior assumptions on the distribution. We discover that a certain information moment assumption renders the problem feasible. We argue that the moment assumption is natural and, in some sense, minimalistic — weaker than finite support or tail decay conditions. Under the moment assumption, we provide the first finite-sample entropy estimates for infinite alphabets, nearly recovering the known minimax rates. Moreover, we demonstrate that our empirical bounds are significantly sharper than the state-of-the-art bounds, for various natural distributions and non-trivial sample regimes. Along the way, we give a dimension-free analogue of the Cover–Thomas result on entropy continuity (with respect to total variation distance) for finite alphabets, which may be of independent interest.
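For context, the classical baseline in empirical entropy estimation is the plug-in (maximum-likelihood) estimator, which substitutes empirical frequencies into the entropy formula. The sketch below shows only this standard baseline, not the paper's estimator or its empirical accuracy bounds:

```python
# A minimal sketch of the standard plug-in entropy estimator for a
# discrete sample. This is the classical baseline, NOT the estimator
# proposed in the paper.
from collections import Counter
from math import log

def plugin_entropy(samples):
    """Empirical (plug-in) Shannon entropy of a sample, in nats."""
    n = len(samples)
    counts = Counter(samples)
    # H_hat = -sum_x p_hat(x) * log p_hat(x), with p_hat(x) = count(x)/n
    return -sum((c / n) * log(c / n) for c in counts.values())

# A balanced two-symbol sample gives entropy log(2) ≈ 0.693 nats.
print(plugin_entropy(["a", "b", "a", "b"]))
```

For large alphabets this estimator is biased downward, which is one motivation for the finer analyses the paper pursues.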

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
Number of pages: 13
State: Published - 2021

