Dimension-Free Empirical Entropy Estimation

Doron Cohen, Aryeh Kontorovich, Aaron Koolyk, Geoffrey Wolfer

Research output: Contribution to journal › Article › peer-review


We seek an entropy estimator for discrete distributions with fully empirical accuracy bounds. As stated, this goal is infeasible without some prior assumptions on the distribution. We discover that a certain information moment assumption renders the problem feasible. We argue that the moment assumption is natural and, in some sense, minimalistic: weaker than finite support or tail decay conditions. Under the moment assumption, we provide the first finite-sample entropy estimates for infinite alphabets, nearly recovering the known minimax rates. Moreover, we demonstrate that our empirical bounds are significantly sharper than the state-of-the-art bounds, for various natural distributions and non-trivial sample regimes. Along the way, we give a dimension-free analogue of the Cover-Thomas result on entropy continuity (with respect to total variation distance) for finite alphabets, which may be of independent interest. Additionally, we resolve all of the open problems posed by Jürgensen and Matthews, 2010.
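The paper's estimator and its empirical accuracy bounds are not reproduced in this record. Purely as an illustrative baseline for the problem the abstract describes, here is a minimal plug-in (empirical) entropy estimator for samples from a discrete distribution; the function name and interface are assumptions for this sketch, not the authors' method.

```python
from collections import Counter
import math

def plug_in_entropy(samples):
    """Plug-in entropy estimate (in nats): entropy of the empirical distribution.

    This is the naive baseline estimator; the paper's contribution is an
    estimator with fully empirical finite-sample bounds under an
    information moment assumption, which this sketch does not implement.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Example: a balanced two-symbol sample recovers log 2 exactly.
est = plug_in_entropy(["H", "T"] * 500)
```

For infinite alphabets the plug-in estimator can be badly biased at any finite sample size, which is precisely why distribution-free empirical bounds require an assumption such as the information moment condition discussed in the abstract.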

Original language: English
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Transactions on Information Theory
State: Accepted/In press - 1 Jan 2022


Keywords

  • Convergence
  • Entropy
  • Estimation
  • Random variables
  • Tail
  • TV
  • Upper bound

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


