On the entropy of a continuous distribution

U. Dinur, R. D. Levine

Research output: Contribution to journal › Article › peer-review

22 Scopus citations


Explicit expressions, free of divergences, are derived for the entropy and the entropy deficiency (the "information content") of a continuous distribution. From a computational point of view, an algorithm is derived for approximating the entropy by a finite sum of terms. Applications to branching processes are outlined. The divergent term in the previously proposed expression is due to the increase in the entropy when the (continuous) variable is confined to a narrow range. ("It takes an infinite amount of information to pin-point a continuous variable.") It is shown, however, that a "law of diminishing returns" operates: the information content of a distribution over a narrow interval decreases with the range of the variable, and this decrease exactly compensates for the increase in the information necessary to locate the narrow interval. The net outcome is a finite entropy. The key point in the technical discussion is the condition that the "grouping axiom" apply to the entropy of a continuous distribution.
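The divergence and its compensation can be illustrated numerically. The following is a minimal sketch (not the paper's own algorithm): it bins a standard Gaussian density into cells of width Δ and shows that the discrete entropy −Σ pᵢ ln pᵢ grows like −ln Δ as Δ → 0, while the combination H(Δ) + ln Δ stays finite and converges to the differential entropy, ½ ln 2πe ≈ 1.419. The function name, the Gaussian choice, and the truncation limit `lim` are illustrative assumptions.

```python
import math

def gaussian_pdf(x, sigma=1.0):
    # Standard normal density (illustrative choice of distribution).
    return math.exp(-x * x / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

def discrete_entropy_plus_log_delta(delta, lim=10.0):
    """Bin the density into cells of width delta and return H(delta) + ln(delta).

    H(delta) = -sum p_i ln p_i with p_i ~ pdf(x_i) * delta diverges like
    -ln(delta) as delta -> 0; adding ln(delta) back yields a finite limit,
    the differential entropy.
    """
    n = int(2.0 * lim / delta)
    H = 0.0
    for i in range(n):
        x = -lim + (i + 0.5) * delta      # midpoint of the i-th cell
        p = gaussian_pdf(x) * delta       # probability mass in the cell
        if p > 0.0:
            H -= p * math.log(p)
    return H + math.log(delta)

# Usage: the compensated sum approaches 0.5 * ln(2 * pi * e) as delta shrinks.
for d in (0.5, 0.1, 0.01):
    print(d, discrete_entropy_plus_log_delta(d))
```

Refining the grid adds information needed to locate the variable within ever-narrower cells (the −ln Δ term), but the per-cell information content shrinks at exactly the compensating rate, which is the "law of diminishing returns" described in the abstract.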

Original language: English
Pages (from-to): 17-27
Number of pages: 11
Journal: Chemical Physics
Issue number: 1-2
State: Published - 1 Jan 1975
Externally published: Yes

ASJC Scopus subject areas

  • Physics and Astronomy (all)
  • Physical and Theoretical Chemistry

