On the entropy of a continuous distribution

U. Dinur, R. D. Levine

Research output: Contribution to journal › Article › peer-review

22 Scopus citations

Abstract

Explicit expressions, free of divergences, are derived for the entropy and the entropy deficiency (the "information content") of a continuous distribution. From a computational point of view, an algorithm is derived for approximating the entropy by a finite sum of terms. The applications to branching processes are outlined. The divergent term in the previously proposed expression is due to the increase in the entropy when the (continuous) variable is confined to a narrow range. ("It takes an infinite amount of information to pin-point a continuous variable".) It is shown, however, that a "law of diminishing returns" operates. The information content of a distribution over a narrow interval decreases with the range of the variable, and this decrease exactly compensates for the increase in the information necessary to locate the narrow interval. The net outcome is a finite entropy. The key point in the technical discussion is the condition that the "grouping axiom" apply to the entropy of a continuous distribution.
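A minimal sketch of the compensation argument, using standard definitions rather than equations quoted from the paper (the reference density P^0(x) and the bin width \Delta are introduced here for illustration only): discretizing a density P(x) into bins of width \Delta, with p_i \approx P(x_i)\,\Delta, gives

    S_\Delta = -\sum_i p_i \ln p_i \approx -\int P(x)\,\ln P(x)\,dx \;-\; \ln\Delta ,

which diverges like -\ln\Delta as the bins are refined ("pin-pointing" the continuous variable). The entropy deficiency relative to P^0, by contrast,

    DS = \sum_i p_i \ln\frac{p_i}{p_i^0} \approx \int P(x)\,\ln\frac{P(x)}{P^0(x)}\,dx ,

remains finite, because the \ln\Delta contributions from p_i and p_i^0 cancel in the ratio. This is the compensation between localization cost and diminishing information content in its simplest form.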

Original language: English
Pages (from-to): 17-27
Number of pages: 11
Journal: Chemical Physics
Volume: 9
Issue number: 1-2
DOIs
State: Published - 1 Jan 1975
Externally published: Yes

ASJC Scopus subject areas

  • General Physics and Astronomy
  • Physical and Theoretical Chemistry
