Explicit expressions, free of divergences, are derived for the entropy and the entropy deficiency (the "information content") of a continuous distribution. From a computational point of view, an algorithm is derived for approximating the entropy by a finite sum of terms. The applications to branching processes are outlined. The divergent term in the previously proposed expression is due to the increase in the entropy when the (continuous) variable is confined to a narrow range. ("It takes an infinite amount of information to pin-point a continuous variable.") It is shown, however, that a "law of diminishing returns" operates: the information content of a distribution over a narrow interval decreases with the range of the variable, and this decrease exactly compensates for the increase in the information necessary to locate the narrow interval. The net outcome is a finite entropy. The key point in the technical discussion is the condition that the "grouping axiom" apply to the entropy of a continuous distribution.
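The divergence referred to above can be made concrete numerically. The sketch below (not the paper's algorithm; the function name and the Gaussian test case are illustrative assumptions) bins a continuous density at width Δ and computes the discrete Shannon entropy of the bin probabilities. As Δ shrinks, the discrete entropy grows like h − ln Δ, where h is the finite differential entropy: the divergent piece is exactly the −ln Δ cost of "pin-pointing" the variable to a narrow interval.

```python
import math

def binned_entropy_gaussian(sigma=1.0, delta=0.01, span=10.0):
    """Shannon entropy (in nats) of a zero-mean Gaussian density
    discretized into bins of width delta over [-span, span].
    Illustrative only: not the algorithm derived in the paper."""
    n = int(2 * span / delta)
    entropy = 0.0
    for i in range(n):
        x = -span + (i + 0.5) * delta          # bin midpoint
        # Bin probability p_i ~ p(x_i) * delta (midpoint rule)
        p = (math.exp(-x * x / (2 * sigma**2))
             / (sigma * math.sqrt(2 * math.pi))) * delta
        if p > 0.0:
            entropy -= p * math.log(p)
    return entropy

# Differential entropy of N(0, 1): h = (1/2) ln(2*pi*e), which is finite
h_diff = 0.5 * math.log(2 * math.pi * math.e)

for delta in (0.1, 0.01, 0.001):
    H = binned_entropy_gaussian(delta=delta)
    # H tracks h_diff - ln(delta): the -ln(delta) term diverges as delta -> 0
    print(f"delta={delta}:  H={H:.4f}   h - ln(delta)={h_diff - math.log(delta):.4f}")
```

Subtracting the −ln Δ location cost from the discrete entropy leaves the finite, Δ-independent quantity, in line with the compensation argument sketched in the abstract.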
ASJC Scopus subject areas
- Physics and Astronomy (all)
- Physical and Theoretical Chemistry