Binary entropy function

Entropy of a Bernoulli trial (in shannons) as a function of binary outcome probability, called the binary entropy function.

In information theory, the binary entropy function, denoted $\operatorname{H}(p)$ or $\operatorname{H}_\text{b}(p)$, is defined as the entropy of a Bernoulli process (i.i.d. binary variable) with probability $p$ of one of two values, and is given by the formula:

$$\operatorname{H}(p) = -p \log p - (1 - p) \log (1 - p)$$

The base of the logarithm corresponds to the choice of units of information; base $e$ corresponds to nats and is mathematically convenient, while base 2 (binary logarithm) corresponds to shannons and is conventional (as shown in the graph); explicitly:

$$\operatorname{H}_\text{b}(p) = -p \log_2 p - (1 - p) \log_2 (1 - p)$$

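As a small illustration, here is a minimal Python sketch of the formula above; the function name binary_entropy and its base parameter are illustrative choices, not a standard library API:

```python
import math

def binary_entropy(p: float, base: float = 2.0) -> float:
    """Entropy of a Bernoulli(p) variable; base 2 gives shannons, base e gives nats."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be a probability in [0, 1]")
    # Endpoints: by the limit x log x -> 0 as x -> 0+, H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log(p, base) + (1.0 - p) * math.log(1.0 - p, base))
```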
Note that the values at 0 and 1 are given by the limit $0 \log 0 := \lim_{x \to 0^{+}} x \log x = 0$ (by L'Hôpital's rule); and that "binary" refers to two possible values for the variable, not the units of information.
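For readers who want the intermediate step, the endpoint limit can be verified by rewriting the product as a quotient so that L'Hôpital's rule applies; a sketch of this standard computation:

```latex
\begin{align*}
\lim_{x \to 0^{+}} x \log x
  &= \lim_{x \to 0^{+}} \frac{\log x}{1/x}
     && \text{(indeterminate form } -\infty / \infty\text{)} \\
  &= \lim_{x \to 0^{+}} \frac{1/x}{-1/x^{2}}
     && \text{(L'Hôpital: differentiate numerator and denominator)} \\
  &= \lim_{x \to 0^{+}} (-x) = 0.
\end{align*}
```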

When $p = \tfrac{1}{2}$, the binary entropy function attains its maximum value, 1 shannon (1 binary unit of information); this is the case of an unbiased coin flip. When $p = 0$ or $p = 1$, the binary entropy is 0 (in any units), corresponding to no information, since there is no uncertainty in the variable.
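Using the binary_entropy sketch from above, these midpoint and endpoint values can be checked directly:

```python
print(binary_entropy(0.5))  # 1.0    -> unbiased coin flip, the maximum (1 shannon)
print(binary_entropy(0.0))  # 0.0    -> certain outcome, no uncertainty
print(binary_entropy(0.9))  # ~0.469 -> a biased coin conveys less than 1 shannon
```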

