Quantities of information

A misleading[1] information diagram showing additive and subtractive relationships among Shannon's basic quantities of information for correlated variables $X$ and $Y$. The area contained by both circles is the joint entropy $H(X, Y)$. The circle on the left (red and violet) is the individual entropy $H(X)$, with the red being the conditional entropy $H(X \mid Y)$. The circle on the right (blue and violet) is $H(Y)$, with the blue being $H(Y \mid X)$. The violet is the mutual information $I(X; Y)$.

The mathematical theory of information is based on probability theory and statistics, and measures information with several quantities of information. The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, or more correctly the shannon,[2] based on the binary logarithm. Although "bit" is more frequently used in place of "shannon", it is not distinguished from the bit used in data processing to refer to a binary value or stream regardless of its entropy (information content). Other units include the nat, based on the natural logarithm, and the hartley, based on the common (base-10) logarithm.
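As a minimal illustration (not part of the original text), the following Python sketch computes the Shannon entropy of a discrete distribution; the unit of the result follows directly from the chosen logarithm base. The function name and the example distribution are chosen here for illustration only.

    import math

    def entropy(probs, base=2.0):
        """Shannon entropy of a discrete distribution.

        The unit depends on `base`: 2 -> shannons (bits),
        e -> nats, 10 -> hartleys.
        """
        # Terms with p == 0 are skipped, following the convention 0 log 0 = 0.
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    fair_coin = [0.5, 0.5]
    print(entropy(fair_coin, base=2))       # 1.0 shannon (bit)
    print(entropy(fair_coin, base=math.e))  # ~0.693 nat (= ln 2)
    print(entropy(fair_coin, base=10))      # ~0.301 hartley (= log10 2)

Changing only the base rescales the same quantity, since $\log_b x = \ln x / \ln b$.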

In what follows, an expression of the form $p \log p$ is considered by convention to be equal to zero whenever $p = 0$. This is justified because $\lim_{p \rightarrow 0^{+}} p \log p = 0$ for any logarithmic base.[3]
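A short derivation of this limit (added here for completeness, with $b$ denoting an arbitrary logarithmic base) follows from L'Hôpital's rule:

$$\lim_{p \to 0^{+}} p \log_b p = \frac{1}{\ln b} \lim_{p \to 0^{+}} \frac{\ln p}{1/p} = \frac{1}{\ln b} \lim_{p \to 0^{+}} \frac{1/p}{-1/p^{2}} = \frac{1}{\ln b} \lim_{p \to 0^{+}} (-p) = 0.$$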

  1. ^ MacKay, D. J. C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge University Press. Bibcode:2003itil.book.....M. p. 141.
  2. ^ Stam, A.J. (1959). "Some inequalities satisfied by the quantities of information of Fisher and Shannon". Information and Control. 2 (2): 101–112. doi:10.1016/S0019-9958(59)90348-1.
  3. ^ "Three approaches to the definition of the concept "quantity of information"" (PDF).
