Redundancy (information theory)

In information theory, redundancy measures the fractional difference between the entropy $H(X)$ of an ensemble $X$ and its maximum possible value $\log(|\mathcal{A}_X|)$.[1][2] Informally, it is the amount of wasted "space" used to transmit certain data. Data compression is a way to reduce or eliminate unwanted redundancy, while forward error correction is a way of adding desired redundancy for purposes of error detection and correction when communicating over a noisy channel of limited capacity.
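A minimal sketch of this definition in Python, assuming a finite alphabet of at least two symbols and entropy measured in bits (log base 2); the helper names `entropy` and `redundancy` are illustrative, not from the source:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Fractional difference between H(X) and its maximum log2(|A_X|).

    Returns 1 - H(X) / log2(n), where n is the alphabet size
    (assumed here to be at least 2).
    """
    n = len(probs)
    return 1.0 - entropy(probs) / math.log2(n)

# A biased coin with p = 0.25: H(X) ≈ 0.811 bits, while the maximum for a
# two-symbol alphabet is log2(2) = 1 bit, so roughly 19% of each
# transmitted symbol is "wasted" space.
print(redundancy([0.25, 0.75]))  # ≈ 0.189
```

A redundancy of 0 means the distribution is uniform (the symbols are already maximally informative), while values approaching 1 indicate a highly predictable source that a compressor could shorten substantially.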

  1. ^ Here it is assumed $\mathcal{A}_X$ are the sets on which the probability distributions are defined.
  2. ^ MacKay, David J.C. (2003). "2.4 Definition of entropy and related functions". Information Theory, Inference, and Learning Algorithms. Cambridge University Press. p. 33. ISBN 0-521-64298-1. "The redundancy measures the fractional difference between H(X) and its maximum possible value, $\log(|\mathcal{A}_X|)$."