Information Theory

Rongmei Li,R. Kaptein,D. Hiemstra

Published 1987 in Nature

ABSTRACT

INFORMATION THEORY rests on the fundamental observation that information and uncertainty are related (Shannon and Weaver, 1949). Intuitively, a code can be used to send information from one agent (the transmitter) to another (the receiver) over a channel just in case the receiver cannot completely anticipate which message the transmitter will send. A “language” that consisted of only one sentence could not be a useful instrument of communication, precisely because there could be neither a real choice (on the part of the transmitter) nor any real uncertainty (on the part of the receiver) about which message would be sent. Entropy is a measure of the uncertainty in a communication system. Given that uncertainty and information can be identified, we can say that a measure of the uncertainty in a system is also a measure of its information content. Suppose that a communication system provides n distinct symbols and that p_i is the probability that the i-th symbol occurs; then the entropy, H, is given by:

H = −Σ_{i=1}^{n} p_i log₂ p_i
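The entropy defined above can be sketched in a few lines of Python. This is an illustrative implementation, not code from the paper; the function name `entropy` and the example distributions are our own. By convention, symbols with probability zero contribute nothing to the sum (since p log p → 0 as p → 0).

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    `probs` is a sequence of symbol probabilities summing to 1.
    Zero-probability symbols are skipped (their limit contribution is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely symbols, maximal uncertainty -> 1 bit.
print(entropy([0.5, 0.5]))   # 1.0

# A one-symbol "language": no choice, no uncertainty -> 0 bits.
print(entropy([1.0]))        # 0.0

# Four equally likely symbols -> 2 bits.
print(entropy([0.25] * 4))   # 2.0
```

The degenerate case `entropy([1.0]) == 0` mirrors the text's point that a language with only one possible message cannot communicate information.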
