ABSTRACT

INFORMATION THEORY rests on the fundamental observation that information and uncertainty are related (Shannon and Weaver, 1949). Intuitively, a code can be used to send information from one agent (the transmitter) to another (the receiver) over a channel just in case the receiver cannot completely anticipate which message the transmitter will send. A "language" that consisted of only one sentence could not be a useful instrument of communication, precisely because there could be neither a real choice (on the part of the transmitter) nor any real uncertainty (on the part of the receiver) about which message would be sent. Entropy is a measure of the uncertainty in a communication system. Given that uncertainty and information can be identified, a measure of the uncertainty in a system is also a measure of its information content. Suppose that a communication system provides n distinct symbols and that p_i is the probability that the i-th symbol occurs; then the entropy, H, is given by:

H = - Σ_{i=1}^{n} p_i log2(p_i)
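The entropy definition above can be sketched in a few lines of code. This is an illustrative implementation of the standard Shannon formula, not code from the paper; the function name `entropy` and the example distributions are assumptions for demonstration.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), measured in bits.

    Symbols with probability 0 contribute nothing, since
    p * log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A "language" with only one possible message: the receiver can fully
# anticipate it, so there is no uncertainty and no information (H = 0).
print(entropy([1.0]))

# Four equally likely symbols: maximal uncertainty for n = 4,
# H = log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))
```

The degenerate one-message case makes the paper's point concrete: entropy, and hence information content, is zero exactly when the receiver faces no real uncertainty about which message will arrive.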
PUBLICATION RECORD
- Publication year
1987
- Venue
Nature
- Publication date
1987-12-01
- Fields of study
Mathematics, Computer Science
- Source metadata
Semantic Scholar