
Information Theory

  • Quantifies information using entropy, a measure of a system’s uncertainty or randomness.
  • Uses redundancy and encoding (for example, error-correcting codes) to make message transmission more reliable over noisy channels.
  • Originated in the 1940s and underpins modern communications and computer systems.

Information theory is a branch of applied mathematics and electrical engineering concerned with the transmission, storage, and processing of information. It was developed in the 1940s by Claude Shannon and focuses on quantifying information using concepts such as entropy and redundancy to design efficient and reliable communication systems.

At its core, information theory concerns the quantification of information. Entropy is used as a measure of the uncertainty or randomness of a system. The concept of redundancy refers to the repetition of information within a message to improve the likelihood that the message is transmitted accurately despite noise or loss in the channel. Encoding schemes, including error-correcting codes, add controlled redundancy to messages so they can be transmitted more efficiently and reliably.

If we have a coin that always lands on the same side, the entropy of the system is zero because we know with certainty what the outcome of a coin flip will be. A fair coin, by contrast, has the maximum possible entropy for a two-outcome system (one bit), because each flip is as unpredictable as it can be.
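The coin example can be made concrete. Below is a minimal sketch computing Shannon entropy, H = −Σ p·log₂(p), for coins of varying bias (the `entropy` helper is illustrative, not part of any standard library):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A coin that always lands heads is perfectly predictable: zero entropy.
print(entropy([1.0, 0.0]))    # 0.0 bits

# A heavily biased coin is still quite predictable, so its entropy is low.
print(entropy([0.99, 0.01]))  # ≈ 0.081 bits

# A fair coin is maximally uncertain: 1 bit of entropy per flip.
print(entropy([0.5, 0.5]))    # 1.0 bit
```

The more skewed the distribution, the less information each flip carries, which is why entropy serves as a measure of uncertainty.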

If we send the message “hello” over a noisy channel, there is a chance that some of the letters will be garbled or lost. However, if we repeat the message multiple times, it becomes much less likely that the entire message will be lost, because there is a higher chance that at least one of the repeated versions will be transmitted accurately.
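The repetition strategy above can be sketched as a simple repetition code with majority-vote decoding. This is a hedged illustration, assuming a toy channel model; the `encode`, `noisy`, and `decode` names are hypothetical, not from the original:

```python
import random
from collections import Counter

def encode(message, n=3):
    """Repeat each character n times (a simple repetition code)."""
    return [ch * n for ch in message]

def noisy(blocks, flip_prob=0.1, seed=0):
    """Corrupt each transmitted character independently with probability flip_prob."""
    rng = random.Random(seed)
    alphabet = "abcdefghijklmnopqrstuvwxyz"
    return ["".join(rng.choice(alphabet) if rng.random() < flip_prob else ch
                    for ch in block) for block in blocks]

def decode(blocks):
    """Majority vote within each block recovers the most likely original character."""
    return "".join(Counter(block).most_common(1)[0][0] for block in blocks)

received = noisy(encode("hello"))
decoded = decode(received)
# Decoding fails for a letter only if 2 of its 3 copies are corrupted,
# which is far less likely than a single corruption.
```

The trade-off is efficiency: tripling every character triples the bandwidth used, which is why practical error-correcting codes add redundancy in more economical ways.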

Applications

  • Designing efficient and reliable communication systems.
  • Constructing encoding schemes such as error-correcting codes to add redundancy in a controlled and efficient way.
  • Foundations for modern communications and computer systems.
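Error-correcting codes add redundancy in a structured rather than brute-force way. As a minimal sketch (detection only, not correction), a single even-parity bit exposes any one-bit transmission error; the `add_parity` and `check` helpers here are illustrative:

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check(codeword):
    """A single flipped bit makes the 1-count odd, exposing the error."""
    return sum(codeword) % 2 == 0

word = add_parity([1, 0, 1, 1])  # -> [1, 0, 1, 1, 1]
assert check(word)               # intact codeword passes
word[2] ^= 1                     # flip one bit in transit
assert not check(word)           # the error is detected (though not located)
```

One extra bit per word is far cheaper than repeating the whole message; richer schemes such as Hamming codes extend this idea to locate and correct errors as well.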
Related Topics

  • Claude Shannon
  • Entropy
  • Redundancy
  • Encoding schemes
  • Error-correcting codes
  • Noisy channel