Shannon Information Theory

The landmark event that established the discipline of information theory and brought it to immediate worldwide attention was the publication of Claude E. Shannon’s classic paper A Mathematical Theory of Communication in the Bell System Technical Journal in July and October 1948.

Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist’s 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying “intelligence” and the “line speed” at which it can be transmitted by a communication system, giving the relation W = K log m (a relation whose form recalls Boltzmann’s entropy formula), where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley’s 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver’s ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S is the number of possible symbols and n is the number of symbols in a transmission. The unit of information was therefore the decimal digit, which has since sometimes been called the hartley in his honor.
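As a worked illustration of Hartley’s measure (a hypothetical Python sketch; the function name and example values are mine, not Hartley’s), the code below evaluates H = n log S for a short transmission and shows that the choice of logarithm base only fixes the unit: base 10 gives hartleys (decimal digits), base 2 gives bits.

```python
import math

def hartley_information(n: int, s: int, base: float = 10.0) -> float:
    """Hartley's measure H = n * log(S): information carried by a message
    of n symbols drawn from an alphabet of S equally likely symbols."""
    return n * math.log(s, base)

# A 5-symbol message over a 10-symbol alphabet carries 5 hartleys...
print(hartley_information(5, 10, base=10))  # 5.0 hartleys (decimal digits)
# ...which is the same quantity expressed in bits:
print(hartley_information(5, 10, base=2))   # ~16.61 bits (= 5 * log2(10))
```

Because log S^n = n log S, the measure is additive: each additional symbol contributes the same amount of information, which is what makes it usable as a quantity at all.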

In 1940, Alan Turing used similar ideas as part of the statistical analysis of the breaking of the Second World War German Enigma ciphers.

Much of the mathematics behind information theory for events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs. Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory.
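To make the parallel concrete (the notation below is mine, not the article’s): Gibbs’s thermodynamic entropy and Shannon’s information entropy share the same functional form, differing only in Boltzmann’s constant k_B and the base of the logarithm.

```latex
S_{\text{Gibbs}} = -k_B \sum_i p_i \ln p_i
\qquad
H_{\text{Shannon}} = -\sum_i p_i \log_2 p_i
```

Landauer’s work made the link physical: by his principle, erasing one bit of information dissipates at least k_B T ln 2 of energy as heat.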

In this revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that:

“The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point.”

With it came the ideas of:

  • the information entropy and redundancy of a source, and its relevance through the source coding theorem;
  • the mutual information, and the channel capacity of a noisy channel, including the promise of perfect lossless communication given by the noisy-channel coding theorem;
  • the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel (see the sketch after this list); as well as
  • the bit, a new way of seeing the most fundamental unit of information.
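
As a minimal sketch of two of these quantities (hypothetical Python; the source distribution and channel parameters below are assumed for illustration), the code computes the entropy of a discrete source and the Shannon–Hartley capacity C = B log2(1 + S/N) of a band-limited Gaussian channel.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete source, in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley law: C = B * log2(1 + S/N), in bits/second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A biased binary source emits less than 1 bit per symbol, so it is
# redundant and compressible (source coding theorem):
print(entropy_bits([0.9, 0.1]))               # ~0.469 bits/symbol

# An assumed 3 kHz channel at 30 dB SNR (S/N = 1000):
print(shannon_hartley_capacity(3000, 1000))   # ~29,902 bits/second
```

The noisy-channel coding theorem states that any rate below this capacity can be achieved with arbitrarily small error probability, while reliable communication above it is impossible.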