
Unusual facts about Claude Shannon


Isaak Yaglom

Claude Shannon's channel-capacity work is developed from first principles in four chapters: probability; entropy and information; the calculation of information as a tool for solving logical problems; and applications to information transmission.
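For orientation, the quantity the book builds toward can be stated compactly (standard formulations, not quotations from the book): the capacity of a channel is the maximum mutual information between input and output over all input distributions, which for a band-limited channel with additive white Gaussian noise reduces to the familiar closed form.

```latex
% Capacity as the maximum mutual information over input distributions p(x):
C = \max_{p(x)} I(X;Y)

% Specialization to a band-limited AWGN channel of bandwidth B and
% signal-to-noise ratio S/N (the Shannon-Hartley theorem):
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```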

Petoskey, Michigan

Petoskey is the birthplace of information theorist Claude Shannon and Civil War historian Bruce Catton, and the boyhood home of singer-songwriter Sufjan Stevens.

W. Daniel Hillis

In 1985, continuing his research on the Connection Machine, Hillis received a PhD in EECS from MIT under doctoral advisers Gerald Jay Sussman, Marvin Minsky and Claude Shannon.


Dissociated press

The article included the Turbo Pascal source for two versions of the generator, one using Hayes' algorithm and another using Claude Shannon's Hellbat algorithm.
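Neither listing is reproduced here, but dissociated-press generators are generally based on Shannon's n-gram approximations to English: text is regenerated by repeatedly sampling what followed each short run of words in the source. The Python below is a minimal word-level sketch of that technique, not the article's Turbo Pascal source; the sample text and order-2 setting are arbitrary choices.

```python
# Word-level Markov-chain text generator in the dissociated-press style.
# Illustrative sketch only; `order` and the demo text are arbitrary.
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each run of `order` consecutive words to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, order=2, length=30):
    """Emit `length` words by sampling successors; `order` must match build_chain."""
    state = random.choice(list(chain))
    out = list(state)
    for _ in range(length):
        successors = chain.get(state)
        if not successors:                     # dead end: restart at a random state
            state = random.choice(list(chain))
            successors = chain[state]
        out.append(random.choice(successors))
        state = tuple(out[-order:])
    return " ".join(out)

if __name__ == "__main__":
    sample = ("the quick brown fox jumps over the lazy dog and the quick "
              "brown cat naps while the lazy fox watches the brown dog")
    print(generate(build_chain(sample), length=20))
```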

Entropy in thermodynamics and information theory

There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy of Claude Shannon and Ralph Hartley, usually expressed as H, developed in the 1940s.
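Stated side by side (standard forms, not quoted from either source), the parallel is immediate: the two expressions differ only in Boltzmann's constant k_B and the base of the logarithm.

```latex
% Gibbs entropy of a system whose microstates occur with probabilities p_i:
S = -k_B \sum_i p_i \ln p_i

% Shannon entropy of a source whose symbols occur with probabilities p_i:
H = -\sum_i p_i \log_2 p_i
```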

Euler diagram

Thus the matter rested until 1952, when Maurice Karnaugh (1924–) adapted and expanded a method proposed by Edward W. Veitch. This work relied on the truth-table method precisely defined in Emil Post's 1921 PhD thesis "Introduction to a general theory of elementary propositions" and on the application of propositional logic to switching logic by (among others) Claude Shannon, George Stibitz, and Alan Turing.

Shannon–Fano coding

In the field of data compression, Shannon–Fano coding, named after Claude Shannon and Robert Fano, is a technique for constructing a prefix code based on a set of symbols and their probabilities (estimated or measured).
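A minimal sketch of the construction (the top-down splitting usually attributed to Fano): sort the symbols by probability, split the list where the two halves' total probabilities are closest, assign 0 to one half and 1 to the other, and recurse. The symbol set and probabilities below are invented for illustration.

```python
# Sketch of Shannon-Fano coding, not a reference implementation; the
# symbols and probabilities in the demo are invented for illustration.

def shannon_fano(symbols):
    """symbols: iterable of (symbol, probability); returns {symbol: codeword}."""
    codes = {}

    def split(items, prefix):
        if len(items) == 1:
            codes[items[0][0]] = prefix or "0"   # a lone symbol still needs one bit
            return
        total = sum(p for _, p in items)
        # Choose the split point that makes the halves' probabilities closest.
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(items)):
            running += items[i - 1][1]
            diff = abs(2 * running - total)      # equals |left - right|
            if diff < best_diff:
                best_i, best_diff = i, diff
        split(items[:best_i], prefix + "0")      # higher-probability half gets 0
        split(items[best_i:], prefix + "1")

    split(sorted(symbols, key=lambda sp: sp[1], reverse=True), "")
    return codes

if __name__ == "__main__":
    demo = [("a", 0.35), ("b", 0.17), ("c", 0.17), ("d", 0.16), ("e", 0.15)]
    for sym, code in sorted(shannon_fano(demo).items()):
        print(sym, code)
```

Because every split appends a distinct bit to the two halves, no codeword is a prefix of another, so the code is uniquely decodable; unlike Huffman coding, however, Shannon–Fano coding is not always optimal.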

The Information: A History, a Theory, a Flood

Starting with the development of symbolic written language (and the eventual perceived need for a dictionary), Gleick examines the history of intellectual insights central to information theory, detailing the key figures responsible, such as Claude Shannon, Charles Babbage, Ada Byron, Samuel Morse, Alan Turing, Stephen Hawking, Richard Dawkins and John Archibald Wheeler.

