Due to the all-or-nothing nature of action potentials, single spikes can encode information only in their interspike intervals (ISIs).
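One way to make this concrete is to estimate the Shannon entropy of the ISI distribution of a spike train: the more variable the intervals, the more bits each interval can in principle carry. Below is a minimal sketch using a hypothetical toy spike train and an arbitrary 2 ms binning (both are illustrative assumptions, not data or parameters from the source).

```python
import math
from collections import Counter

# Hypothetical spike times in milliseconds (toy data for illustration only)
spike_times = [0.0, 3.1, 7.4, 9.0, 14.2, 15.9, 21.3, 24.0, 30.5]

# Interspike intervals: differences between consecutive spike times
isis = [t2 - t1 for t1, t2 in zip(spike_times, spike_times[1:])]

# Discretize ISIs into 2 ms bins (arbitrary choice) and count occurrences
bin_width = 2.0
counts = Counter(int(isi // bin_width) for isi in isis)
total = sum(counts.values())

# Shannon entropy of the binned ISI distribution, in bits per interval
entropy_bits = -sum((c / total) * math.log2(c / total)
                    for c in counts.values())
```

With finer bins (or real recordings) the estimate changes, which is why empirical ISI entropy estimates are usually reported together with the bin width.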
Similarly, in his 2003 book Information Theory and Evolution, the chemist John Avery presents the phenomenon of life, including its origin and evolution as well as human cultural evolution, as grounded in thermodynamics, statistical mechanics, and information theory.
The field of ecosystem network analysis, developed by Robert Ulanowicz and others, applies concepts from information theory and thermodynamics to study how these networks evolve over time.
Research topics span theoretical computer science, including formal languages and formal methods; more mathematically oriented areas such as information theory, optimization, and complex systems; and application-driven topics such as bioinformatics, image and video compression, handwriting recognition, computer graphics, medical imaging, and content-based image retrieval.
Sanjeev Ramesh Kulkarni (born Mumbai, India, September 21, 1963) is Professor of Electrical Engineering at Princeton University, where he teaches and conducts research in a broad range of areas including statistical inference, pattern recognition, machine learning, information theory, and signal/image processing.
Starting with the development of symbolic written language (and the eventual perceived need for a dictionary), Gleick examines the history of intellectual insights central to information theory, detailing key figures such as Claude Shannon, Charles Babbage, Ada Byron, Samuel Morse, Alan Turing, Stephen Hawking, Richard Dawkins and John Archibald Wheeler.
He has studied the application of information theory to problems in biology and published his conclusions in the Journal of Theoretical Biology from 1974 onwards.
The physical chemist Arieh Ben-Naim rederived the Sackur–Tetrode equation for entropy in terms of information theory, and in doing so tied it to well-known concepts from modern physics.
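The Sackur–Tetrode equation gives the entropy of a monatomic ideal gas from first principles, S = N k [ln(V/(N λ³)) + 5/2], where λ = h/√(2πmkT) is the thermal de Broglie wavelength. A quick numerical check is to evaluate it for argon at standard conditions, where it is known to reproduce the measured molar entropy of about 155 J/(mol·K). The sketch below is an illustrative computation, not Ben-Naim's derivation itself.

```python
import math

# CODATA physical constants (SI units)
k = 1.380649e-23        # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J*s
N_A = 6.02214076e23     # Avogadro constant, 1/mol

# Argon at roughly standard conditions
m = 39.948e-3 / N_A     # mass of one argon atom, kg
T = 298.15              # temperature, K
P = 101325.0            # pressure, Pa

# Volume per particle from the ideal gas law: V/N = kT/P
v_per_particle = k * T / P

# Thermal de Broglie wavelength
lam = h / math.sqrt(2 * math.pi * m * k * T)

# Sackur-Tetrode entropy per particle, then per mole
s_per_particle = k * (math.log(v_per_particle / lam**3) + 2.5)
s_molar = s_per_particle * N_A   # J/(mol*K)
```

The logarithm's argument, V/(N λ³), counts roughly how many thermal wavelength cells each atom can occupy, which is what invites the information-theoretic reading of the formula as a measure of positional and momentum uncertainty.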