Many scientists and engineers, including, surprisingly, some psychologists and neuroscientists, seem to think that the word “information” (and its equivalents in other languages) refers to what Claude Shannon’s ground-breaking 1948 paper called “information”: a measurable property of signals that can be stored, transmitted, compared, compressed, decompressed, corrupted, repaired, encrypted, decrypted, etc. (Shannon, 1948).
Many of Shannon’s admirers seem to have forgotten that there is a much older, widely used, theoretically important notion of “information”, which was familiar to Jane Austen and used in her novels, and which also occurs in non-technical, conversational uses of the word “information”. This concept of information is essential for our understanding of biological evolution and its products (including humans), and for attempts to understand what natural intelligence is and how it works, including attempts to model and replicate natural intelligence in machines.
Shannon himself did not make the mistake of conflating the old concept of semantic information with what he called “information”. Margaret Boden comments on this in her two-volume survey of cognitive science and its history (2006):