Tuesday 3 July 2012

INFORMOTION THEORY

Dick Pountain/11 June 2010 11:02/Idealog 191

Claude Shannon founded our Information Technology industry when he published his classic paper "A Mathematical Theory of Communication" in 1948. It led directly to the error-correction codes that make possible hard disks, DVDs, network protocols and more. To make his great breakthrough Shannon had arrived at two crucial abstractions: he abstracted away from the physical fabric that carries messages, thinking of them as pure streams of abstract bits whose values distinguish one message from another, and he deliberately excluded questions about the *meaning* of messages. These abstractions permitted him to measure information as though it were a substance, so we now talk about the bandwidth of a connection in Mbits/sec. Living with these necessary abstractions for so long tempts us to believe that information actually is a substance, spawning a recent spate of books that claim the universe is made out of information - the latest of these is Vlatko Vedral's "Decoding Reality: The Universe as Quantum Information" (OUP 2010). Thing is, I don't believe them.
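
(For the curious, here's a minimal sketch of Shannon's measure in Python - my illustration, not his notation. It counts symbol frequencies in a message and reports the average information per symbol in bits, with no reference whatsoever to what the message means; the example message is arbitrary.)

```python
# Shannon entropy of a message: information measured purely from symbol
# frequencies, ignoring meaning entirely. The example message is arbitrary.
from collections import Counter
from math import log2

def entropy_bits_per_symbol(message: str) -> float:
    """Average information per symbol, in bits."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

msg = "the universe is made of information"
per_symbol = entropy_bits_per_symbol(msg)
print(f"{per_symbol:.2f} bits per symbol")
print(f"{per_symbol * len(msg):.1f} bits in the whole message")
```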

To me this is to put things exactly the wrong way round: information is not a substance, it's actually the location in space and time of the one substance that exists, matter. That might call for a little more explanation. Shannon wrote about messages conveyed by electrons down copper wires, but electrons are part of matter and, since Einstein, so are photons or any other kind of energy. It's their patterns of presence and absence in space, in time, or both that support messages. But you may object that "matter" is nowadays a far-from-obvious concept. Once we thought it was atoms, then protons, neutrons and electrons, then quarks, next maybe strings? This makes information equally ambiguous: the amount of it needed to describe a system depends on the scale you're looking at. You're currently reading patches of black ink that your eyes and brain automatically isolate from the white paper background of this page and interpret as words: under a microscope you'd see each letter is a collection of printers' dots; a forensic chemist could identify different compounds in the ink, and a physicist the atoms composing those compounds. So how much information is contained in a word? It depends on the scale at which you're sampling it, and why.
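
(A back-of-envelope sketch of that scale-dependence, again in Python and with round numbers of my own choosing: describing the word "cat" as three letters takes a handful of bits, while describing it as a grid of printers' dots takes thousands.)

```python
# Illustrative arithmetic only: the bit-counts below come from assumed,
# round-number scales, not from any measurement.
from math import log2

letters = 3                   # the word "cat"
alphabet = 26                 # lower-case letters only
bits_as_text = letters * log2(alphabet)

grid = 50 * 50                # each letter rasterised on a rough 50x50 grid
bits_per_dot = 1              # ink or no ink
bits_as_dots = letters * grid * bits_per_dot

print(f"as text: {bits_as_text:.0f} bits")
print(f"as dots: {bits_as_dots} bits")
```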

Shannon's Bell Labs colleague Nyquist left us a deep understanding of sampling: you can extract just part of the information needed to describe a system and obtain a less precise but smaller copy, just as a CD recording does with sound waves and your eye or digital camera does to the torrent of light pouring in from the outside world. In fact all living things exist by sampling information, to discover what's going on both inside and outside their own skins or cell walls. You could almost use this as a definition of life - a chunk of matter that starts to sample information from the rest of the universe. This is a rather different claim from saying that the universe is made of information.
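
(Another sketch, with illustrative numbers that are my own choices rather than anything of Nyquist's: sample a 440 Hz tone at the CD rate of 44,100 samples per second, then keep only every eighth sample, giving a smaller, coarser copy of the same signal.)

```python
# Sampling a signal, CD-style, then thinning it out into a smaller but less
# precise copy. The tone, rate and decimation factor are arbitrary choices.
import math

def sample(signal, duration_s, rate_hz):
    """Sample a function of time at a fixed rate."""
    n = int(duration_s * rate_hz)
    return [signal(i / rate_hz) for i in range(n)]

def decimate(samples, factor):
    """Keep every factor-th sample: fewer numbers, coarser description."""
    return samples[::factor]

tone = lambda t: math.sin(2 * math.pi * 440 * t)   # a 440 Hz sine wave
cd = sample(tone, duration_s=0.01, rate_hz=44100)
coarse = decimate(cd, 8)
print(len(cd), "samples at CD rate,", len(coarse), "after decimation")
```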

Living things sample their surroundings to avoid danger, find food and reproduce, and each has its own repertoire of chemically-driven actions to achieve these ends. These actions, triggered when certain kinds of message are detected, can be called "emotions". What we normally call emotions - happiness, sadness, anger and so on - are very complex and evolved examples, but for a one-celled creature they might be as simple as swimming towards or away from some particular environmental chemical. Recent advances in neuroscience reveal how these emotional subsystems work in higher animals. Even intentions depend upon emotions: you can't so much as twitch a finger unless you "want to", which requires a little squirt of dopamine in your brain. Advanced nervous systems like our own also possess a memory function that stores certain experiences which may be useful in deciding future actions, and it seems likely that what we actually store are processed information samples - sights, sounds, smells - tagged with a marker of the particular emotional state they triggered when first gathered.

So, we should be able to extend Information Theory by reintroducing some notion of meaning, and I propose to call the result "Informotion Theory". A message's meaning for an animal is the emotion/action combination that it triggers, which might be anything from leaping out of a window to recalling a childhood memory, sparking a certain train of thought or changing your mood. Perhaps so much of 20th-century linguistic philosophy feels sterile and circular ("snow is white" is true if and only if snow is white) because it hasn't caught up with such developments. Logic and reason are still needed, but any message received by a real person causes memories to be retrieved in deciphering it, which bring with them emotional states that can't be avoided - the meaning of the message doesn't reside wholly in the symbol stream but partly in the memory of the recipient. It's only because we share so many basic experiences (like learning the same language as children) that we can communicate at all. If you'd like to know more, I've put "Sampling Reality", an abridged first volume of my book on this topic, on my website at http://www.dickpountain.co.uk/home/sampling-reality. It's not too big and it's not too clever; do give it a try...

[biog: "Dick Pountain is still smarting at his rejection slip from Mills and Boon".]
