Monday, 2 July 2012

LONG DISTANCE INFORMATION

Dick Pountain/Thu 20 November 2003/10:00 am/Idealog 112

A year or so back, in issue 98, I devoted this column to information theory, and expressed the opinion that its inventor, Claude Shannon, deserves to be ranked alongside Albert Einstein and Alan Turing among the intellectual giants of the 20th century. I still feel that way, rather more so now in fact, having recently read up on some of the latest thinking in theoretical physics and cosmology - one of the more promising strands of thought treats the whole physical universe as being made from information, with matter and energy appearing only as side-effects.

The starting point for this theory was the study of black holes, and the questions they raise about how much information can be crammed into a given region of space or quantity of matter. For example, how much information would be required to describe the whole universe, and could it all be compressed into the memory of a computer, as many sci-fi virtual-reality fantasies require? It turns out that there are absolute limits on the amount of information that can be so stored, and that this 'world in a grain of sand' fantasy is impossible. The argument revolves around the concept of entropy.

In physics, entropy measures the randomness or disorderliness of a system, defined by the number of different states the system could be in at any given time - for example, all the possible arrangements of the molecules in a sample of gas (a very large number indeed). For Shannon, on the other hand, entropy was a measure of the amount of information contained in a message, counted in bits. It turns out that these two quantities are conceptually related - think of the Shannon entropy as the length of the email you'd need to send to Maxwell's demon to instruct him to shuffle the gas molecules into one specified arrangement out of all those possible ones. When converted into the same units, however, the two measures differ enormously in size. A silicon RAM chip containing a gigabyte of data has a Shannon entropy of precisely 8x10^9 bits, but its thermodynamic entropy is many orders of magnitude larger, around 10^23 bits. This is because the Shannon entropy only cares whether each RAM cell contains a 0 or a 1, while the physical entropy cares about the state of every silicon atom and electron in every transistor. Only when chip fabrication reaches the point where every individual atom stores a bit would the two quantities become equal.
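To make that comparison concrete, here is a rough back-of-the-envelope sketch in Python (my own illustration, not part of the original argument: the 10^23 figure is simply the order-of-magnitude estimate quoted above, and converting a bit into conventional entropy units uses Boltzmann's constant times ln 2):

    import math

    K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

    # Shannon entropy of a one-gigabyte chip: one bit per memory cell
    shannon_bits = 8e9

    # Thermodynamic entropy of the same chip, expressed in bits - the rough
    # 10^23 figure quoted above, counting every atom and electron
    thermo_bits = 1e23

    # Each bit of information corresponds to k_B * ln(2) of thermodynamic
    # entropy, so the same figure in conventional units (J/K) is:
    thermo_conventional = thermo_bits * K_B * math.log(2)

    print(f"Shannon entropy:       {shannon_bits:.0e} bits")
    print(f"Thermodynamic entropy: {thermo_bits:.0e} bits (~{thermo_conventional:.1e} J/K)")
    print(f"Ratio: about {thermo_bits / shannon_bits:.0e}")

The ratio comes out at around 10^13, which is why 'many orders of magnitude larger' is no exaggeration.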

Or rather they wouldn't, because atoms are of course made up of quarks (which may in turn be made of superstrings, and so on), so that 10^23 is still a massive underestimate, based on observing the system only at the atomic level of complexity at which engineers and chemists operate. However, at temperatures far below those of the Big Bang the arrangement of quarks doesn't change, so in practice ignoring that level is workable. Will we eventually be able to make chips that use one quark, or one superstring, per bit? This is where the postulated limit on information storage arises, and it was discovered by physicists studying gravity in black holes.

A black hole represents an absolute limit on the density to which matter can be compressed, and the area of its event horizon - the boundary beyond which no information can escape from the hole - is what sets an absolute limit on the storage density of information. When matter falls into a black hole its mass and angular momentum get added to those of the hole, but what happens to its entropy? Stephen Hawking and Demetrios Christodoulou proved in 1970 that the area of a black hole's event horizon can never decrease, while Jacob Bekenstein postulated that the entropy of a black hole is proportional to this area, and that when matter falls in, the entropy of the hole increases to compensate for the entropy lost from the outside world, thus preserving the Second Law of Thermodynamics. Rafael Sorkin has calculated the relationship between black hole entropy and horizon area: the entropy turns out to be one quarter of the area expressed in Planck units (the square of the Planck length, about 10^-33 centimetres, a fundamental unit of length in quantum physics). This makes the entropy of black holes rather large: a hole one centimetre in diameter could hold around 10^66 bits.
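As a sanity check on that last figure, here is a small Python sketch of the area-to-entropy relationship described above (entropy = area/4 in Planck units; the constants are standard values, and treating the horizon as a simple sphere is my own simplification):

    import math

    PLANCK_LENGTH = 1.616e-35   # metres
    RADIUS = 0.005              # metres: a hole one centimetre in diameter

    area = 4 * math.pi * RADIUS ** 2             # horizon area in square metres
    area_planck = area / PLANCK_LENGTH ** 2      # the same area in Planck units
    entropy_nats = area_planck / 4               # entropy = area/4 (natural units)
    entropy_bits = entropy_nats / math.log(2)    # converted to bits

    print(f"Horizon area: {area_planck:.1e} Planck areas")
    print(f"Entropy:     ~{entropy_bits:.1e} bits")

This comes out at a few times 10^65 bits - the same order of magnitude as the figure quoted above.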

Here's where the limit comes in: if a chunk of matter of surface area A collapses to a black hole, the hole's horizon area must end up less than A, and since entropy cannot decrease, the entropy the matter originally contained cannot have been more than A/4. Of course I've greatly oversimplified this argument, which needs to be modified for an expanding universe full of matter and energy, but that's the bottom line. What's really mind-boggling about all this is that the maximum possible information capacity depends not on volume but on surface area, which is pretty counter-intuitive. There is a way of making sense of it though, using the Holographic Principle suggested by Nobel laureate Gerard 't Hooft and by Leonard Susskind: in essence, the physics of a region of space of N dimensions can be completely described by a physical theory defined only on the N-1 dimensional boundary of the region, the way a two-dimensional hologram contains the description of a three-dimensional image. The study of space-time may soon have to give way to the study of pure information exchange.
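To see just how counter-intuitive that area scaling is, here is a toy comparison (again my own illustration, under the same assumed constants): count the bits a one-metre-radius region could hold if every Planck-sized volume stored one bit, then compare with the area bound.

    import math

    PLANCK_LENGTH = 1.616e-35   # metres
    r = 1.0                     # a region one metre in radius

    # Naive guess: one bit per Planck-sized volume (scales with r cubed)
    volume_bits = (4 / 3) * math.pi * r ** 3 / PLANCK_LENGTH ** 3

    # Holographic bound: area/4 in Planck units, in bits (scales with r squared)
    area_bits = (math.pi * r ** 2 / PLANCK_LENGTH ** 2) / math.log(2)

    print(f"One bit per Planck volume: ~{volume_bits:.0e} bits")
    print(f"Holographic (area) bound:  ~{area_bits:.0e} bits")

The naive volume count exceeds the area bound by some thirty-five orders of magnitude, so almost all of those naively counted states simply cannot fit inside the region.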

