Dick Pountain/PC Pro/Idealog 209 - 12/12/2011
It shouldn't come as any huge surprise to regular readers if I admit being agnostic with respect to the newly-founded cult of Jobsianism. Agnostic rather than atheist, because I did greatly admire some of Steve Jobs' qualities, most particularly his taste for great design and his uncanny skill at divining what people would want if they saw it, rather than what they want right now. On the other hand I found Jobs' penchant for proprietary standards, monopoly pricing, patent trolling and "walled-garden" paternalism deeply repugnant - and to judge from Walter Isaacson's new authorised biography I wouldn't have much cared to drink beer or drop acid with him. In my own secular theology Jobs will now be occupying a plateau somewhere on the lower slopes of Mount Nikon (which used to be called Mount Olympus before the Gods dumped the franchise in protest at that accounting scandal) alongside other purveyors of beautiful implements like Leo Fender and Enzo Ferrari.
So which figures from the history of computing would I place higher up the slopes of Mt Nikon? Certainly Dennis Ritchie (father of the C language) and John McCarthy (father of Lisp), both of whom died within a week or so of Jobs and whose work helped lead programming out of its early primitivism - there they could resume old arguments with John Backus (father of Fortran). But on a far higher ledge, pretty close to the summit, would be the extraordinarily talented physicist Richard Feynman, whom not very many people would associate with computing at all. I've just finished reading his 1985 book, which I had somehow overlooked, called "QED: The Strange Theory of Light and Matter", a collection of four public lectures he gave in New Zealand and California explaining quantum electrodynamics for a popular audience. The book amply demonstrates Feynman's brilliance as a teacher who applied sly humour and inspired metaphors to explain the most difficult of subject, er, matter. He cleverly imagines a tiny stopwatch whose second hand represents the phase of a travelling photon, and through this simple device explains all the phenomena of optics, from reflection and refraction to the two-slit quantum experiment, more clearly than I've ever seen before. But what does this have to do with computers?
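For the programmers among you, Feynman's stopwatch translates into code rather neatly: each path a photon might take contributes a little arrow (a complex number) whose angle is set by the travel time, and the probability of an event is the squared length of all the arrows added head-to-tail. Here's a minimal Python sketch (mine, not Feynman's) of the book's partial-reflection example, assuming an amplitude of about 0.2 per glass surface, as he gives it; the half-turn flip of the front-surface arrow follows his convention so that a vanishingly thin sheet reflects nothing:

    import numpy as np

    r = 0.2  # reflection amplitude per glass surface (~4% probability each)

    def reflection_probability(extra_phase):
        # The front-surface arrow gets a half-turn flip; the back-surface
        # arrow is rotated by the photon's extra round trip through the glass.
        front = -r
        back = r * np.exp(1j * extra_phase)
        # Add the arrows head-to-tail and square the resultant's length.
        return abs(front + back) ** 2

    for phase in np.linspace(0, 2 * np.pi, 5):
        print(f"phase {phase:4.2f} rad -> reflection {reflection_probability(phase):5.1%}")

The answer sweeps between 0% and 16% as the sheet's thickness varies, which is exactly the oscillation the book describes.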
Well, having finished his book in a warm glow of new-found understanding, I was prompted to take down a book I've mentioned here before, by Carver Mead, the father of VLSI (very large scale integration) technology and an ex-student of Feynman's at Caltech. Mead's "Collective Electrodynamics" defends the wave view of sub-atomic physics preferred by Einstein but rejected by Niels Bohr (and the majority of contemporary physicists), using ideas about photon absorption taken directly from Feynman. But, once again, what does this have to do with computers? Quite a lot actually. In his introduction to "Collective Electrodynamics" Mead offers an anecdote from the early 1960s which includes these remarks:
"My work on [electron] tunnelling was being recognized, and Gordon Moore (then at Fairchild), asked me whether tunnelling would be a major limitation on how small we could make transistors in an integrated circuit. That question took me on a detour that was to last nearly 30 years, but it also lead me into another collaboration with Feynman, this time on the subject of computation." Mead presented their preliminary results in a 1968 lecture and "As I prepared for this event, I began to have serious doubts about my sanity. My calculations were telling me that, contrary to all the current lore in the field, we could scale down the technology such that *everything got better*" In fact by 1971 Mead and Feynman were predicting Moore's Law, from considerations of basic quantum physics.
Now utopian predictions about the potential power of quantum computers are the flavour of this decade, but it's less widely appreciated that our humble PCs already depend upon quantum physics: QED and its sister discipline QCD (quantum chromodynamics) underlie all of physics, all of chemistry, and actually all of everything. The band gap of silicon that makes it a semiconductor and enables chips to work is already a quantum phenomenon. The first three of Feynman's lectures in "QED" are mostly about photons, but his last chapter touches upon "the rest of physics", including Pauli's Exclusion Principle. Electrons are such touchy creatures that at most two of them, and then only with opposite spins, can ever occupy the same state, a seemingly abstract principle which determines the ways that atoms can combine, that's to say, all of chemistry, cosmology, even biology. It's why stone is hard and water is wet. Stars, planets, slugs, snails, puppy dogs' tails, all here thanks to the Exclusion Principle, which is therefore as good a candidate as any for the bounteous creative principle in my little secular theology. Its dark sworn enemy is of course the 2nd Law of Thermodynamics: in the end entropy or chaos must always prevail.
It seems I've reinvented a polytheistic, materialistic version of Zoroastrianism, a Persian religion from around 600 BC. At the centre of my theology stands cloud-capped Mount Nikon, its slopes teeming with great minds who advanced human understanding like Aristotle, Spinoza and Nietzsche, with ingenious scientists like Einstein and Feynman, and lower down with talented crazies who gave us beautiful toys, like Steve Jobs.