Saturday 23 July 2022

MORE MEMORABLES

Dick Pountain /Idealog 329/ 04 Dec 2021 02:59

Last year I wrote here about three technical breakthroughs that were announced during November 2020 – an mRNA vaccine against Covid-19, Alphafold 2 solving the ‘protein folding problem’, and Apple’s M1 CPU. I’d like to repeat this exercise for November 2021, and though my picks for this year’s breakthroughs are equally profound – one in neuroscience, one in space exploration and one in quantum computing – they’re perhaps a little more difficult to explain.

The neuroscience discovery is simple enough to state: the human neuron appears to be several orders of magnitude more computationally complex than the silicon ‘neurons’ used to emulate it. Given the number of connections each neuron makes, this implies the whole brain is vastly more complex than any current AI. A team at the Hebrew University of Jerusalem trained an artificial deep neural network to mimic the computations of a simulated biological neuron, and the result suggests that the artificial net needs between five and eight ‘hidden layers’ to represent the complexity of a single biological neuron. 

Deep-learning networks have created the current explosion in AI abilities like driverless cars, Alexa, Google Translate, AlphaFold and so on. They all employ several layers – each containing a large number of silicon ‘neurons’ – between their input and output layers, which when trained on millions of examples configure themselves into a ‘black box’ that classifies inputs into outputs. How many hidden layers are appropriate remains an active research question: more isn’t always better, and the right number tends to be in single figures. Now it turns out that the classificatory power of a single human neuron is equivalent to a many-layered AI net. Our brain runs these mostly on coffee and chocolate digestives, while training a large deep-learning AI net consumes so much electrical power it’s becoming a threat to the climate.
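The shape of such a network – an input layer, a stack of hidden layers, an output layer that yields a classification – can be sketched in a few lines. This is a minimal illustration in plain NumPy, not the Jerusalem team's actual model; the layer sizes and random weights are purely for show.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # The standard 'rectifier' activation used in most deep nets
    return np.maximum(x, 0.0)

# Illustrative sizes: 10 inputs, seven hidden layers of 64 units, 2 output classes
sizes = [10, 64, 64, 64, 64, 64, 64, 64, 2]
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    """Pass an input vector through every layer; ReLU on hidden layers only."""
    for w in weights[:-1]:
        x = relu(x @ w)
    logits = x @ weights[-1]
    # Softmax turns the final layer's outputs into class probabilities
    e = np.exp(logits - logits.max())
    return e / e.sum()

probs = forward(rng.normal(size=10))  # two probabilities that sum to one
```

Training would adjust those weight matrices over millions of examples; the point here is simply that the ‘depth’ in deep learning is this stack of hidden layers, five to eight of which it apparently takes to match one biological neuron.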

At the other end of the distance scale, we’ve all thrilled to those colour photos of galaxies and nebulae captured by the Hubble Telescope. It’s so sensitive it can collect light from far-off objects, light which left them not long after the Big Bang and has been travelling for billions of years since. Not close enough to the Big Bang, however, to answer some of the most important questions in cosmology. Thanks to the expansion of the universe the very oldest objects get red-shifted into the infrared, while Hubble works with visible light. As I write we’re just two weeks away from the launch of the James Webb Space Telescope, which works in the IR and, if successful, will push our vision back far enough to perhaps rewrite the story of the origins of the universe.

I say ‘if’ because the Webb incorporates the most ambitious space engineering ever attempted. A house-sized mirror, made from hexagonal tiles of toxic beryllium, was polished to the greatest smoothness ever achieved, then folded up to fit into the rocket. In space it has to unfold correctly, and each tile must be constantly realigned by computers. And it needs to work close to absolute zero, so it must travel to the second Lagrange point, where the combined gravity of sun and earth holds it in an orbit that keeps pace with our planet, then unfurl a vast, flimsy sunshade made of five layers of metallised film. Superb engineering, but the best of luck is still required... 

Richard Feynman once prophesied that a use for quantum computers would be simulating reality at the quantum level, and in this year’s breakthrough a quantum simulator has solved a real physics problem. A ‘quantum spin liquid’ is a novel state of matter in which a magnetisable substance can’t settle into a fixed polarity because its unusual threefold crystal symmetry frustrates the pairing of neighbouring spins. There’s a mineral, marvellously named Herbertsmithite, which may exhibit this state, giving it interesting and perhaps useful optical and magnetic properties, but directly studying quantum entanglement inside a sample is impossible. 

Simulating its lattice on a conventional computer takes months, but teams at Harvard and MIT recently simulated it in reasonable time using an unfashionable kind of quantum computer that employs neutral atoms rather than the superconducting circuits used by Google and IBM. This isn’t a general-purpose computer: it builds a lattice of neutral atoms with the right symmetry using laser beams both to place the atoms and to switch their so-called Rydberg states to make them act as qubits. Neutral-atom qubits are longer-lived and more robust against environmental decoherence than superconducting ones, and this success may refocus attention on the technology.

This last breakthrough tickles me for a different, less exalted reason though: “Herbertsmithite, named after the mineralogist Herbert Smith (1872–1953), is generally found in and around Anarak, Iran, hence its other name, anarakite”. We all know that kryptonite can counteract the powers of Superman, so perhaps anarakite might have a similar protective property against online computer super-nerds.

[Dick Pountain would like an anarakite amulet for Christmas]

MOLTEN METAPHOR

Dick Pountain /Idealog 328/ 05 Nov 2021 09:03

The little-known corollary to Moore’s Law is that computer columnists have to mention it in a column at least once a decade, and I am no exception. The miracle of digital search tells me that I’ve covered it here in 1996, 2004, 2005, 2009 and 2019 (the numerate among you will deduce that it’s a pretty rough sort of law). That first date is significant because it was in PC Pro issue 18, two years before the most popular piece I ever wrote for Byte, ‘The Limits Of Computing’ in 1998, about how and when Moore’s Law would end. In that piece I talked about the difficulty of getting down to 0.1 micron feature sizes in silicon CPUs: it would require short-wavelength UV light from exotic krypton/fluorine lasers to illuminate the masks whose reduced images create the circuit patterns via the lithographic fabrication process. Getting much smaller would require X-rays, which are intractable for many reasons.

Skip forward a quarter of a century: Moore’s Law is still at work, and Intel’s and Apple’s latest chip families are pushing feature sizes down toward 10 nanometres (0.01 microns in old money). They do that by using EUV (Extreme UV) illumination, the last stop before those dreaded X-rays, and this is very, very difficult indeed to generate. I found myself getting interested in fab tech again, and decided to read up on EUV. 

There’s only one firm in the world that makes the EUV machines for this chip generation. It’s a Dutch firm called ASML (nothing at all to do with ASMR) and you can have one of their machines for somewhere around $180,000,000 depending on what bells and whistles you need. You’ll need to rent a new garage, around the size of a football field, and a couple of cranes to house it, and your electricity bill may rise. The reason for this price and size is that 13.5nm EUV radiation is totally absorbed by both glass and air, so everything has to be performed in the highest possible vacuum and with mirrors rather than lenses. The mirrors, by Zeiss, are the flattest ever made, flatter even than the ones in space telescopes. A whole array of vacuum pumps employs little rotors spinning at 30,000rpm that bat air molecules away one at a time like ping-pong balls. Oh, and you can’t use any old incandescent lamp to generate these rays. In fact the only way they found to generate 13.5nm EUV is by blasting a tiny drop of molten tin with two hits from a powerful laser, microseconds apart, which explode it into a tiny ball of glowing plasma. Over and over again, like a sort of hot, tinny ink-jet printer. Phew.
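Why the wavelength matters can be seen from the standard Rayleigh criterion for lithographic resolution, minimum feature ≈ k1 × λ / NA. A back-of-envelope sketch, assuming typical published ballpark values for k1 and numerical aperture rather than any actual ASML specification:

```python
def min_feature_nm(wavelength_nm, numerical_aperture, k1):
    """Rayleigh criterion: smallest printable feature, in nanometres."""
    return k1 * wavelength_nm / numerical_aperture

# Assumed illustrative values: k1 ~ 0.4, NA ~ 0.33 for early EUV scanners,
# NA ~ 1.35 for 193nm immersion DUV (water between lens and wafer)
euv = min_feature_nm(13.5, 0.33, 0.4)   # roughly 16 nm
duv = min_feature_nm(193.0, 1.35, 0.4)  # roughly 57 nm
```

On these rough numbers the shorter wavelength buys a several-fold jump in resolution even at a much lower numerical aperture, which is why the industry put up with molten-tin plasmas and all-mirror optics to get it.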

I’ve always been fascinated by molten metal. I love the stuff, perhaps because my father worked in the steel industry. As a small boy I watched Bessemer Converters blowing, the best firework display in the world, from just far enough away that I didn’t catch fire (health and safety was lax back then). At home I melted down all my lead soldiers in a pan on the gas stove and poured the metal into two small alloy jelly moulds in the shape of tortoises, which became our doorstops for years (I did spill a bit on the kitchen floor, which remained a burned-in silver splatter for years too). 

My favourite movie was ‘The Hunchback of Notre Dame’, in which Charles Laughton, as Quasimodo, stands on the roof of the Gothic cathedral chanting “Molten Metal! Molten Metal!” while he pours boiling lead onto the soldiers below. Studying chemistry in London I was often faced with molten metal, usually sodium, not always on purpose. When later I became a magazine publisher I used to visit marvellous old printing plants in the East End that still employed Linotype machines, in which molten lead from a small vat is cast into whole lines of type. 

You get the picture. I just find the image of a hugely expensive ink-jet printer that fires drops of molten tin to be blasted into plasma very, very powerful indeed. It’s a bit of a cliché to compare the silicon chip industry to the building of Gothic cathedrals, in both effort and ambition, but ASML’s EUV machine makes it a very tempting metaphor. It’s certainly as costly, and getting on for a similar-sized floor plan. Instead I’ll deflate the rhetorical atmosphere slightly with a humbler metaphor: predicting the end of Moore’s Law is very like making a long car journey with a small child who keeps asking “Are we there yet? Are we there yet?”. And I can now tell you that the answer is a resounding “Not yet!... Soon!”.

SOCIAL UNEASE

Dick Pountain /Idealog 350/ 07 Sep 2023 10:58 Ten years ago this column might have listed a handful of online apps that assist my everyday...