Thursday 3 March 2022

Q’S GADGETS


                      Dick Pountain /Idealog 323/ 06 Jun 2021 09:10

Random House Business recently sent me a copy of WIRED magazine’s little guide book ‘Quantum Computing: how it works and why it could change the world’ by Amit Katwala, and coming from such a source I felt it deserved mention here. The good news is that it’s an excellent introduction to the current state of, and prospects for, quantum computing – not too technical, wasting little space on the basics everyone already knows (superposition, entanglement, decoherence) but avoiding the hieroglyphics of quantum algorithms. At barely 6,000 words it’s very short, and that’s because it’s surprisingly honest about the fact that quantum computers barely exist today and that their prospects remain rather dim.

In chapter 2 Katwala offers a whirlwind summary of the three major current research directions – laser ion-traps (Amazon/IonQ), cryogenic Josephson junctions (Google and IBM), and ‘topological qubits’ (Microsoft) – which he explains in a clear, readable style. As he goes he continually points to their weaknesses: ion-traps need too many lasers to be scalable; cooling to 0.01 K requires monstrous cryostats that consume lots of energy; and Microsoft’s trick for dodging decoherence would depend on the existence of a still-undiscovered fundamental particle! (That’s rather hard on Microsoft, since a recently discovered cerium/ruthenium/tin ‘Weyl-Kondo’ semi-metal alloy might render this phantom particle unnecessary.)

Katwala is admirably candid about the problem that all three strands of research share: environmental noise causes qubits to disentangle rapidly, leaving barely microseconds in which to perform useful computation. This decoherence also renders the results unreliable, so an enormous degree of error-correction is required: each working qubit must be surrounded by dozens of error-correction qubits (and whenever an error is detected these error qubits must themselves be error-corrected). Google’s Sycamore chip, which is claimed to have achieved ‘quantum supremacy’, contains just 53 physical qubits – at that sort of overhead, barely enough for a single fully-corrected working qubit. This is why the physicist John Preskill, one of the leading researchers, dubbed this the NISQ (noisy intermediate-scale quantum) era, in which quantum computers exist but aren’t yet robust enough to fulfil their promise. Harsher critics suspect that the Second Law of Thermodynamics is gnawing away at those fragile quantum states, and that quantum computing may be inherently unfeasible.
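
To get a feel for the scale of that overhead, here is a back-of-envelope Python sketch of my own (not from the book); the per-qubit figure is purely illustrative, standing in for Katwala’s ‘dozens’:

    # Back-of-envelope error-correction arithmetic. The overhead figure is an
    # assumption standing in for 'dozens of error-correction qubits per
    # working qubit'; real schemes vary widely.
    def physical_qubits_needed(logical_qubits, overhead_per_logical=50):
        """Total physical qubits when each working (logical) qubit carries
        overhead_per_logical extra qubits devoted to error correction."""
        return logical_qubits * (1 + overhead_per_logical)

    print(physical_qubits_needed(1))    # 51 - nearly all of Sycamore's 53
    print(physical_qubits_needed(100))  # 5,100 physical qubits for 100 useful ones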

That hardly matters though, because the quantum-computing bandwagon has become unstoppable now that politicians and soldiers are involved. The strong promise of quantum computing – an exponential speed increase over classical computers – threatens to make public-key encryption, as used by the military, the banks and even WhatsApp, crackable. This makes it a matter of national security, unlocking unlimited funding and starting a new Cold War-style arms race between China and the West. However, Katwala is equally candid that this strong promise is itself quite dubious: surprisingly few problem classes have known quantum algorithms that deliver an exponential speed-up. The best quantum algorithms known for commercially important optimisation problems like the Travelling Salesman offer only a quadratic, not exponential, advantage over classical computers. Which isn’t peanuts – reducing a one-million-step calculation to a thousand steps may be the difference between overnight and almost real-time, which City traders would happily pay for – but it will neither satisfy the crypto-crowd nor justify such huge research budgets.
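
The arithmetic behind that quadratic advantage is worth a quick sketch. This minimal Python fragment (again mine, not Katwala’s) shows the step counts for that one-million-step example, assuming a Grover-style square-root speed-up for unstructured search:

    import math

    # Quadratic speed-up in step counts: classical brute-force search over N
    # candidates takes on the order of N steps, while a Grover-style quantum
    # search needs only about sqrt(N) queries. Figures are illustrative.
    N = 1_000_000
    classical_steps = N              # ~1,000,000 steps: the overnight run
    quantum_steps = math.isqrt(N)    # ~1,000 steps: almost real-time
    print(classical_steps, quantum_steps)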

I’d certainly recommend Katwala’s book as a quick read to bring you up to speed with mainstream quantum thinking, but it doesn’t cover any radically different ‘long-shot’ directions. I’m personally convinced that if quantum computing happens it will only be through room-temperature, solid-state technologies that are barely here yet, and I’m also an enthusiast for neuromorphic computing architectures that mimic the nervous systems of animals, using electronic components that might employ hybrid digital-and-analog computations.

Neuromorphic engineering was first pursued by one of my heroes, Carver Mead, in the late 1980s, for designing vision systems, auditory processors and autonomous robots. The convolutional neural networks driving the recent explosion in commercial AI and machine learning are only one aspect of a far wider domain of neuromorphic computing models, and they of course run on classical computing components. Deep-learning networks are becoming a source of concern because of the colossal amount of power they consume when training on enormous data sets, so this is an area where even a ‘mere’ quadratic speed advantage would be very welcome indeed.

US researchers at Purdue have shown how ‘spintronic’ quantum measurements might be used to implement neuromorphic processors, and many groups are investigating spin switching in ‘nitrogen-vacancy’ centres within synthetic diamond lattices – diamond-based qubits might resist decoherence for milliseconds rather than microseconds, and at room temperature. Were I writing a science-fiction screenplay, my future quantum computers would be alternating sandwiched layers (like a Tunnock’s Caramel Wafer) of epitaxially-deposited diamond and twisted graphene, read and written by a flickering laser inside a dynamic, holographic magnetic field.

[Dick Pountain is well aware of the addiction risk posed by Tunnock’s Caramel Wafers]







