Wednesday 8 January 2020

ENERGY DRINKERS

Dick Pountain/ Idealog 299/ 7th June 2019 09:52:38

If you enjoy programming as much as I do, you're likely to have encountered the scenario I'm about to describe. You're tackling a really knotty problem that involves using novel data types and frequent forays into the manuals: you eventually crack it, then glance at the clock and realise that hours have passed in total concentration, oblivious to all other influences. You feel suddenly thirsty, rise to make a cup of tea and feel slightly weak at the knees, as if you'd just run a mile. That's because using your brain so intensively is, energetically, every bit as demanding as running: it uses up your blood glucose at a ferocious rate. Our brains burn glucose faster than any of our other organs, up to 20% of our total consumption rate.
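That 20% figure is easy to sanity-check with round numbers. The figures below are my own back-of-envelope assumptions (a typical 2,000 kcal/day total consumption, and the widely quoted ~20 watt estimate for the brain), not from the column:

```python
# Rough check of the brain's share of total energy consumption.
# Assumed round figures: 2,000 kcal/day for the whole body, ~20 W
# for the brain - both commonly quoted estimates, not measurements.
KCAL_TO_JOULES = 4184
SECONDS_PER_DAY = 24 * 60 * 60

kcal_per_day = 2000                  # assumed typical daily intake
body_watts = kcal_per_day * KCAL_TO_JOULES / SECONDS_PER_DAY
brain_watts = 20                     # widely quoted estimate

print(f"whole body ~ {body_watts:.0f} W")
print(f"brain share ~ {brain_watts / body_watts:.0%}")
```

Around 97 watts for the body as a whole, so a steady 20 watt brain comes out at roughly a fifth of the total.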

How come? The brain contains no muscles and does no heavy lifting or moving: all it does is shift around images, ideas and intentions, which surely must be weightless? Not so. I'll admit that a constant refrain of mine in this column has been that the brain isn't a digital computer in the way the more naive AI enthusiasts believe, but that's not to say it therefore isn't a computing device of a very different (and not fully understood) kind - it most certainly is, and computing consumes energy. A bit may appear to weigh nothing, but switching a silicon gate or a salty-water neuron takes energy, and the process is less than 100% efficient. Rather a lot of energy, in fact, when many bits or neurons are involved - and their tiny size makes it easy to assemble very, very large numbers of them in a smallish space.

This was brought home to me yesterday in a most dramatic way by the MIT Technology Review (https://www.technologyreview.com/s/613630/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes), which describes recent research into the energy consumption of state-of-the-art Natural Language Processing (NLP) systems, of the sort deployed online and behind gadgets like Alexa. Training a single really large deep-learning system consumes colossal amounts of energy, generating up to five times the CO2 a car emits (fuel included) over its whole lifetime. How could that be possible? The answer is that it doesn't all happen on one computer, which would be vaporised in a millisecond: it happens in The Cloud, distributed across the world in massively-parallel arrays of virtual machines working on truly humongous databases, over and over again as the system tweaks and optimises itself.
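The headline number stands up to simple division. These figures come from the Strubell et al. (2019) study that the Technology Review piece reports, in pounds of CO2-equivalent; they are the study's estimates, not mine:

```python
# Figures reported by Strubell et al. (2019), the study behind the
# MIT Technology Review article; both in pounds of CO2-equivalent.
nas_transformer_training = 626_155   # big Transformer trained with neural architecture search
car_lifetime_incl_fuel = 126_000     # average US car, manufacture plus fuel

ratio = nas_transformer_training / car_lifetime_incl_fuel
print(f"~ {ratio:.1f} car lifetimes of CO2")
```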

We've already had a small glimpse of this fact through the mining of Bitcoin, where the outfits that profit from this weirdly pathological activity must balance the millions they mine against equally enormous electricity bills, and increasingly resort to siting their servers in the Arctic or sinking them into lakes to water-cool them. Yes indeed, computing can consume a lot of energy when you have to do a lot of it, and the deceptive lightness of a graphical display hides this fact from us: even realistic shoot-em-up games nowadays demand multiple supercomputer-grade GPUs.

It was a hero of mine, Carver Mead, who first made me think about the energetics of computing in the seminal 1980 book 'Introduction to VLSI Systems', which he co-wrote with Lynn Conway. Chapter 9, on the 'Physics of Computational Systems', not only explains, in thermodynamic terms, how logic gates operate as heat engines, but also employs the Second Law to uncover the constraints on consumption for any conceivable future computing technology. In particular he demolished the hope of some quantum-computing enthusiasts for 'reversible computation' that would recover the energy used: he expected it would consume more still.
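The thermodynamic floor Mead was working from can be illustrated with Landauer's well-known bound (a standard result, not a figure taken from the book): erasing one bit at temperature T must dissipate at least kT·ln 2, and real silicon gates run orders of magnitude above it. The femtojoule figure for a CMOS switch below is an assumed round number:

```python
import math

BOLTZMANN = 1.380649e-23        # J/K, exact SI value
T = 300                         # room temperature in kelvin (assumed)

# Landauer's bound: minimum energy dissipated to erase one bit.
landauer_joules = BOLTZMANN * T * math.log(2)
print(f"Landauer limit ~ {landauer_joules:.2e} J per bit")

# A modern CMOS gate switch costs on the order of a femtojoule
# (assumed round figure), i.e. hundreds of thousands of times
# the thermodynamic minimum.
cmos_switch_joules = 1e-15
print(f"CMOS overhead ~ {cmos_switch_joules / landauer_joules:.0e}x")
```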

The slice of total energy usage that goes into running the brains of the almost eight billion of us on the planet is far less than is used for transport or heating, thanks to billions of years of biological evolution that forged our brain into a remarkably compact computing device. That evolution changed our whole anatomy, from the bulging brain-case to the wide female pelvis needed to deliver it, and it also drove us to agriculture - extracting glucose fast enough to run it reliably forced us to invent cooking and to domesticate grass seeds.

Now AI devices are becoming important to our economies, and the Alexa on your table makes it all feel effortless, but vast networks of energy-guzzling servers lie behind that little tube. Silicon technology simply can't squeeze such power into the space our fat-and-protein brains occupy. Alongside the introduction of the electric car, we're about to learn some unpleasant lessons about the limits of our energy-generation infrastructure.

[Dick Pountain's favourite energy drink is currently Meantime London Pale Ale]

