Wednesday 8 January 2020

TIME SERVED

Dick Pountain/ Idealog 300/ 5th July 2019 07:51:08

My 300th column is a milestone that deserves a meaty subject, and luckily the recent CogX 2019 festival provides a perfect one. I reckoned I’d served my time in the purgatory that is the computer show: for 15 years I spent whole weeks at CeBIT in Hannover, checking out new tech for Byte magazine; I’ve done Comdex in Vegas (stayed in Bugsy Siegel’s horrendous Flamingo where you walked through a mile of slots to get to your room); I’ve flown steerage to Taipei to help judge product awards at Computex. I thought I was done, but CogX had two irresistible attractions: it brought together the world’s top AI researchers for three days of serious talk, and it was in King’s Cross, a short walk down the canal from my house, in what’s now called ‘The Knowledge Quarter’ - British Library, Francis Crick Institute, Arts University (Central and St Martins rolled into one brand-new campus), YouTube HQ, with Google’s still under construction.

CogX was the first proper show to be held in the futuristic Coal Drops Yard development and it was big – 500 speakers on 12 stages, some in large geodesic tents. It was also silly expensive at £2,000 for three-days-all-events or £575 per day (naturally I had a press pass). Rain poured down on the first day, causing one wag to call it ‘Glastonbury for nerds’, but it was packed, with standing room only at every talk I attended. A strikingly young, trendy and diverse crowd, most of whose tickets, I imagine, were paid for by red-hot Old Street startups. Smart young people who don’t aspire to be DJs or film stars now seem to aim at AI instead. This made me feel like an historic relic, doffing my cap on the way out, but in a nice way.

Perhaps you suspect, as I did beforehand, that the content would be all hype and bullshit, but you’d be very wrong. It was tough to choose a dozen talks from the 500: I went mostly for the highly techy or political ones, skipping the marketing and entrepreneurial sessions, and the standard of talks, panel discussions and organisation was very impressive. This wasn’t a conference about how Machine Learning (ML) or deep learning work; that’s now sorted. These folk have their supercomputers and ML tools that work, and the questions are what they’re doing with them, whether they should be doing it, and who’s going to tell them.

David Ferrucci (formerly IBM Watson, now Elemental Cognition) works on natural language processing and on making ML decisions more transparent, using a strategy that combines deep learning and database search with interactive tuition. Two women buy mint plants: one puts hers on her windowsill where it thrives, the other in a dark room where it doesn’t. His system threw out guesses and questions until it understood that plants need light, and that light comes through windows. Second story: two people buy wine; one stores it in the fridge, the other on the windowsill, where it spoils. More guesses, more questions, but this time his system remembered what it had learned from the mint story and deduced that light is good for plants but bad for wine. To make machines really smart, teach them like kids.
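
That is, of course, a gross simplification of whatever Elemental Cognition’s system actually does, but a toy sketch of the shape of that guess-ask-remember loop, with invented rules and canned tutor answers, might look something like this:

```python
# Toy illustration only: not Elemental Cognition's system, just the shape of the
# "guess, ask, remember" loop described above. All rules and questions are invented.

learned_rules = set()   # facts that carry over from one story to the next

def tutor_answer(question):
    """Stand-in for the human tutor: a canned lookup for this demo."""
    return {
        "do plants need light?": True,
        "does light come through windows?": True,
        "is light good for wine?": False,
    }[question]

def work_through(story, questions):
    """Ask the tutor about anything not already learned, and remember the answers."""
    print("Story:", story)
    for q in questions:
        if q in learned_rules:
            print("  already know:", q)    # knowledge carried over from an earlier story
            continue
        answer = tutor_answer(q)
        print("  asked:", q, "->", answer)
        if answer:
            learned_rules.add(q)

work_through("mint thrives on the windowsill, dies in the dark room",
             ["do plants need light?", "does light come through windows?"])
work_through("wine spoils on the windowsill, keeps in the fridge",
             ["do plants need light?", "is light good for wine?"])
```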

Professor Maja Pantić (Affective and Behavioural Computing at Imperial College, and head of the Samsung AI lab in Cambridge) told a nice story about autism and a nasty one about supermarkets. Her group has found that autistic children lack the ability to integrate human facial signals (mouth, eyes, voice and so on), and so become overwhelmed and terrified; an AI robotic face can separate out these signals into a format the child can cope with. Supermarkets, on the other hand, can now use facial-recognition software to divine the emotional state of shoppers and change the prices charged to them on the fly. Deep creepiness.

Eric Beinhocker (Oxford Martin School) and César Hidalgo (MIT Media Lab) gave mind-boggling presentations on the way AI is now used to build colossal arrays of virtual environments – rat mazes, economic simulations, war games – on which to train other ML systems, drastically reducing training times and improving accuracy. Stephen Hsu (Michigan State) described how ML is now learning from hundreds of thousands of actual human genomes to identify disease mutations with great accuracy, while Helen O’Neill (Reproductive and Molecular Genetics, UCL) said that combining this with CRISPR will permit choosing not merely the gender but many other traits of unborn babies, maybe within five years. A theme that emerged everywhere was: ‘we are already doing unprecedented things that are morally ambiguous, even God-like, but the law and the politicians haven’t a clue. Please regulate us, please tell us what you want done with this stuff’. But CogX contained way too much for one column; more next month about extraordinary AI hardware developments and what it’s all going to mean.

Link to YouTube videos of my choices: https://www.youtube.com/playlist?list=PLL4ypMaasjt-_K14PNE6YQRQIlmBhHoZE

[Dick Pountain found the food in King’s Cross, especially the coffee, more interesting than Hannover Messe or Vegas, somewhat less so than snake wine in Taipei ]


ENERGY DRINKERS

Dick Pountain/ Idealog 299/ 7th June 2019 09:52:38

If you enjoy programming as much as I do, you're likely to have encountered the scenario I'm about to describe. You're tackling a really knotty problem that involves using novel data types and frequent forays into the manuals: you eventually crack it, then glance at the clock and realise that hours have passed in total concentration, oblivious to all other influences. You feel suddenly thirsty, rise to make a cup of tea and feel slightly weak at the knees, as if you'd just run a mile. That's because using your brain so intensively is, energetically, every bit as demanding as running: it uses up your blood glucose at a ferocious rate. Our brains burn glucose faster than any of our other organs, up to 20% of our total consumption rate.
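
A rough back-of-envelope sum (my arithmetic, assuming a typical intake of around 2,000 kcal a day) shows what that 20% means in watts:

```latex
% Back-of-envelope: what ~20% of daily food energy means in watts.
% Assumes a typical ~2,000 kcal/day intake; figures are illustrative, not measured.
\begin{align*}
2000\ \text{kcal/day} &\approx 2000 \times 4184\ \text{J} \approx 8.4\ \text{MJ/day} \\
8.4\ \text{MJ} \,/\, 86\,400\ \text{s} &\approx 97\ \text{W (whole-body average)} \\
0.20 \times 97\ \text{W} &\approx 20\ \text{W for the brain alone}
\end{align*}
```

Twenty-odd watts, running continuously: dim light-bulb territory, but an expensive habit to keep fed.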

How come? The brain contains no muscles, doesn't do any heavy lifting or moving: all it does is shift around images, ideas and intentions which surely must be weightless? Not so. I'll admit that a constant refrain of mine in this column has been that the brain isn't a digital computer in the way the more naive AI enthusiasts believe, but that's not to say that it therefore isn't a computing device of a very different (and not fully understood) kind - it most certainly is, and computing consumes energy. Even though a bit would appear to weigh nothing, the switching of silicon gates or salty-water neurons requires energy to perform and is less than 100% efficient. Rather a lot of energy actually if a lot of bits or neurons are involved, and their very tiny nature makes it easy to assemble very, very large numbers of them in a smallish space.

This was brought home to me yesterday in a most dramatic way via the MIT Technology Review (https://www.technologyreview.com/s/613630/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes), which describes recent research into the energy consumption of state-of-the-art Natural Language Processing (NLP) systems of the sort deployed online and behind gadgets like Alexa. Training a single really large deep-learning system consumes colossal amounts of energy, generating up to five times the CO2 emitted by a car (including the fuel it burns) over its whole lifetime. How could that be possible? The answer is that it doesn't all happen on the same computer, which would be vaporised in a millisecond. It happens in The Cloud, distributed across the world in massively parallel virtual machine arrays working on truly humongous databases, over and over again as the system tweaks and optimises itself.
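
To get a feel for how the numbers stack up, here's a rough sketch in which every figure (GPU count, power draw, training time, grid carbon intensity) is an illustrative assumption of mine, not a number from that research:

```python
# Back-of-envelope only: every figure below is an illustrative assumption,
# not a number taken from the research behind the MIT Technology Review piece.

gpus             = 512     # accelerators running in parallel across the cloud
watts_per_gpu    = 300     # rough power draw per accelerator, in watts
days_of_training = 30      # including the repeated tuning and optimisation runs
kg_co2_per_kwh   = 0.5     # rough grid carbon intensity, kg of CO2 per kWh

hours  = days_of_training * 24
kwh    = gpus * watts_per_gpu * hours / 1000      # energy used, in kilowatt-hours
tonnes = kwh * kg_co2_per_kwh / 1000              # emissions, in tonnes of CO2

print(f"{kwh:,.0f} kWh -> roughly {tonnes:,.0f} tonnes of CO2")
# With these made-up inputs: about 110,000 kWh and 55 tonnes of CO2, which is
# already the same order of magnitude as a car's lifetime emissions.
```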

We've already had a small glimpse of this fact through the mining of Bitcoin, where the outfits that profit from this weirdly pathological activity have to balance the millions they mine against equally enormous electricity bills, increasingly resorting to basing their servers in the Arctic or sinking them into lakes to water-cool them. Yes indeed, computing can consume a lot of energy when you have to do a lot of it, and the deceptive lightness of a graphical display hides this fact from us: even real-time shoot-'em-up games nowadays demand multiple supercomputer-grade GPUs.

It was a hero of mine, Carver Mead, who first made me think about the energetics of computing in his seminal 1980 book 'Introduction to VLSI Systems' (co-written with Lynn Conway). Chapter 9, on the 'Physics of Computational Systems', not only explains, in thermodynamic terms, how logic gates operate as heat engines, but also employs the 2nd Law to uncover the constraints on energy consumption for any conceivable future computing technology. In particular he demolished the hope, held by some quantum computing enthusiasts, for 'reversible computation' that recovers the energy used: he expected it would consume more still.
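
Mead's argument is thermodynamic, and the standard reference point for scale here is Landauer's limit (a textbook figure, not Mead's own number): the minimum energy needed to erase one bit of information at temperature T is

```latex
% Landauer's limit: thermodynamic floor on the energy cost of erasing one bit.
E_{\min} = k_B T \ln 2
         \approx 1.38\times10^{-23}\ \tfrac{\text{J}}{\text{K}} \times 300\ \text{K} \times 0.693
         \approx 2.9\times10^{-21}\ \text{J per bit}
```

Real silicon gates dissipate many orders of magnitude more than that per switching event, which is why a warehouse full of them needs industrial cooling.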

The slice of total energy usage that goes into running the brains of the nearly eight billion of us on the planet is way less than is used for transport or heating, thanks to billions of years of biological evolution that forged our brain into a remarkably compact computing device. That evolution changed our whole anatomy, from the bulging brain-case to the wide female pelvis needed to deliver it, and it also drove us to agriculture - extracting glucose fast enough to run it reliably forced us to invent cooking and to domesticate grass seeds.

Now AI devices are becoming important to our economies, and the Alexa on your table makes that feel effortless, but vast networks of energy-guzzling servers lie behind that little tube. Silicon technology just can't squeeze such power into the space our fat-and-protein brains occupy. Alongside the introduction of the electric car, we're about to learn some unpleasant lessons about the limits of our energy-generation infrastructure.

[Dick Pountain's favourite energy drink is currently Meantime London Pale Ale]


SOCIAL UNEASE

Dick Pountain /Idealog 350/ 07 Sep 2023 10:58

Ten years ago this column might have listed a handful of online apps that assist my everyday...