Tuesday 14 April 2020

I SECOND THAT EMOTION

Dick Pountain/ Idealog 302/ 2nd September 2019 10:24:10

Regular readers might have been surprised when I devoted my previous two columns to AI, a topic about which I’ve often expressed scepticism here. There’s no reason to be, because it’s quite possible to be impressed by the latest advances in AI hardware and software while remaining a total sceptic about the field’s more hubristic claims. And a book that reinforces my scepticism has arrived at precisely the right time, ‘The Strange Order of Things’ by Antonio Damasio. He’s a Portuguese-American professor of neuroscience who made his name researching the neurology of the emotions, and the critical role they play in high-level cognition (contrary to current orthodoxy).

Damasio also has a deep interest in the philosophy of consciousness, to which this book is a remarkable contribution. He admires the 17th-century Dutch philosopher Baruch Spinoza, who among his many other contributions made a crucial observation about biology – that the essence of all living creatures is to create a boundary between themselves and the outside world, within which they try their hardest to maintain a steady state, the failure of which results in death. Spinoza called this urge to persevere ‘conatus’, but Damasio prefers the more modern name ‘homeostasis’.

The first living creatures – whose precise details we don’t, and may never, know – must have emerged by enclosing a tiny volume of the surrounding sea water within a membrane of protein or lipid, then controlling the osmotic pressure inside to prevent shrivelling or bursting. They also became able to split this container in two to reproduce themselves. Whether one calls this conatus or homeostasis, it is the original source of all value. When you’re trying to preserve your existence by maintaining your internal environment, it becomes necessary to perceive threats and benefits to it and to act upon them, so mechanisms that distinguish ‘good’ from ‘bad’ are hard-wired not merely into the first single-celled creatures, but into every cell of the multicellular creatures – ourselves included – that evolved from them. The simplest bacteria possess ‘senses’ that cause them to swim toward food or away from harmful chemicals, or to and from light, or whatever.
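To see how little machinery ‘good versus bad’ actually needs, here’s a toy sketch (mine, not Damasio’s) of the run-and-tumble strategy that bacteria like E. coli use, squashed into a one-dimensional world: keep your heading while life improves, tumble to a random new heading when it doesn’t.

```python
import random

def chemotaxis(position, food_at, steps=50):
    """Toy run-and-tumble search: keep the current heading while the
    food gets nearer, tumble to a random heading when it gets further.
    A built-in good/bad evaluation, and nothing more."""
    heading = random.choice([-1, 1])
    last_distance = abs(food_at - position)
    for _ in range(steps):
        position += heading
        distance = abs(food_at - position)
        if distance > last_distance:      # things got 'worse': tumble
            heading = random.choice([-1, 1])
        last_distance = distance
    return position

print(chemotaxis(position=0, food_at=20))  # usually ends at or near 20
```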

Damasio’s work traces the way this evaluation mechanism persisted in every individual cell as multicellular creatures evolved, and how, as more complex body forms developed separate organs, then nervous systems, then brains, the evaluation became expressed in ever higher-level compound systems. In our case it’s the duty of the limbic system within our brain, which controls what we call ‘emotions’ using many parallel networks of electrical nerve impulses and chemical hormone signals. And Damasio believes that our advanced abilities to remember past events, to predict future events, and to describe things and events through language are intimately connected to this evaluatory system. There’s no such thing as a neutral memory or word: each is always tagged with some emotional connotation, whether or not that ever gets expressed in consciousness.

And so at last I arrive at what this has to do with AI, and my scepticism towards its strongest claims. Modern AI is getting extraordinarily good at emulating, even exceeding, our own abilities to remember, to predict and describe, to listen and to speak, to recognise patterns and more. All these are functions of higher consciousness, but that isn’t the same thing as intelligence. AI systems don’t and can’t have any idea what they are for, whereas in our body every individual cell knows what it’s for, what it needs (usually glucose), and when to quit. You have five fingers because, while you were in your mother’s womb, the cells in between them killed themselves (look up ‘apoptosis’) for the sake of your hand.

It’s not so much a question of whether robots could ever feel – which obsesses sci-fi authors – as of whether any AI machine could ever truly reproduce itself. Every cell of a living creature contains its own batteries (the mitochondria), its own blueprint (the DNA) and its own constructors (the ribosomes): it’s the ultimate distributed self-replicating machine, honed by 3.5 billion years of evolution. When you look at an AI processor chip under the microscope it may superficially resemble the cellular structure of a plant or animal, with millions of transistors for cells, but those ‘cells’ don’t contain their own power sources, nor the blueprints to make themselves, and they don’t ‘know’ what they’re for. The chip requires an external wafer fab the size of an aircraft hangar to make it; the robot’s arm requires an external 3D printer. As Damasio shows so brilliantly, the homeostatic drive of cells permeates our entire bodies and minds, and the emotional forces that rationalists reject as ‘irrational’ are just our body trying to look after itself.

INTELLIGENCE ON A CHIP

Dick Pountain/ Idealog 301/ 4th August 2019 12:59:28

I used to write a lot about interesting hardware back in the 1980s. I won a prize for my explanation of HP’s PA-RISC pipeline, and a bottle of fine Scotch from Sir Robin Saxby for a piece on ARM’s object-oriented memory manager. Then along came the i486, the CPU world settled into a rut (or groove), and Byte closed. I never could summon the enthusiasm to follow Intel’s cache and multi-core shenanigans.

Last month’s column was about the CogX AI festival in King’s Cross, but was confined to software matters: this month I’m talking hardware, which to me looks as exciting as it did in that ‘80s RISC revolution. Various hardware vendors explained the AI problems their new designs are intended to solve, which are often about ‘edge computing’. The practical AI revolution that’s going on all around us is less about robots and self-driving cars than about phone apps.

You’ll have noticed how stunningly effective Google Translate has become nowadays, and you may also have Alexa (or one of its rivals) on your table to select your breakfast playlist. Google Translate performs the stupendous amounts of machine learning and computation it needs on remote servers, accessed via the cloud. On the other hand, stuff like voice or face recognition on your phone is done by the local chipset. Neither running Google Translate on your phone nor running face recognition in the cloud makes any sense, because bandwidth isn’t free and latency is more crucial in some applications than in others. The problem domain has split into two – the centre and the edge of the network.
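A back-of-envelope sum shows why the split falls where it does. The figures in this sketch are illustrative assumptions of mine, not measurements:

```python
# Rough cloud-versus-edge latency comparison; all numbers are assumed.

def cloud_latency_ms(payload_kb, uplink_mbps=5.0, rtt_ms=60.0, server_ms=15.0):
    """Ship the input to a remote server, run the model there, get a reply."""
    transfer_ms = payload_kb * 8 / (uplink_mbps * 1000) * 1000
    return transfer_ms + rtt_ms + server_ms

def edge_latency_ms(device_ms=40.0):
    """Run a smaller, less accurate model on the phone's local chipset."""
    return device_ms

# A sentence for translation is a tiny payload: the cloud's bigger model wins.
print(f"1 KB sentence via cloud: {cloud_latency_ms(1):.0f} ms")
# A camera frame for face unlock is bulky and latency-critical: the edge wins.
print(f"200 KB frame via cloud:  {cloud_latency_ms(200):.0f} ms")
print(f"200 KB frame on device:  {edge_latency_ms():.0f} ms")
```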

Deep learning is mostly done using convolutional neural networks, which require masses of small arithmetic operations to be performed very fast in parallel. GPU chips are currently favoured for this job since, although originally designed for mashing pixels, they’re closer to what’s needed than conventional CPUs. Computational loads for centre and edge AI apps are similar in kind but differ enormously in data volumes and power availability, so it makes sense to design different processors for the two domains. While IBM, Intel and the other big boys are working to this end, I heard two smaller outfits presenting innovatory solutions at CogX.
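To see what that workload looks like, here’s a naïve 2-D convolution in Python (a bare illustration, nothing any vendor ships): every output pixel is a small independent dot product, exactly the embarrassingly parallel multiply-accumulate grind these chips are built for.

```python
import numpy as np

def conv2d_naive(image, kernel):
    """Direct 2-D convolution, 'valid' padding, as deep-learning layers
    use it: each output pixel is an independent multiply-accumulate
    over one small window, so all of them can be computed in parallel."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.random.rand(28, 28)
kernel = np.array([[1., 0., -1.]] * 3)    # crude vertical-edge detector
print(conv2d_naive(image, kernel).shape)  # (26, 26): 676 little dot products
```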

Nigel Toon, CEO of Cambridge-based Graphcore, aims at the centre with the IPU-Pod, a 42U rack-mounted system that delivers 16 PetaFLOPS of mixed-precision convolving power. It holds 128 of Graphcore’s Colossus GC2 IPUs (Intelligence Processing Units), each delivering 125 TFLOPS and running up to 30,000 independent program threads in parallel in 1.2GB of on-chip memory. This is precisely what’s needed for huge knowledge models, keeping the data as close as possible to the compute power to reduce bus latency – sheer crunch rather than low power is the goal.

Mike Henry, CEO of US outfit Mythic, was instead focussed on edge processors, where low power consumption is absolutely crucial. Mythic has adopted the most radical solution imaginable, analog computing: their IPU chip contains a large analog compute array, local SRAM to stream data between the network’s nodes, and a single-instruction, multiple-data (SIMD) unit for operations the analog array can’t handle. Analog processing is less precise than digital – which is why the computer revolution was digital – but for certain applications that doesn’t matter. Mythic’s IPU memory cells are tunable resistors in which computation happens in place, as input voltage turns into output current according to Ohm’s Law, in effect multiplying an input vector by a weight matrix. Chip size and power consumption are greatly reduced by keeping data and compute in the same place, wasting less energy and real estate on A-to-D conversion. (This architecture may be more like the way biological nerves work, too.)
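The trick is easier to see in a digital simulation. In this sketch (my illustration with made-up conductance values, not Mythic’s real cell parameters) the weight matrix lives in the resistors, Ohm’s Law does the multiplications and Kirchhoff’s current law does the additions:

```python
import numpy as np

# Each crossing of a row wire and a column wire holds a tunable resistor
# whose conductance G encodes one weight. Drive the rows with voltages V
# and Ohm's Law (I = G * V) performs every multiply in place; each column
# wire sums its cells' currents, completing one dot product per column.
V = np.array([0.3, 0.8, 0.1])              # input voltages, one per row
G = np.array([[1.0e-6, 2.0e-6],            # conductances in siemens,
              [4.0e-6, 0.5e-6],            #   3 rows x 2 columns
              [3.0e-6, 1.5e-6]])
I = V @ G                                   # column currents, in amps
print(I)                                    # [3.80e-06 1.15e-06]
```

One pass through the array is a whole matrix-vector multiply, with no clocked shuffling of data between memory and ALU – which is where conventional chips spend much of their energy.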

To use the extra intelligence these chips promise effectively, we’ll need more transparent user interfaces, and the most exciting presentation I saw at CogX was ‘Neural Interfaces and the Future of Control’ by Thomas Reardon (who once developed Internet Explorer for Microsoft, but has since redeemed himself). Reardon is a pragmatist who understands that requiring cranial surgery to interface to a computer is a bit of a turn-off (we need it like, er, a hole in the head) and his firm Ctrl-Labs has found a less painful way.

Whenever you perform an action your brain sends signals via motor neurons to the requisite muscles, but when you merely think about that action your brain rehearses it by sending both the command and an inhibitory signal. Ctrl-Labs uses a non-invasive electromyographic wristband that captures these motor neuron signals from the outside and feeds them to a deep-learning network – their software then lets you control a computer by merely thinking the mouse or touchscreen actions without actually lifting a finger. Couch-potato-dom just moved up a level.
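Ctrl-Labs hasn’t published its pipeline, and the real thing is a deep network, but the shape of the problem is easy to sketch: summarise each wristband channel over a short window, then classify. The toy below matches made-up three-channel ‘signatures’ by nearest neighbour, purely to illustrate the idea:

```python
import numpy as np

def decode_emg(window, templates):
    """Toy decoder: rectify-and-average each channel of a window of
    wristband samples, then pick the nearest stored gesture template.
    (Ctrl-Labs' actual decoder is a deep-learning network.)"""
    features = np.mean(np.abs(window), axis=0)  # mean absolute value per channel
    return min(templates, key=lambda g: np.linalg.norm(features - templates[g]))

templates = {"click":  np.array([0.8, 0.1, 0.1]),   # invented signatures
             "scroll": np.array([0.1, 0.7, 0.3])}
rng = np.random.default_rng(0)
window = rng.normal(0, 1, (200, 3)) * [1.0, 0.15, 0.15]  # channel 1 'firing'
print(decode_emg(window, templates))                      # -> click
```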
