Wednesday 1 July 2015

THE EMOTION GAME

Dick Pountain/Idealog 245/05 December 2014 11:02

In Viewpoints last month Nicole Kobie fairly skewered ("Good at PCs? It doesn't mean you're bad with people") Hollywood's sloppy assumption that Alan Turing must have been autistic because he was a mathematical genius who didn't like girls. I almost didn't go to see "The Imitation Game" for a different reason - the sensational trailer that seemed to be trying to recruit Turing into the James Bond franchise - but I forced myself and was pleasantly surprised that, although it took some liberties with the facts, it did grippingly convey the significance of Bletchley Park to the war effort. The movie's major "economy with the truth" lay in excluding GPO engineer Tommy Flowers, who actually built the kit and wrestled with the wiring looms that the film shows Turing tackling alone. (It also lumped together two generations of hardware, the "Bombes" and Colossus, and barely even attempted to explain Turing's seminal paper on computable numbers, but those I excuse as they'd have hugely slowed the pace.)

The film doesn't mention Asperger Syndrome - just as well, since it was unknown in Turing's lifetime and we now have to call it autistic spectrum disorder anyway - but, as Nicole pointed out, Cumberbatch's depiction of Turing was clearly based on modern notions about the stunting of emotional expression and social interaction that characterises that disorder. The plot depends heavily upon Turing overcoming the dislike his coldness provokes in the other team members, assisted of course by the token emotionally-literate woman played by Keira Knightley, and the tragic ending shows Turing being chemically castrated by injections of female hormone. And that combination of emotions with hormones set me off reading between the lines of The Imitation Game's script for a deeper meaning which the writer may or may not have intended.

The film is named after a test of machine intelligence that Turing invented, in which the machine must try to imitate human conversation well enough to fool another human being, on the assumption that language is the highest attribute of human reason. However, recent research in affective neuroscience has revealed the astonishing extent to which reason and emotion are entangled in the human mind. The weakness of the whole AI project, of which Turing was a pioneer, lies in failing to recognise this, in its continuing attachment to 18th-century notions of rationalism. Those parts of our brain that manipulate language and symbols are far from being in ultimate control, and are more like our mind's display than its CPU. I am, therefore I think, some of the time. US neuroscientist Jaak Panksepp has uncovered a collection of separate emotional operating systems in the brain's limbic system, each employing a different set of neurotransmitters and hormones. These monitor and modulate all our sensory inputs and behaviour, the most familiar examples being sexual arousal (testosterone and others), fight/flight (adrenaline) and maternal bonding (oxytocin), but there are at least four more and counting. What's more, it's now clear that motivation itself is under the control of the dopamine reward system: we can't do *anything* without it, and its failure leads to Parkinsonism and worse. Now add to this the findings of Antonio Damasio, who claims that all our memories get tagged with the emotional state that prevailed at the time they were recorded, and that our reasoning abilities employ these tags as weightings when making decisions.
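
To make that weighting idea concrete, here's a deliberately crude sketch in Python - entirely my own toy illustration, not Damasio's actual model, and every name in it is invented. Each remembered episode carries a signed emotional tag, and the "reasoning" simply prefers whichever option's memories feel best on average:

from dataclasses import dataclass

@dataclass
class Memory:
    option: str      # what was done at the time
    valence: float   # emotional tag: -1.0 (dread) .. +1.0 (delight)

def choose(options, memories):
    """Pick the option whose remembered episodes carry the best emotional weight."""
    def score(option):
        tags = [m.valence for m in memories if m.option == option]
        return sum(tags) / len(tags) if tags else 0.0  # unknown options feel neutral
    return max(options, key=score)

past = [
    Memory("speak up in the meeting", +0.6),
    Memory("speak up in the meeting", -0.2),
    Memory("stay silent", -0.5),
]
print(choose(["speak up in the meeting", "stay silent"], past))  # -> "speak up in the meeting"

Crude as it is, it makes the point: strip out the valence column and the chooser has no basis for preferring anything at all, which is roughly Damasio's observation about patients whose emotional circuitry is damaged yet whose logic remains intact.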

These lines of study suggest two things: firstly, all rationalist AI is doomed to fail because the meaning of human discourse is permeated through and through with emotion (if you think about it, that's why we had to invent computer languages - to exclude such content); and secondly, AI-based robots will never become wholly convincing until they mimic not only our symbolic reasoning system but also our hormonally-based emotional systems. Sci-fi authors have known this forever, hence their invention of biological androids like those in Blade Runner, with real bodies that mean they have something at stake - avoiding death, finding dinner and a mate (a bit like the IT Crowd). Stephen Hawking's recent grim warnings about AI dooming our species should be tempered by these considerations: however "smart" machines get at calculating, manipulating and moving, their actual *goals* still have to be set by humans, and it's those humans we need to worry about.

So, as well as taking a great deal of pleasure from its serious treatment of Turing, I came away from The Imitation Game with two big lessons: machines will never be truly intelligent until they can feel as well as think (which would depend as much on advances in biology as on solid-state physics and software engineering); and it would be nice if the filmmakers were to start planning an "Imitation Game 2: The Tommy Flowers Story".
