Monday 2 July 2012

I SECOND THAT EMOTION

Dick Pountain/Thu 15 July 2004/5:09 pm/Idealog 120

No-one has recently accused me of being a 'touchy-feely' sort of person. It's not that I have anything against touching in the right places, nor am I afraid of feelings - I'm just repelled by the cult of emotional incontinence that seems to be spreading like a plague from a focus in Southern California. When I hear some of the Oscar speeches I sometimes get a lump in my throat, but it's only my breakfast coming up. I cringe when anyone says 'Thank you for sharing that' without irony (which is very seldom in the UK) and was utterly appalled when all the depressive sociopaths in EastEnders started 'being there for' one another.

Leaving my phlegmatic Northern upbringing aside, I think my objection to this cult is that it completely misrepresents the real nature of the emotions. Touchy-feely people seem to imagine emotions as a kind of mental faeces that must be expelled at all costs lest you become constipated, and the regrettable result is all too often emotional diarrhoea (much of modern fiction is mental Syrup of Figs). One truth about the emotions is that they're robustly physical entities, intimately connected with animal survival, and another truth is that they're not the same thing as feelings. This last error is particularly important in a computing context, if we are ever to see computers that possess consciousness.

All this has been brought to a head for me recently after reading the works of Professor Antonio Damasio, a brilliant Portuguese/American neurophysiologist who has come as close as anyone yet to a satisfactory explanation of how consciousness works - and guess what, he thinks that its roots lie in the emotions, rather than in Reason as most modern thinkers, following Descartes, have believed.
       
Damasio's chain of reasoning, brutally compressed, goes something like this. Emotions, which date back to the very earliest stages of evolution, are the motor of all animal behaviour: they provide the mechanism by which a creature reacts to harmful or helpful stimuli in its external environment, and they also alter its internal environment appropriately for the current situation (for example to prime it for flight, fight, feeding or sex). Emotional processes are triggered by inputs from whatever senses the creature possesses, which they convert into chemical signals (hormones, neurotransmitters) that may affect all its organs - you could imagine them as the BIOS of the animal operating system. The notion that you can repress an emotion is almost a nonsense: for the vast majority of people they're quite unstoppable (and in any case begin below the level of consciousness). What you *can* learn to do is to suppress the facial expression of emotional states - the blush of embarrassment, the clench of rage - but the underlying chemical processes are not halted.

Even neurologically simple creatures like molluscs and insects possess emotions, which are a step above reflexes (wholly mechanical reactions, like the jerk of your finger from a hot surface). However in the higher animals the chemical broadcast mechanism of emotion is augmented by a set of specific neural patterns that represent these emotional states within the brain's model of its own body - these are Feelings, such as pain, pleasure, anger, fear, joy. It's Damasio's contention that consciousness arose in evolution from the processing of these feeling signals, in particular once an organism becomes able to *know* that it's feeling a feeling: it's this second-order effect that creates the sense of self.

All of this has enormous implications for the possibility of computer consciousness. Much of the absurd over-optimism of strong artificial intelligence enthusiasts stems from their confusion of Consciousness with Reason. Reasoning involves manipulating symbols according to abstract systems of rules, and human beings are far better at it than other animals thanks to our possession of language. Computers can also be made pretty good at it, and they can do it a lot faster than us. (As Daniel Dennett has pointed out, we in effect reason in a slow, software-only virtual machine running on top of a fast, efficient hardware processor designed for throwing spears.) Consciousness is something else altogether, and as Damasio demonstrates, it stems originally from having a body as well as a mind.

Now computers do in fact have a sort of body - just like animals they need a source of energy (their PSU), and they have finite resources to manage like RAM, disk space and communication channels. They are also beset by external threats like viruses and spam, and internal threats like hardware failure and software bugs. At the moment these all have totally separate mechanisms to handle them, most of which are appallingly primitive (even the humblest amoeba couldn't survive with an operating system that stops everything when it encounters an error). What's needed is to build into the operating system of the future a complete model of the computer's own 'body' and then feed all information from the UPS, processor heat monitor, disk drivers, AV software and so on, into this model: you could then give it 'emotions' such as 'hunger' for RAM or disk space, 'fear' of virus invasion, 'anger' at spam, that trigger processes to correct these errant conditions. It's odd to think that real-time controllers fitted with watchdogs, big commercial transaction monitors and non-stop/hot-swap servers are actually somewhat closer to true consciousness than the most advanced AI reasoning systems are. I think I'd want to draw a line somewhere though: I wouldn't want to create computers that can get as annoyed with us as we do with them...
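For the curious, the scheme above can be sketched in a few lines of Python. Everything here is hypothetical illustration rather than anything from the column: the sensor names, thresholds and corrective actions are invented, and the 'body model' is just a record of made-up readings mapped onto graded emotion-like signals.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class BodyModel:
    # Invented sensor readings for the machine's 'body' (fractions in 0..1).
    free_ram: float = 1.0      # fraction of RAM still free
    free_disk: float = 1.0     # fraction of disk still free
    virus_alerts: int = 0      # alerts raised by AV software
    spam_rate: float = 0.0     # fraction of incoming mail that is spam

def emotions(body: BodyModel) -> Dict[str, float]:
    """Map raw sensor state onto graded, emotion-like signals (0..1)."""
    return {
        "hunger_ram":  max(0.0, 0.2 - body.free_ram) / 0.2,   # RAM nearly exhausted
        "hunger_disk": max(0.0, 0.1 - body.free_disk) / 0.1,  # disk nearly full
        "fear":        min(1.0, body.virus_alerts / 5.0),     # virus invasion
        "anger":       min(1.0, body.spam_rate),              # spam flood
    }

# Corrective processes triggered by each emotion (stubs returning a label).
HANDLERS: Dict[str, Callable[[], str]] = {
    "hunger_ram":  lambda: "swap out idle processes",
    "hunger_disk": lambda: "purge temporary files",
    "fear":        lambda: "quarantine and deep-scan",
    "anger":       lambda: "tighten spam filters",
}

def react(body: BodyModel, threshold: float = 0.5) -> List[str]:
    """Run a handler for each emotion above threshold - correct and carry on,
    rather than stopping everything at the first error."""
    return [HANDLERS[name]()
            for name, level in emotions(body).items()
            if level >= threshold]
```

The point of the sketch is the shape, not the numbers: one unified body model, graded signals rather than binary alarms, and corrective behaviour instead of a halt. For example, `react(BodyModel(free_ram=0.05, virus_alerts=4))` reports both a RAM 'hunger' action and a 'fear' response.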

