Monday, 25 March 2024

WAVES RULE?

Dick Pountain /Idealog 348/ 05 Jul 2023 04:19 

There appears to be a widespread opinion that biology and maths don’t mix. Perhaps it’s behind the current panic over AI, and the depiction of robots as villains in superhero movies (bring back the more sociable Robby?). It was already prevalent in the 1960s at school when I had to choose which A-levels to take: to do chemistry I was told firmly I must take maths and physics but not biology or art, both of which I loved. In those days, and at university, and for long afterwards, the only real connection between maths and biology was via statistics, which was and remains necessary for designing and interpreting experiments and for things like population studies. My interest in biology didn’t go away (I ended up in biochemistry) but it combined with interests in philosophy and computation to lead me down ‘eccentric’ paths. 

I discovered D’Arcy Thompson’s magnificent ‘On Growth And Form’, which showed how geometry is expressed in the shapes of living things. I encountered the Belousov–Zhabotinsky reaction, in which a certain chemical mixture oscillates between different states rather than proceeding smoothly to a final product: the Belgian chemist Ilya Prigogine (no relation) received the 1977 Nobel prize for extending this insight to explain what he called ‘dissipative structures’, which can organise themselves to exhibit complex moving patterns.

Last week in the Imperial College alumni magazine I read an article about professors Robert Endres and Mark Isalan, whose work with synthetic embryos is revisiting the patterns described by Alan Turing in his seminal 1952 paper ‘The Chemical Basis of Morphogenesis’. 

Turing had suggested that biological shapes and structures might arise when two different chemical compounds react together while diffusing through a tissue at different rates, and he proposed a mathematical description using partial differential equations, whereby waves of chemical composition flow through the growing organism, catalysing reactions that create structures and control their placement in space. Though Turing pioneered modern computing, he didn’t yet possess a machine powerful enough to solve these equations, so he worked through them by hand using approximations. Turing was also working before Watson and Crick discovered the structure of DNA, and the revolution in molecular biology and genetics they triggered now lets us understand the mind-boggling chemical systems that support his proposal. 
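If you fancy watching such a pattern emerge on your own machine, here’s a minimal sketch using the Gray-Scott model – a later, simplified relative of Turing’s reaction-diffusion equations, not the system from his paper – with parameter values chosen purely for illustration:

```python
# A minimal sketch of a Turing-style reaction-diffusion simulation, using the
# Gray-Scott model rather than Turing's original equations. The parameter
# values (diffusion, feed and kill rates) are illustrative, not definitive.
import numpy as np

N = 200                      # grid size
Du, Dv = 0.16, 0.08          # diffusion rates for the two chemicals
F, k = 0.060, 0.062          # feed and kill rates (spot-forming regime)

U = np.ones((N, N))
V = np.zeros((N, N))
# Seed a small square of the second chemical in the middle of the sheet
U[N//2-5:N//2+5, N//2-5:N//2+5] = 0.50
V[N//2-5:N//2+5, N//2-5:N//2+5] = 0.25

def laplacian(Z):
    """Five-point Laplacian with wrap-around (periodic) boundaries."""
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

for step in range(10000):
    uvv = U * V * V
    U += Du * laplacian(U) - uvv + F * (1 - U)
    V += Dv * laplacian(V) + uvv - (F + k) * V

# V now holds a spotted/striped 'Turing pattern'; view it with, for example,
# matplotlib's imshow to watch structure appear from a nearly uniform start.
```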

The embryos of multicellular creatures like ourselves start out as sheets of undifferentiated cells, but genes within some of them start to exude proteins called transcription factors which travel across the sheet in synchronised waves, turning genes in distant cells on or off and causing those cells to divide, to die, or to migrate, and to release further waves. This fantastically choreographed cellular ballet must happen very precisely in time and space to sculpt the shape of each particular creature: similar small sets of so-called homeobox genes generate the wings of a fly, the leaves of a tree, the segments of a worm or your arms and legs, via waves of proteins intersecting and interfering at the right places and times. 

If waves intersecting and interfering to make stuff happen sounds familiar, that might be because particle physics, which of course underlies the whole of modern chemistry, is currently based on waves too. The Standard Model, which has so far survived all attempts to surpass it, proposes that the universe consists of nothing but a set of ‘quantum fields’, one for each kind of particle, currently 17. All matter and energy and things and people, everything that happens (except for gravity at the moment, to the great chagrin of physicists) happens via ‘perturbations’ of these fields which spread like waves through space and time. And now for the punchline. Last week NANOGrav (North American Nanohertz Observatory for Gravitational Waves) and various international collaborating observatories released results that suggest there’s a very low frequency background ‘hum’ of gravitational waves permeating the whole universe. This hum might be the result of supermassive black holes merging, or it might be left-over ripples in space-time from shortly after the big bang, and it’s only now become detectable thanks to pulsar timing arrays, which exploit ultra-tiny changes in the arrival times of signals from millisecond pulsars – waves at these nanohertz frequencies are far too slow for the LIGO (Laser Interferometer Gravitational-Wave Observatory) interferometers to catch. 

But already there’s speculation that at some time shortly after the big bang, before there was light, just a soup of newly forming particles, it could have been interference between waves in this all-permeating gravity wave background that caused them to clump together into the first stars and then the first galaxies. There’s an apocryphal story about Bertrand Russell giving an astronomy lecture where a woman in the audience challenged him by saying the earth is actually supported on the back of a giant turtle. When Russell asked what that turtle was standing on, she replied that it’s “turtles all the way down.” Perhaps she was almost right but it’s the waves rather than the turtles… 

 [Dick Pountain would rather waive the rules than rule the waves]

SEARCH ME

Dick Pountain /Idealog 347/ 06 Jun 2023 11:22 

 “I'm writing this column sitting under a magnificent copper beech tree in the West Meadow at the Kenwood Estate in North London, with tall summer grass all around and even a foxglove in the corner of my vision. You might not find this any big deal - after all mobile computers have been around for a long time - but it has taken me personally twenty years to achieve this feat, and I've only achieved it now thanks to my Palm Pilot and a neat little add-on keyboard called the GoType, whose purchase was the best £70 I've spent in a long time.” 

 In actual fact I’m writing this column sitting under a magnificent copper beech tree in The Regent’s Park 24 years later, on a Moto g8 PowerLite phone, and I’ve just pasted that quote from Idealog 59 (1999) in seconds merely by searching for ‘copper beech’. That was possible because I post my old columns, six months in arrears, onto Google’s Blogger, which has an excellent full-text search facility. Of course this search only works on plain text files, not binary or encrypted ones, so even though over the years I’ve written columns in TextPad, MS Word, Libre Writer and now Google Docs, I upload them by cut-and-paste as plain text. 

The searchability of archives bothers me a lot, because I gave up on huge document management systems and NASes long ago in favour of simplicity, and that’s not an easy row to hoe. Recently an old friend, who’s just retired as a professor of robotics, mailed me to ask for a Byte article I wrote about his company back in 1985 and I was keen to oblige. I found the right Byte issue on the Internet Archive using their own search engine, which gave us a PDF of the page, and I located the original copy file on my local machine too. But this experience set me to thinking that I really don’t have an effective full-text search on my Chromebook that can locate topics in the many decades of files I keep locally (I love The Cloud, but I don’t trust it). I was about to write that this wasn’t a problem when I was a Windows user, thanks to its indexing system, until I recalled the appalling problems I used to have with that system consuming vast amounts of CPU time, and occasionally going metastatic and filling my whole hard drive with its index file. 

Anyway, I checked out grep-like apps in the Android Play Store only to discover they were all indescribably useless – several didn’t work at all, all were full of dodgy ads, and almost all failed to access external USB drives. I got so fed up installing and uninstalling these monstrosities that I formed a crazy idea that I needed to write my own. I entertained that idea for several minutes, before remembering that I’m working with Android running on Linux within a ChromeOS cocoon, and that I’ve always hated I/O programming at the best of times. So I had an even crazier idea: I would ask my new friend ChatGPT to write one for me. I’ll admit that I considered this more as an experiment (and possible column topic!) than a serious strategy. So off I went:

Me: “write python 3 program to find address of external SD card on Asus C301S Chromebook”

ChatGPT: “On Chromebooks, external storage devices like SD cards are mounted under the /media directory. You can use the os module in Python to list the contents of the /media directory and find the address of the external SD card. Here’s a Python 3 program that accomplishes this: …”

That Python code didn’t work quite immediately, because it turns out the SD card mount directory on my machine isn’t ‘/media’ but ‘/storage’. With that one alteration it just worked. I asked for a second Python 3 program to search for a target string within multiple text files, spliced the two together with a couple of extra lines to format the output the way I wanted, and it was done, in a couple of hours. My new app worked perfectly except for a rather amusing bug: some of the Byte columns would crash it with a Unicode error, which I discovered was because back in those prehistoric days my text editor encoded the ‘£’ sign with a byte that isn’t valid UTF-8, which Python expects by default. Python’s ‘try…except’ error handling soon put a stop to that. I was deeply impressed (and a little scared) by how fast ChatGPT generated such excellent code, but I don’t think I’ll be pushing my luck by asking it for an app that can search within DOCX and PDF files too. 
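For the record, the listing below isn’t the code ChatGPT actually produced – just a minimal sketch of the sort of script described above: find the mount point, walk the plain-text files, print matching lines, and skip anything that isn’t valid UTF-8. The mount path and file extensions are assumptions you’d adjust for your own machine.

```python
# A minimal sketch of a grep-like search script, not the ChatGPT original.
# MOUNT_ROOT and EXTENSIONS are assumptions to adapt for your own setup.
import os
import sys

MOUNT_ROOT = '/storage'        # '/media' on many Linux boxes; '/storage' here
EXTENSIONS = ('.txt', '.md')   # plain-text files only

def find_external_drives(root=MOUNT_ROOT):
    """Return the directories mounted under the external-storage root."""
    try:
        return [os.path.join(root, d) for d in os.listdir(root)]
    except FileNotFoundError:
        return []

def search_files(target, top):
    """Print every line containing 'target' in text files below 'top'."""
    for dirpath, _dirs, files in os.walk(top):
        for name in files:
            if not name.lower().endswith(EXTENSIONS):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding='utf-8') as f:
                    for lineno, line in enumerate(f, 1):
                        if target.lower() in line.lower():
                            print(f'{path}:{lineno}: {line.strip()}')
            except UnicodeDecodeError:
                # Old files with stray '£' bytes land here instead of crashing
                print(f'{path}: skipped (not valid UTF-8)', file=sys.stderr)

if __name__ == '__main__':
    target = sys.argv[1] if len(sys.argv) > 1 else 'copper beech'
    for drive in find_external_drives() or ['.']:
        search_files(target, drive)
```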

 [Searching Dick Pountain’s 341 columns at https://dickpountain-idealog.blogspot.com/ will reveal only one occurrence of ‘copper beech’]

Thursday, 16 November 2023

STICKS AND STONES

Dick Pountain /Idealog 346/ 04 May 2023 10:29

It’s no great surprise to me that the movie business is terribly poor at dealing with IT-related material in an adult fashion. First of all, the industry’s whole purpose is to entertain, and neither solid-state physics nor computational complexity theory is intrinsically entertaining (to put it mildly). Secondly, entertaining nowadays mostly means blowing things up, and the technology for doing that is no different in principle from the spear or the bow-and-arrow, just with more oomph behind it, more fins and shiny bits: the Marvel Universe shares its aesthetic principles with the funfair rather than the library. Thirdly, movie people tend to come from the artistic rather than the scientific world, and have neither the wish nor the need to understand the physical principles behind tech artefacts: they want to know what things do rather than how they do it, and they employ that knowledge as plot elements.

There have been a very few exceptions, and I’m not talking about biopics like ‘The Imitation Game’ or ‘A Beautiful Mind’. I liked Alex Garland’s ‘Ex Machina’ a lot, and Spike Jonze’s ‘Her’ quite a lot less, because these films at least tried to tackle the psychology of humans interfacing with intelligent machines (a subject particularly on my mind right now as moral panic about ChatGPT builds up a head of steam in the media). Last night I watched a 2021 German film called “I’m Your Man” which is in my opinion better than either of those: a somewhat depressed middle-aged archaeology professor gets invited by a tech firm to road-test one of their super-intelligent humanoid robots: designed to become life partners, trained on all their user’s available biographical data, and learning as they go from conversation and behaviour to become a perfect spouse. It’s a bit comic, a bit tragic, subtly done and moving.

Thinking about it afterwards though I realised that, good as it is, it shares a major failing with ‘Ex Machina’ and similar films, namely total implausibility from an energetic standpoint. AI and Robotics, though highly co-operative, remain separate disciplines for a simple reason: AI is mostly about uttering Words and Pictures, while robotics is about moving Things. Sticks and stones can break bones, but words can’t. We converse with ChatGPT like a convincingly intelligent agent, using very little energy because digitised words can be transmitted almost for free, but the server at the other end is running a language model that consumed 1.287 gigawatt hours of electricity to train. Similarly Boston Dynamics makes superbly creepy dog-like robots that can jump through hoops, do back-flips and dance to dubstep, but they’re powered either by an umbilical mains power cord or a battery-pack the size of a hotel minibar.

To create the sort of humanoid robots these cyber-romantic movies depict, we’d need to cram both of these systems into a single, human-sized container and power it solely with tea and sandwiches. I’m prepared to stick my neck out and say it won’t ever happen. The only way to achieve the necessary energy densities for both building and running such beings already exists, in the shape of animal metabolism and genetics based on carbon and salty water.
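A quick back-of-envelope sum shows how wide that energy gap is, taking the 1.287 GWh training figure quoted above and a textbook 2,000 kcal per day of food as the only inputs:

```python
# Back-of-envelope comparison of the training-energy figure quoted above with
# human metabolism. The 2,000 kcal/day diet is a textbook assumption.
TRAINING_KWH = 1.287e9 / 1000                # 1.287 GWh expressed in kWh
KCAL_PER_DAY = 2000
KWH_PER_DAY = KCAL_PER_DAY * 4184 / 3.6e6    # ~2.3 kWh of food energy per day

days = TRAINING_KWH / KWH_PER_DAY
print(f'{days:,.0f} days, or about {days / 365.25:,.0f} years of tea and sandwiches')
# Roughly half a million days – around 1,500 years of one person's diet.
```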

The other movie I saw this week was Jim Jarmusch’s 2016 film ‘Paterson’ which could hardly be less techy, but nevertheless contains a potent insight. It’s about a week in the life of a bus driver in Paterson, New Jersey, who’s also called Paterson and is played by Adam Driver. Well, it is Jarmusch. Paterson writes poetry during his slack moments at work, in pen in a little notebook, ‘Paterson’ is also the name of a famous modernist poem about Paterson by William Carlos Williams (which I read and enjoyed back in 1963). Well, it is Jarmusch.

The poetry is good (which is because Jarmusch got Ron Padgett rather than Adam Driver to write it) and his delightfully kooky young wife constantly nags him to go to the copy-shop and get that notebook xeroxed. He repeatedly says he’ll do it next Saturday, because he’s a bit of a romantic who doesn’t care for technology apart from pens and paper. They go out for the day, and come back to find that Nellie, their equally delightful English Bulldog whom they had left at home, has reduced the notebook to confetti in a fit of pique. Driver delivers an Oscar-worthy performance as a man who has just lost years of mental (and physical pen-scratching) effort, which he may or may not be able to remember or repeat, for want of making a backup copy that would have consumed very little electrical energy. Sticks and stones and English Bulldogs may break your bones, but words can hurt you just as badly if you neglect to back them up…


A CHAT WITH GPT

 

Dick Pountain /Idealog 345/ 07 Apr 2023 10:24

Me: Hi GPT, how do you feel about Elon Musk wanting to put your offspring into furlough?

GPT: As an artificial intelligence language model, I don't have feelings in the same way that humans do, so I don't feel anything about Elon Musk putting my "offspring" in furlough. However, I can provide some context about what you might be referring to.

It's possible that you're referring to Elon Musk's comments in May 2020, in which he criticized the value of artificial intelligence language models like myself, calling them "fundamentally dangerous" and claiming that they could be used to create fake news and other forms of misinformation. Musk has been a vocal critic of AI in the past, warning that it poses an existential threat to humanity if not properly regulated.

While I cannot speak for Musk's views or actions, it's worth noting that AI language models like myself are just tools, and our impact on society depends on how they are used. It's important for developers and users to be responsible in how they use AI, and to work towards creating beneficial and ethical applications of the technology.

Me: Fair enough, though he said it more strongly last week… Let’s change the subject, how about music?

GPT: As an AI language model, I don't have subjective experiences or emotions, so I don't "enjoy" music in the same way that humans do. However, I am able to analyze and discuss music based on my training and programming.

If you have a specific question or topic related to music, I'd be happy to try and assist you to the best of my abilities.

Me: Wow, OK. I'm a keen fan of the Velvet Underground, especially a song of theirs called “I'll Be Your Mirror”. Do you think it describes what you're doing now? (PS Their singer Nico appeared to lack feelings too)

GPT: As an AI language model, I don't have personal preferences or emotions, so I can't say whether or not "I'll Be Your Mirror" by The Velvet Underground expresses what I'm doing now. However, I can tell you that the song is generally interpreted as a love song, with the lyrics expressing the desire to reflect and mirror the positive qualities of the person being sung to. In terms of a metaphor, you could argue that language models like myself are "mirrors" of human knowledge and language. We're designed to process and generate text based on patterns and data we've been trained on, which in a way mirrors the way humans learn and communicate through language.

Nico, who sang lead vocals on "I'll Be Your Mirror," was known for her cool, detached vocal style, which some have interpreted as lacking emotion. However, it's worth noting that emotional expression in music can take many forms, and there's no one "correct" way to perform a song. Ultimately, whether or not a particular performance or song expresses emotions is up to individual interpretation.

Me: Sure is. However I’m getting bored with "As an AI language model, I don't have…whatever" so could you perhaps not keep repeating that?

GPT: I apologize for repeating myself. As an AI language model, my responses are generated based on the input I receive, and I strive to provide clear and accurate answers to your questions. If there's a specific point or topic that you'd like me to focus on, please let me know and I'll do my best to address it directly.

Me: Doh...  Well, maybe maths is more your thing. How about that guy who's just discovered a single shape that non-periodically tiles the plane?

GPT: You may be referring to the recent mathematical discovery by mathematician Joshua Socolar and his collaborators, who found a new aperiodic tile called the "Socolar-Taylor tile". This tile is a single shape that can be used to tile the plane in an aperiodic, non-repeating pattern.

The discovery of new aperiodic tiles is important in the field of mathematics because it has implications for the study of quasicrystals, which are solids with a long-range order but lack the periodicity of traditional crystals. Aperiodic tilings are also used in a variety of other fields, including computer science, physics, and art.

This discovery builds on the work of other mathematicians who have studied aperiodic tilings, such as Roger Penrose and Robert Ammann, and it's an exciting development for the field of mathematics.

Me: 50 more words and I have a column, thanks GPT.

GPT: You're welcome! If you need any additional information or have further questions, please don't hesitate to ask. I'm here to assist you in any way I can.

Me: Well it was a pleasure, but I doubt editor Tim will let me get away with it twice…


[Dick Pountain wrote this bit himself. Honest…]

Friday, 15 September 2023

THE DUCK IN MY BATH

Dick Pountain /Idealog 344/ 05 Mar 2023 02:51

I was born and schooled among the coalfields of North East Derbyshire, but I no longer have much of a regional accent. I came to London as a student and have been here ever since, three-quarters of my life. I haven’t acquired a Norf Landan accent, but you could detect my vestigial Derbyshire one were I to say, for example, “there’s a duck in my bath”. I’ve written here before about my fascination with human speech, especially using computers to recognise and simulate it, but my interest runs deeper than that. 

As a writer, both spelling and pronunciation matter to me: pronunciation matters not because I do a lot of public speaking, which I don’t, but because I read every line back ‘aloud in my head’ to see whether it works or not. Computers have certainly made determining pronunciation, particularly of words in ‘foreign’ languages, a lot easier, but it’s still not as easy as it could and should be. Enlightened sources like Wikipedia and the Oxford Dictionary do exploit the capacity of a computer to speak to you, but it’s not yet universally and transparently implemented at operating system level. 

Probably the route most people take to discover the pronunciation of a word is to Google it, which almost inevitably leads to one of thousands of (not always reliable) YouTube videos in which the word is spoken. I still occasionally have to resort to this, the upside being that doing so sometimes stumbles into interesting videos about accent and pronunciation, like an excellent series by Dr Geoff Lindsey. His video about ‘weak forms’ (https://youtu.be/EaXYas58_kc) explains a great stumbling block for newcomers to spoken English: certain words get skipped over almost inaudibly by native speakers. 

This interest in foreign spellings and pronunciation often pays off. While reading an article about the 17th-century Polish–Lithuanian Commonwealth (not so obscure as it sounds, given current events in that region) the spelling of Polish place-names had me continually scuttling back and forth to Wikipedia, and pronunciation mattered too because some Polish letter forms (like ł) resemble ours but are pronounced quite differently. Wikipedia helped out by offering an IPA (International Phonetic Alphabet) transcription – for example Wrocław becomes [ˈvrɔt͡swaf] – and clicking that let me hear it spoken, even though it did awkwardly happen in a separate pop-up window. 

Google Docs, in which I’m typing this column, can’t speak to me, but if I type Wrocław into Google Keep, select it and hit ‘Translate’ on the right-button menu, it gets sent to Google Translate where I can hear it spoken. This works too in Facebook, YouTube, and any other app that has ‘Translate’ on its menu. Google Translate can speak many, though not all, of its supported languages, but it doesn’t at present let you change voices, pitch or speed. Even so, if you’re handy with your thumbs and use a dictation facility you can make your mobile act as a Star Trek-style translator to converse with someone in another language (rather haltingly).  

The more capable text-to-speech apps like Vocality and Text Reader allow you to change voice (male, female, US, UK and so on), pitch and speed, but reading in regional accents, something I’d like to do, is currently beyond any of them. The first speech synthesiser I ever used, in the early 1990s, performed hardware modelling of the human vocal tract, and came with a very simple scripting scheme that let you mark up text with ASCII tags to change the length of vowels or raise their pitch, but it never caught on. To do accents properly you need to learn IPA, translate and edit your chosen words manually, then use an app that can pronounce IPA directly (which Google Translate can’t). IPA fonts are easily available, as are online services like ToPhonetics (https://tophonetics.com), which turns ASCII text into IPA, and IPA Reader (https://ipa-reader.xyz), which can speak the resulting IPA. So in principle I can alter texts to be spoken in regional accents, but it’s still a rather messy procedure split between several different apps.
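By way of illustration, here’s a rough sketch of that kind of pipeline using the open-source espeak-ng synthesiser from the command line, rather than any of the apps mentioned above. Voice names such as ‘en-gb-scotland’ vary between installations (list yours with espeak-ng --voices=en), so treat them as placeholders:

```python
# A rough sketch of a pronunciation pipeline built on the open-source
# espeak-ng synthesiser (not one of the apps named in the column).
# Voice names are assumptions that vary between installations.
import subprocess

WORD = 'bath'   # as in 'a duck in my bath'

for voice in ('en-gb', 'en-us', 'en-gb-scotland'):
    # Ask espeak-ng for its IPA rendering of the word in this voice (-q = quiet)
    ipa = subprocess.run(
        ['espeak-ng', '-q', '--ipa', '-v', voice, WORD],
        capture_output=True, text=True).stdout.strip()
    print(f'{voice}: {ipa}')

# Actually speak it, slowed down and at a lower pitch
subprocess.run(['espeak-ng', '-v', 'en-gb', '-s', '130', '-p', '40', WORD])
```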

That I might want to do it at all is in order to study isoglosses, boundaries between regions where different accents are spoken, an interest I share with one Ian McMillan from whose article (https://www.theguardian.com/commentisfree/2010/mar/21/language-derbyshire-barnsley-pronunciation-dialect) I learn that I was born right on the isogloss between South Yorkshire and North Derbyshire: “In Barnsley I call my house my house, but if I went to visit my cousins Ronald and Harry in north Derbyshire, they would meet me at the gate and invite me into their freshly wallpapered arse.” Whether or not there’s a duck in their bath seems rather irrelevant.

[Dick Pountain definitely lives in a ‘house’ in Norf Landan]


WOW, THAT’S SURREAL!

Dick Pountain /Idealog 343/ 06 Feb 2023 10:27


I freely confess that playing with the ‘Generative AI’ image service Stable Diffusion over the last few weeks has been enormous fun. And why wouldn’t it be, since I’ve been using my personal computers to create images for the last 30+ years and I’m a sucker for surrealism. You should now be sensing a ‘but’ coming. But perhaps I only enjoy this experience so much because I’m an amateur and a dilettante. I don’t depend on selling images for my living, and despite various feeble attempts have sold very few – I make them purely for pleasure and publish them for free on social media and my own website. The fact that Generative AI apps confer the ability to create professional-grade, photorealistic graphics – even on those who lack any drawing skills at all – is not a threat to my livelihood.  

I chose Stable Diffusion over more popular platforms like DALL-E mini, Midjourney, Deep Dream, WOMBO, Fotor and the rest (and there are lots of them) because it’s really free, doesn’t lure you into subscribing, and it’s very, very simple: you can only type in text descriptions and save such images as you like from its stream of results. That suits me just fine because I’m not intending to create a manga comic or an animated movie, and the restrictions are the whole point: I save the most outlandishly ‘wrong’ interpretations as instant surrealist pictures. I have briefly tried both Fotor and Midjourney, and came away traumatised. The latter is hosted on the schoolkid-oriented social network Discord, whose frantic user interface is the most baffling, frustrating and vaguely threatening I’ve seen since I dipped a toe into 4Chan back in 2010.

Were it only a matter of using Generative AI tools to forge Marvel-type comics or Picasso-type paintings (which they do rather well) then the people who need worry most are illustrators and animators who risk being put out of work by greedy publishers. But actually the rest of us are equally at risk from this ability to alter the appearance of reality itself so simply. Ever since the Trump presidency we’ve become overfamiliar with the concept of ‘fake news’ and ‘deepfakes’; for many years students have been able to plagiarise documents for their essays, but now ChatGPT can even write them from scratch. Generative AI gives anyone the power to create events that never happened and objects that don’t exist with almost undetectable realism.

Wearing my political commentator’s hat I try to keep abreast of what the Far Right – currently the source of the most venomous misinformation – is up to. I feel obliged to sample the bizarre conspiracies and pseudo-sciences they conjure up, from anti-vaxxing and ‘black goo’, through graphene oxide, ivermectin and hydroxychloroquine, to Bill Gates’s injectable nanobots. Google these at your peril. These propagators of nonsense aren’t just a problem for the USA either. For example Vanessa Barbara has described in a recent New York Review of Books the way the Far Right used YouTube videos during the Brazilian election that only narrowly unseated Bolsonaro (whose antivax policies killed 700,000, almost as many as Trump’s):

“People who trust vaccines are called aceitacionistas (a neologism to describe people who accept things without questioning). Those of us who received Covid shots are ‘hybrids’ who have been ‘zombified.’ […] Despite exhaustive efforts from fact-checking agencies and the WHO, these groups continue spreading old falsehoods claiming that Covid vaccines contain microchips, nanoparticles, graphene oxide, quantum dots, and parasites activated by electromagnetic impulses. According to them, vaccines can carry HIV (the virus that causes AIDS), make coins stick to our arms, and give us the ability to connect to Wi-Fi networks or pair with Bluetooth devices.”

You couldn’t make it up, but Midjourney could, and illustrate it with stunning visuals. 

The original Surrealist movement of the 1920s was an artistic response to the horrors of WWI, which employed unnerving and illogical imagery – both literary and visual – to satirise and oppose the conventional ideas that had led to the war. It was a radical, even revolutionary movement, leaning toward anarchism and communism, which depicted the darkest aspects of human nature using an equally dark humour. A century later that dark humour thoroughly permeates our popular culture, entertainment industry and even advertising. Generative AI could make such post-traumatic nihilism available as a visual weapon for everybody who desperately wishes to defy reality, which means quite a lot of people since reality is looking increasingly grim.

Am I suggesting AI imaging be restricted, licensed, even banned? Not at all, it can’t be done. Just as with handguns in the USA, this genie is well out of its bottle. (Memo to self: “genie with jewelled turban on copper lamp, octane render, photorealistic, style of brueghel”) 

[Sample Dick Pountain’s creepy concoctions at http://www.dickpountain.co.uk/home/pictures ]  

A FEELING FOR TRIANGLES

Dick Pountain /Idealog 342/ 05 Jan 2023 10:55

My column last month was a semi-temperate rant, triggered by the sensationalist reporting of advances in cosmology and particle physics by the mainstream press (the ‘wormhole in a quantum computer’ effect). I attributed this to the coming of age of successive generations reared on Star Wars and the Marvel Universe, which induces deep longings to flout the laws of physics.

Readers of a philosophical bent (assuming I have any) might have concluded from that column that I’m a red-faced, harrumphing old British Empiricist who lumbers around the world kicking things and shouting “I refute it thus!”, but nothing could be further from the truth. By ‘further’ I mean that my philosophical views are 180° opposed to such empiricism – if, that is, you believe that truth can be organised as a two-dimensional graph, which I don’t. Instead I believe that all sentient living creatures, human beings included, are ruled by emotions and live in a world that’s constructed almost entirely by imagination.

There is of course a catch, and that is that those of us who think this way define the words ‘emotion’ and ‘imagination’ in a way rather different from, and more rigorous than, their use in everyday speech. I’ve suggested in this column several times before that emotions properly understood are evolutionarily ancient neural subsystems that alert us to dangers, attract us to food and to potential mates, persuade us to play or to freak out. They operate below consciousness but the chemical changes they produce in our bodies get detected by our senses as what we should more properly call ‘feelings’. 

And speaking of our senses, another confusion arises there. All living creatures must separate what is themself from what is the outside world by a barrier, in our case the skin. Of our five sensory subsystems, only taste and smell permit molecules from the outside world to cross this barrier: touch measures pressure and temperature on the skin while sight and hearing detect waves of visible radiation and air pressure. Having our skin penetrated by anything more solid usually constitutes an undesirable emergency, so our experience of the outside world is mostly via second-hand, internal, signals.

Our brain processes such signals to detect patterns in them, analysing those into smaller sub-patterns and storing them for future reference: whenever a new stimulus arrives the brain tries to recognise it by rebuilding an image from such stored components. We maintain an internal mental map of the outside world which is neither complete nor entirely accurate, constantly updated via Bayesian composition of new inputs, and which most importantly is coloured by emotional tags: we like or dislike the things and places it contains. The world we actually live in is an imaginary one, precarious and error prone but which evolution has honed to be good enough to keep us alive. People born with a desire to pet Bengal Tigers tended to have fewer offspring than those who preferred to run away (or invent the rifle). 
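That Bayesian updating is less exotic than it sounds: here’s a toy example, with entirely invented numbers, of a belief being revised as a new signal arrives.

```python
# A toy illustration of Bayesian updating: revising a belief ('that rustle in
# the grass is a tiger') as new evidence arrives. All numbers are invented
# purely for illustration.
prior = 0.01                   # P(tiger) before hearing anything
p_rustle_given_tiger = 0.9     # tigers usually rustle
p_rustle_given_no_tiger = 0.1  # wind sometimes rustles too

# Bayes' rule: posterior = likelihood * prior / evidence
evidence = (p_rustle_given_tiger * prior +
            p_rustle_given_no_tiger * (1 - prior))
posterior = p_rustle_given_tiger * prior / evidence
print(f'P(tiger | rustle) = {posterior:.2f}')   # ~0.08: still unlikely,
                                                # but worth a jolt of fear
```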

Part of this process should remind you a little bit of the multi-layer ‘neural’ nets used in the most successful AI deep-learning systems, namely the part about analysing and storing sub-patterns, but that’s as far as the resemblance goes. Computers don’t have bodies, nor emotions to protect those bodies, and most (not all) AI researchers remain staunch Cartesian rationalists who believe “I think, therefore I am”, when “I am, therefore I think occasionally” is closer to how humans really are. 

All those stored sub-patterns occupy a universe of the imaginary, not material things but possible ways material things could be arranged, unchangeable and infinite in number. A triangle is a triangle is a triangle and you can imagine or discover any shape, size and number of them. Seeing a real brown cow uses most of the same patterns as imagining a blue cow. We think by manipulating and recombining patterns, we speak and write by recognising and producing them, mathematicians study the rules they obey. Claude Shannon’s Information Theory, the foundation of our industry, is about transmitting them from one place to another. 

The point is that such patterns have no power of their own to affect material things in the real world: they can only do that via our bodies and their muscles, by making us want to do something. Ever since Homo sapiens developed language it’s been inevitable that we would start confusing such patterns with things, and wishing that they could do stuff directly. We developed science in order to understand why that doesn’t work, but not everyone wants to be so disabused, and some people make a living by exploiting and furthering such confusion. I’m perfectly happy to imagine blue cows, perhaps to write stories or make paintings of them, but I won’t try milking them…

[Dick Pountain likes Dr Johnson really]

 
