My columns for PC Pro magazine, posted here six months in arrears for copyright reasons
Thursday, 16 November 2023
STICKS AND STONES
It’s no great surprise to me that the movie business is terribly poor at dealing with IT-related material in an adult fashion. First of all, the industry’s whole purpose is to entertain, and neither solid-state physics nor computational complexity theory is intrinsically entertaining (to put it mildly). Secondly, entertaining nowadays mostly means blowing things up, and the technology for doing that is no different in principle from the spear or the bow-and-arrow, just with more oomph behind it, more fins and shiny bits: the Marvel Universe shares its aesthetic principles with the funfair rather than the library. Thirdly, movie people tend to come from the artistic rather than the scientific world and have neither the wish nor the need to understand the physical principles behind tech artefacts: they want to know what things do rather than how they do it, and they employ that knowledge as plot elements.
There have been a very few exceptions, and I’m not talking about biopics like ‘The Imitation Game’ or ‘A Beautiful Mind’. I liked Alex Garland’s ‘Ex Machina’ a lot, and Spike Jonze’s ‘Her’ quite a lot less, because these films at least tried to tackle the psychology of humans interfacing with intelligent machines (a subject particularly on my mind right now as moral panic about ChatGPT builds up a head of steam in the media). Last night I watched a 2021 German film called “I’m Your Man” which is in my opinion better than either of those: a somewhat depressed middle-aged archaeology professor gets invited by a tech firm to road-test one of its super-intelligent humanoid robots, which are designed to become life partners, trained on all their user’s available biographical data, and learning as they go from conversation and behaviour to become a perfect spouse. It’s a bit comic, a bit tragic, subtly done and moving.
Thinking about it afterwards, though, I realised that, good as it is, it shares a major failing with ‘Ex Machina’ and similar films, namely total implausibility from an energetic standpoint. AI and robotics, though highly co-operative, remain separate disciplines for a simple reason: AI is mostly about uttering Words and Pictures, while robotics is about moving Things. Sticks and stones can break bones, but words can’t. We converse with ChatGPT as if with a convincingly intelligent agent, using very little energy because digitised words can be transmitted almost for free, but the server at the other end is running a language model that consumed 1.287 gigawatt hours of electricity to train. Similarly, Boston Dynamics makes superbly creepy dog-like robots that can jump through hoops, do back-flips and dance to dubstep, but they’re powered either by an umbilical mains power cord or a battery pack the size of a hotel minibar.
To create the sort of humanoid robots these cyber-romantic movies depict, we’d need to cram both of these systems into a single, human-sized container and power it solely with tea and sandwiches. I’m prepared to stick my neck out and say it won’t ever happen. The only way to achieve the necessary energy densities for both building and running such beings already exists, in the shape of animal metabolism and genetics based on carbon and salty water.
The other movie I saw this week was Jim Jarmusch’s 2016 film ‘Paterson’, which could hardly be less techy but nevertheless contains a potent insight. It’s about a week in the life of a bus driver in Paterson, New Jersey, who’s also called Paterson and is played by Adam Driver. Well, it is Jarmusch. Paterson writes poetry during his slack moments at work, in pen in a little notebook. ‘Paterson’ is also the name of a famous modernist poem about Paterson by William Carlos Williams (which I read and enjoyed back in 1963). Well, it is Jarmusch.
The poetry is good (because Jarmusch got Ron Padgett rather than Adam Driver to write it) and Paterson’s delightfully kooky young wife constantly nags him to go to the copy-shop and get that notebook xeroxed. He repeatedly says he’ll do it next Saturday, because he’s a bit of a romantic who doesn’t care for technology apart from pens and paper. They go out for the day, and come back to find that Nellie, their equally delightful English Bulldog whom they had left at home, has reduced the notebook to confetti in a fit of pique. Driver delivers an Oscar-worthy performance as a man who has just lost years of mental (and physical, pen-scratching) effort, which he may or may not be able to remember or repeat, for want of making a backup copy that would have consumed very little electrical energy. Sticks and stones and English Bulldogs may break your bones, but words can hurt you just as badly if you neglect to back them up…
A CHAT WITH GPT
Dick Pountain /Idealog 345/ 07 Apr 2023 10:24
Me: Hi GPT, how do you feel about Elon Musk wanting to put your offspring into furlough?
GPT: As an artificial intelligence language model, I don't have feelings in the same way that humans do, so I don't feel anything about Elon Musk putting my "offspring" in furlough. However, I can provide some context about what you might be referring to.
It's possible that you're referring to Elon Musk's comments in May 2020, in which he criticized the value of artificial intelligence language models like myself, calling them "fundamentally dangerous" and claiming that they could be used to create fake news and other forms of misinformation. Musk has been a vocal critic of AI in the past, warning that it poses an existential threat to humanity if not properly regulated.
While I cannot speak for Musk's views or actions, it's worth noting that AI language models like myself are just tools, and our impact on society depends on how they are used. It's important for developers and users to be responsible in how they use AI, and to work towards creating beneficial and ethical applications of the technology.
Me: Fair enough, though he said it more strongly last week… Let’s change the subject, how about music?
GPT: As an AI language model, I don't have subjective experiences or emotions, so I don't "enjoy" music in the same way that humans do. However, I am able to analyze and discuss music based on my training and programming.
If you have a specific question or topic related to music, I'd be happy to try and assist you to the best of my abilities.
Me: Wow, OK. I'm a keen fan of the Velvet Underground, especially a song of theirs called “I'll Be Your Mirror”. Do you think it describes what you're doing now? (PS Their singer Nico appeared to lack feelings too)
GPT: As an AI language model, I don't have personal preferences or emotions, so I can't say whether or not "I'll Be Your Mirror" by The Velvet Underground expresses what I'm doing now. However, I can tell you that the song is generally interpreted as a love song, with the lyrics expressing the desire to reflect and mirror the positive qualities of the person being sung to. In terms of a metaphor, you could argue that language models like myself are "mirrors" of human knowledge and language. We're designed to process and generate text based on patterns and data we've been trained on, which in a way mirrors the way humans learn and communicate through language.
Nico, who sang lead vocals on "I'll Be Your Mirror," was known for her cool, detached vocal style, which some have interpreted as lacking emotion. However, it's worth noting that emotional expression in music can take many forms, and there's no one "correct" way to perform a song. Ultimately, whether or not a particular performance or song expresses emotions is up to individual interpretation.
Me: Sure is. However I’m getting bored with "As an AI language model, I don't have…whatever" so could you perhaps not keep repeating that?
GPT: I apologize for repeating myself. As an AI language model, my responses are generated based on the input I receive, and I strive to provide clear and accurate answers to your questions. If there's a specific point or topic that you'd like me to focus on, please let me know and I'll do my best to address it directly.
Me: Doh... Well, maybe maths is more your thing. How about that guy who's just discovered a single shape that non-periodically tiles the plane?
GPT: You may be referring to the recent mathematical discovery by mathematician Joshua Socolar and his collaborators, who found a new aperiodic tile called the "Socolar-Taylor tile". This tile is a single shape that can be used to tile the plane in an aperiodic, non-repeating pattern.
The discovery of new aperiodic tiles is important in the field of mathematics because it has implications for the study of quasicrystals, which are solids with a long-range order but lack the periodicity of traditional crystals. Aperiodic tilings are also used in a variety of other fields, including computer science, physics, and art.
This discovery builds on the work of other mathematicians who have studied aperiodic tilings, such as Roger Penrose and Robert Ammann, and it's an exciting development for the field of mathematics.
Me: 50 more words and I have a column, thanks GPT.
GPT: You're welcome! If you need any additional information or have further questions, please don't hesitate to ask. I'm here to assist you in any way I can.
Me: Well it was a pleasure, but I doubt editor Tim will let me get away with it twice…
[Dick Pountain wrote this bit himself. Honest…]
Friday, 15 September 2023
THE DUCK IN MY BATH
Dick Pountain /Idealog 344/ 05 Mar 2023 02:51
I was born and schooled among the coalfields of North East Derbyshire, but I no longer have much of a regional accent. I came to London as a student and have been here ever since, three-quarters of my life. I haven’t acquired a Norf Landan accent, but you could detect my vestigial Derbyshire one were I to say, for example, “there’s a duck in my bath”. I’ve written here before about my fascination with human speech, especially using computers to recognise and simulate it, but my interest runs deeper than that.
As a writer I care about both spelling and pronunciation: pronunciation matters not because I do a lot of public speaking, which I don’t, but because I read every line back ‘aloud in my head’ to see whether it works or not. Computers have certainly made determining pronunciation, particularly of words in ‘foreign’ languages, a lot easier, but it’s still not as easy as it could and should be. Enlightened sources like Wikipedia and the Oxford Dictionary do exploit the capacity of a computer to speak to you, but it’s not yet universally and transparently implemented at operating-system level.
Probably the route most people take to discover the pronunciation of a word is to Google it, which almost inevitably leads to one of thousands of (not always reliable) YouTube videos in which the word is spoken. I still occasionally have to resort to this, the upside being that I sometimes stumble across interesting videos about accent and pronunciation, like an excellent series by Dr Geoff Lindsey. His video about ‘weak forms’ (https://youtu.be/EaXYas58_kc) explains a great stumbling block for those new to speaking English: that certain words get skipped over almost inaudibly by native speakers.
This interest in foreign spellings and pronunciation often pays off. While reading an article about the 17th-century Polish–Lithuanian Commonwealth (not so obscure as it sounds given current events in that region) the spelling of Polish place-names had me continually scuttling back and forth to Wikipedia, and pronunciation mattered too because some Polish letter forms (like ł) resemble ours but are pronounced quite differently. Wikipedia helped out by offering an IPA (International Phonetic Alphabet) transcription – for example Wrocław becomes [ˈvrɔt͡swaf] – and clicking that let me hear it spoken, even though it did awkwardly happen in a separate pop-up window.
Google Docs, in which I’m typing this column, can’t speak to me, but if I type Wrocław into Google Keep, select it and hit ‘Translate’ on the right-button menu, it gets sent to Google Translate where I can hear it spoken. This also works in Facebook, YouTube and any other app that has ‘Translate’ on its menu. Google Translate can speak many, though not all, of its supported languages, but it doesn’t at present let you change voices, pitch or speed. Even so, if you’re handy with your thumbs and use a dictation facility, you can make your mobile act as a Star Trek-style translator to converse with someone in another language (rather haltingly).
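For the terminally keyboard-bound, the same Google Translate voice can be driven from a short script rather than via the Keep detour. What follows is only a sketch, leaning on the unofficial third-party gTTS Python package, which isn’t something I’ve mentioned above and isn’t supported by Google, so treat the details as illustrative rather than gospel.

```python
# Sketch only: fetch Google Translate's spoken rendering of a word via the
# unofficial gTTS package (pip install gTTS) and save it as an MP3, to be
# played with whatever audio player is to hand.
from gtts import gTTS

def say_in(text: str, lang: str, outfile: str = "spoken.mp3") -> str:
    """Save Google Translate's pronunciation of `text` in language `lang`."""
    gTTS(text=text, lang=lang).save(outfile)
    return outfile

if __name__ == "__main__":
    print(say_in("Wrocław", "pl"))  # a Polish voice saying 'Wrocław'
```

It shares Google Translate’s limitation, though: you get whichever single voice Google offers for that language, with no say over pitch or speed.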
The more capable text-to-speech apps like Vocality and Text Reader allow you to change voice (male, female, US, UK and so on), pitch and speed, but reading in regional accents, something I’d like to do, is currently beyond any of them. The first speech synthesiser I ever used, in the early 1990s, performed hardware modelling of the human vocal tract and came with a very simple scripting scheme that let you mark up text with ASCII tags to change the length of vowels or raise their pitch, but it never caught on. To do accents properly you’ll need to learn IPA, translate and edit your chosen words manually, then use an app that can pronounce IPA directly (which Google Translate can’t). IPA fonts are easily available, as are online services like ToPhonetics (https://tophonetics.com), which turns ASCII text into IPA, and IPA Reader (https://ipa-reader.xyz), which can speak the resulting IPA. So in principle I can alter texts to be spoken in regional accents, but it’s still a rather messy procedure split between several different apps.
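The messiness can be reduced a little with the open-source espeak-ng synthesiser, which will both emit an IPA transcription and speak text in a handful of roughly regional British voices. Again, this is only a sketch and not one of the apps mentioned above: it assumes espeak-ng is installed and on your PATH, and the flags and voice names shown may vary between versions.

```python
# Sketch of the text -> IPA -> speech pipeline using espeak-ng from Python.
# Assumes the espeak-ng command-line tool is installed; the flags used here
# (-q, --ipa, -v) and the voice names reflect one particular install.
import subprocess

def to_ipa(text: str, voice: str = "en-gb") -> str:
    """Return espeak-ng's IPA transcription of `text` without speaking it."""
    result = subprocess.run(
        ["espeak-ng", "-v", voice, "-q", "--ipa", text],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def speak(text: str, voice: str = "en-gb-scotland") -> None:
    """Speak `text` aloud in a (very roughly) regional voice."""
    subprocess.run(["espeak-ng", "-v", voice, text], check=True)

if __name__ == "__main__":
    print(to_ipa("There's a duck in my bath"))
    speak("There's a duck in my bath")
```

Its idea of a regional accent is far cruder than the real isoglosses discussed below, but it’s about as close as an off-the-shelf synthesiser currently gets.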
The reason I might want to do this at all is to study isoglosses, the boundaries between regions where different accents are spoken, an interest I share with one Ian McMillan, from whose article (https://www.theguardian.com/commentisfree/2010/mar/21/language-derbyshire-barnsley-pronunciation-dialect) I learn that I was born right on the isogloss between South Yorkshire and North Derbyshire: “In Barnsley I call my house my house, but if I went to visit my cousins Ronald and Harry in north Derbyshire, they would meet me at the gate and invite me into their freshly wallpapered arse.” Whether or not there’s a duck in their bath seems rather irrelevant.
[Dick Pountain definitely lives in a ‘house’ in Norf Landan]
A FEELING FOR TRIANGLES
Dick Pountain /Idealog 342/ 05 Jan 2023 10:55
My column last month was a semi-temperate rant, triggered by the sensationalist reporting of advances in cosmology and particle physics by the mainstream press (the ‘wormhole in a quantum computer’ effect). I attributed this to the coming of age of successive generations reared on Star Wars and the Marvel Universe, which induces deep longings to flout the laws of physics.
Readers of a philosophical bent (assuming I have any) might have concluded from that column that I’m a red-faced, harrumphing old British Empiricist who lumbers around the world kicking things and shouting “I refute it thus!”, but nothing could be further from the truth. By ‘further’ I mean that my philosophical views are 180° opposed to such empiricism (if, that is, you believe truth can be organised as a two-dimensional graph, which I don’t). Instead I believe that all sentient living creatures, human beings included, are ruled by emotions and live in a world that’s constructed almost entirely by imagination.
There is of course a catch, and that is that those of us who think this way define the words ‘emotion’ and ‘imagination’ in a way rather different from, and more rigorous than, their use in everyday speech. I’ve suggested in this column several times before that emotions properly understood are evolutionarily ancient neural subsystems that alert us to dangers, attract us to food and to potential mates, persuade us to play or to freak out. They operate below consciousness but the chemical changes they produce in our bodies get detected by our senses as what we should more properly call ‘feelings’.
And speaking of our senses, another confusion arises there. All living creatures must separate themselves from the outside world by a barrier, in our case the skin. Of our five sensory subsystems, only taste and smell permit molecules from the outside world to cross this barrier: touch measures pressure and temperature at the skin, while sight and hearing detect waves of visible radiation and of air pressure. Having our skin penetrated by anything more solid usually constitutes an undesirable emergency, so our experience of the outside world comes mostly via second-hand, internal signals.
Our brain processes such signals to detect patterns in them, analysing those into smaller sub-patterns and storing them for future reference: whenever a new stimulus arrives the brain tries to recognise it by rebuilding an image from such stored components. We maintain an internal mental map of the outside world which is neither complete nor entirely accurate, constantly updated via Bayesian composition of new inputs, and which most importantly is coloured by emotional tags: we like or dislike the things and places it contains. The world we actually live in is an imaginary one, precarious and error prone but which evolution has honed to be good enough to keep us alive. People born with a desire to pet Bengal Tigers tended to have fewer offspring than those who preferred to run away (or invent the rifle).
Part of this process should remind you a little bit of the multi-layer ‘neural’ nets used in the most successful AI deep-learning systems, namely the part about analysing and storing sub-patterns, but that’s as far as the resemblance goes. Computers don’t have bodies, nor emotions to protect those bodies, and most (not all) AI researchers remain staunch Cartesian rationalists who believe “I think, therefore I am”, when “I am, therefore I think occasionally” is closer to how humans really are.
All those stored sub-patterns occupy a universe of the imaginary: not material things but possible ways material things could be arranged, unchangeable and infinite in number. A triangle is a triangle is a triangle, and you can imagine or discover any shape, size and number of them. Seeing a real brown cow uses most of the same patterns as imagining a blue cow. We think by manipulating and recombining patterns; we speak and write by recognising and producing them; mathematicians study the rules they obey. Claude Shannon’s Information Theory, the foundation of our industry, is about transmitting them from one place to another.
The point is that such patterns have no power of their own to affect material things in the real world: that they can do only via our body and its muscles, by making us want to do something. Ever since Homo sapiens developed language it’s been inevitable that we would start confusing such patterns with things, and wishing that they could do stuff directly. We developed science in order to understand why that doesn’t work, but not everyone wants to be so disabused, and some people make a living by exploiting and furthering such confusion. I’m perfectly happy to imagine blue cows, perhaps to write stories or make paintings of them, but I won’t try milking them…
[Dick Pountain likes Dr Johnson really]