Friday, 18 April 2025

ARTY FACTS

Dick Pountain /Idealog 363/ 05 Oct 2024 03:05

When I’m not writing this column, which let’s face it is most of the time, I perform a variety of activities to keep me amused. Apart from walking, cooking, reading, listening to music (live and recorded), playing and fettling guitars, I have a couple of computer-based ‘hobbies’, namely making computer-generated music and computer-generated pictures (non-moving). I recently restarted work on my Python-based computer composition system Algorhythmics – which I described here back in issue 306 – after a four-year rest. What prompted me was watching a YouTube video about the Indian mathematician D. R. Kaprekar and some interesting numbers he discovered, so I set up Algorhythmics to compose a short piano sonata around two of his numbers, and it sounds like a fairly pleasing mashup of Ravel, Janáček and Satie.
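
Which two of his numbers ended up in the piece you can judge from the SoundCloud link below, but Kaprekar’s best-known discovery is the constant 6174: take any four-digit number whose digits aren’t all the same, sort its digits into descending and ascending order, subtract the smaller from the larger, and repeat – you always land on 6174 within seven steps. Here’s a minimal Python sketch of that routine (purely illustrative, not code from Algorhythmics):

# Kaprekar's routine: for any four-digit number whose digits aren't all
# equal, subtract the digits-ascending number from the digits-descending
# one and repeat; the sequence always reaches 6174, Kaprekar's constant.
def kaprekar_steps(n: int) -> list[int]:
    seq = [n]
    while seq[-1] != 6174:
        digits = f"{seq[-1]:04d}"                     # keep leading zeros
        hi = int("".join(sorted(digits, reverse=True)))
        lo = int("".join(sorted(digits)))
        seq.append(hi - lo)
    return seq

print(kaprekar_steps(3524))   # [3524, 3087, 8352, 6174]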

I’ve been documenting my efforts at visual art in this column for over 20 years as I marched through successive generations of paint software from Paintshop Pro to Photoshop Elements to Sumo Paint to Artflow. I loved art at school and can use both paint and pencil reasonably well, but I’ve never been tempted to use either seriously since I discovered the computer (any more than I’m tempted to write articles with a quill pen since I discovered the word-processor). The crux is editability: once you discover the infinite flexibility of digital imagery it’s hard to give up this ability to experiment, redo and correct without wasting paper, canvas and paint. Images on a screen certainly lack the texture of proper paintings, but then I’m strictly an amateur with no realistic ambition to make a living selling my work.

I’ve also been into photography ever since the 1960s and my computer explorations began by processing snaps to make them look like paintings, which taught me how to use layers and blend-modes to take total control of both the colour and tonal content. Later I began using this knowledge to create purely abstract images.
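
For anyone who hasn’t poked under the hood of a paint package, a blend mode is nothing more exotic than per-pixel arithmetic on two layers. Here’s a rough sketch of two classic modes, ‘multiply’ and ‘screen’, with an opacity slider – a toy function of my own, not anything lifted from Photoshop Elements, Sumo Paint or Artflow:

# Toy blend-mode sketch, assuming 8-bit RGB images already loaded as
# NumPy arrays (e.g. via Pillow: np.asarray(Image.open("base.png"))).
import numpy as np

def blend(base: np.ndarray, layer: np.ndarray, mode: str = "multiply",
          opacity: float = 1.0) -> np.ndarray:
    b = base.astype(np.float32) / 255.0
    l = layer.astype(np.float32) / 255.0
    if mode == "multiply":                  # darkens: black stays black
        out = b * l
    elif mode == "screen":                  # lightens: white stays white
        out = 1.0 - (1.0 - b) * (1.0 - l)
    else:
        raise ValueError(f"unknown mode {mode!r}")
    out = (1.0 - opacity) * b + opacity * out    # simple opacity slider
    return (out * 255.0).round().astype(np.uint8)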

It won’t have escaped anyone in the habit of watching Instagram, Facebook or YouTube reels that there’s a remarkable revival of abstract painting going on right now among the social-media-savvy younger generations. Unlike me they’re not working in the digital domain but rather in the messy and expensive domain of wet paint. Under various labels like fluidart, spinart and poured art they’re making dynamic action paintings – Jackson Pollock style – by pouring multicoloured acrylic paints onto a canvas, manipulating it using palette knives and then spinning and tipping it to produce a finished image. Often they incorporate silicone-based additives that introduce cell-like patterns of bubbles, and the results have a very particular biological look. Some of them are very attractive indeed and nearly all are highly decorative. I doubt that many of these folk consider their products to be high art but they are highly saleable, and that’s on top of any revenues that derive from successful social media hits, which is just as well, since the costs in canvas and acrylic paint must be considerable.

I'm very keen myself on early 20th-century modernist abstract art, especially Wassily Kandinsky, Paul Klee and Sonia Delaunay. I don’t set out to imitate their works but merely play around using digital processing on a starting image, which could be a photograph, a clip from a website or a digital image that I draw by hand. Mostly I just use a mouse nowadays (I’ve had several Wacom tablets in the past) since my starters are so simple. Another way I sometimes start is by using Zen Brush 3 on my Samsung Galaxy Tab, a delightful finger-painting program with extraordinarily realistic watercolour bleeding effects, and then sending the result to my Chromebook via Android’s Nearby Share. Then I spend some time layering, blending, smudging and slicing until I see something I like, which does indeed tend to mean something that reminds me of one of my modernist mentors.

I’m not at all tempted to go in for fluidart myself to make money, even though those attractive canvases are more readily marketable than digital prints. That’s not only because I don’t have a garage in which I can splatter paint up the walls, but also because whenever I watch these artists at work on Facebook, more often than not I find the earliest part of a new work the most pleasing – then they keep adding too many colours and overdo it. So, while watching one particularly spectacular piece I hit ‘Watch Again’, then hit || to pause it at an early stage I liked better, took a screenshot and used that as a starter for my own piece! This potentially raises a novel legal issue about copyright and plagiarism: I froze a moment in time that didn’t make it into the final painting, so was I really stealing…?

[You can see six of Dick Pountain’s abstracts at https://www.facebook.com/dick.pountain/posts/pfbid02UBtGRbAU7aTLSPYjeyTebxJUjMJ6EME6cKd5iqYBsYcdbaPCPrUNxZNqJhE48rSKl and hear his Kaprekar tune at https://soundcloud.com/dick-pountain/kaprekar-sonatina]


  

SMELL U LATER

Dick Pountain /Idealog 362/ 05 Sep 2024 11:5

I have no qualms in claiming that I have a better (or at least better-trained) sense of smell than the average citizen. That’s partly, maybe mostly, because I studied organic chemistry in the 1960s. During my first few weeks of working in the cavernous Victorian college lab I was instructed to learn the odours of a dozen commonly encountered chemicals, and advised to employ smell as the first step in recognising any new compound. I can often tell an aldehyde from a ketone by sniffing, and became briefly addicted to ionone, cinnamaldehyde and menthol in succession, carrying little specimen tubes in my pocket. I’m sure this method is no longer taught, on health-and-safety grounds, as there are many substances nowadays that can kill at one sniff.

In later life this training came in handy when my brother-in-law Pip founded The Scotch Malt Whisky Society and was writing a book that needed to categorise the noses of various famous spirits. Of course odour is now a huge business, not merely for perfumes as it has been for centuries but for those hundreds of flavourings contained in most supermarket foods, which are manufactured in a huge chemical works in New Jersey. But smell has barely impinged upon the computer business so far, apart from the smell of burning insulation, which most of us quickly learn to recognise (and investigate…).

I wrote semi-humorously in an earlier column about the possibility of a ‘sminter’ loaded with an assortment of smelly ‘inks’ that could be triggered via internet messaging, and I even got a letter some years later about an (unsuccessful) attempt at one. Even Hollywood took a brief stab at a Smell-o-Vision movie (‘Scent of Mystery’ by Mike Todd Jr., 1960, in case you’re interested), but the obstacles in both cases were the same: smell is a chemical, not an electronic, signal, and it moves at the speed of breeze rather than light – and you can’t just switch it off quickly either…

But a far more serious obstacle is that while the components of human light perception are threefold – red, green and blue retinal cells – the components of smell perception are vastly more numerous. Our noses contain at least 400 different chemical receptors, and individual smells are recognised by trillions of combinations of their outputs, which release a plethora of proteins that are still not entirely understood.
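
The arithmetic alone explains why ‘trillions’ is, if anything, an understatement. This is just counting, not a model of olfaction, but if a smell were characterised merely by which receptor types respond, even small combinations of 400 receptors run away from you:

# Back-of-envelope counting only, not a model of how olfaction works:
# the number of ways to pick k responding receptor types out of ~400.
from math import comb

for k in (2, 3, 5, 10):
    print(f"any {k:>2} of 400 receptor types: {comb(400, k):,} combinations")
# even k = 5 already gives roughly 8 x 10**10 possible combinations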

But when you hear the word trillions nowadays, you’re usually talking about a GPT (or at least about NVidia’s market cap). Understanding smell perception, like protein folding and DNA sequencing, is a perfect candidate for AI to analyse, so it comes as no surprise to learn (via an article in Nature https://www.nature.com/articles/d41586-024-02833-4) that many teams are working toward this end, with ample financing from industry.

The problem has several aspects, which include: predicting the smell of a molecule from its structure; decoding the output of human odour sensors for particular compounds; and automating comparison of the smells of different mixtures by identifying their components. The current hot variant of AI – the Generative Pre-trained Transformer (GPT) – works using the mathematics of parameter spaces: identify the important parameters of the subject to be analysed, apply tensor calculus to create a multidimensional space with a dimension for each parameter, and then map training examples into this space. For graphical AIs like Stable Diffusion and Midjourney such spaces already involve billions of parameters for identifying shapes in visual worlds.
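
As a toy illustration of that ‘map examples into a space’ idea – nothing to do with how any of the systems named above actually work, and with feature values I’ve made up – each molecule becomes a vector of numeric properties, and similarity is just distance in that space:

# Toy 'parameter space' sketch: each molecule is a hand-made feature
# vector, and similarity is measured by cosine distance in that space.
# Real models learn their features and use far more dimensions.
import numpy as np

# features: [molecular weight / 100, ring count, has ester group, has thiol group]
examples = {
    "ethyl butyrate (fruity)":  np.array([1.16, 0.0, 1.0, 0.0]),
    "isoamyl acetate (banana)": np.array([1.30, 0.0, 1.0, 0.0]),
    "ethanethiol (foul)":       np.array([0.62, 0.0, 0.0, 1.0]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = np.array([1.02, 0.0, 1.0, 0.0])      # an unseen ester-like molecule
for name, vec in examples.items():
    print(f"{name:26s} similarity {cosine(query, vec):.2f}")
# the two esters score about 0.99, the thiol only about 0.38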

One immediate problem for applying this to smell is getting training data: odour receptors, whether human or animal, are hard to study, often won’t work outside the creature, are fragile, and release only minuscule amounts of protein. Two receptors from insects and two more from mice have been deciphered in the last year, leaving just 400-plus more to go. A team at Duke University in North Carolina is using AlphaFold and machine learning to screen millions of chemicals for binding to two synthetic receptors they’ve engineered. A very important motivation for such work is to use smell recognition in diagnostic medicine, by identifying odour molecules produced by disease processes (dogs are doing this already). Precisely how and where odour nerve signals are processed in the brain is perhaps the leading-edge question right now.

Real progress is being made and AI may soon speed it enormously, but smell remains the least understood of our senses, and least amenable to digital manipulation. It’s so subjective that human tasters and perfumiers will retain an advantage over automated solutions for far longer than most professions, and I don’t expect to have to consult my laptop when I’m mixing our own custom bath oil from my little box of tubes of neroli, rose, ylang-ylang and sandalwood oils (plus several other secret ingredients).  

[Dick Pountain believes that a rose by any other name would smell mostly of β-phenylethanol] 

ARTYFICIAL INTELLIGENCE

Dick Pountain /Idealog 361/ 08 Aug 2024 01:04

Who’d have thought that AI could become so boring so quickly? I feel a rather urgent obligation to devote a column to what’s been happening, but it appears to be happening faster than I can type. Last month NVidia’s stock rose, and then fell, faster than a SpaceX test shot. Then Amazon reported that its Kindle self-publishing platform has been flooded by such a torrent of AI-generated bodice-rippers that it’s having to impose a limit of only three books a day on its customers.

Because Amazon stops short of banning AI-generated content altogether, ChatGPT is rapidly becoming a cancer on the body of the publishing industry in more ways than just crap e-novels. When I’m reviewing a book I often look online at other people’s reviews, not to plagiarise them but to see other opinions. Recently though I’ve witnessed spammers using ChatGPT to spatter the net with worthless AI-generated paraphrases and summaries of best-selling real books, which make it almost impossible to find any proper critiques. And it’s not only words. The hand-crafted goods site Etsy recently told The Atlantic that it’s being swamped by AI-generated tee-shirts, mugs and other merchandise whose sellers employ ChatGPT to optimise their Google search rankings and crowd out real producers.

In a previous column I worried about the abuse of AI deep-faked photographs to compromise political opponents, but that’s turned out to have been somewhat wide of the mark because it requires a certain political seriousness on the part of the perpetrators. What’s happened instead is that Midjourney and its ilk are now enablers of pure fantasy and pop-surrealism. They allow anyone and everyone to produce memes and professional-looking posters that make merely spray-painting slogans on a wall feel like something from a previous century.

When I first tried out Stable Diffusion a year or so ago I was amused by the way its limitations generated such hilariously surreal images, but I’m not laughing now. The recent UK wave of far-right activity against immigrants and asylum seekers has been organised online via Telegram, The-Platform-Which-Used-To-Be-Called-Twitter, TikTok and other social media, and an important part of this rallying process is a new genre of surreal nationalistic propaganda memes. Popular content components of such productions are squadrons of Spitfires, St George’s Cross flags, knight-crusader figures in mediaeval armour and British Lions (often wearing tee-shirts but no trousers and playing cricket), all meant as symbols of that old Britain they believe has been stolen from us. GenAI tools enable them to churn out infinite combinations of these icons in glorious Marvel-comic colour and for minimal effort. It’s worse still in the USA, where these same tools are being used to depict Donald Trump with a six-pack and Hulk-like musculature, occasionally with the golden wings of an archangel and a blazing sword. Visual satire has a long history, from Cruikshank, Rowlandson and Gillray to George Grosz, Otto Dix and Ralph Steadman, but it always required a graphical skill that has now been entirely eliminated, and the purpose is no longer satire but adulation.

It feels as though we’re currently in the ‘phoney war’ phase of a marathon battle between states and AI companies over regulation of the internet. The AI side continues to bluster about transforming the world’s economy with soon-to-be-invented AGI, while also admitting that it will need about half of the world’s electricity supply to do so (and investing in fusion research). The state was until very recently clueless about the threat posed by widely available generative AI tools – though the wave of violence in the UK seems to have woken it up smartly – and it’s also chary about imposing content regulation, owing to quite legitimate concerns about freedom of speech. The owners of online content – publishers, television and film companies – form a third force standing on the sidelines watching the impending battle. They’re furious that the AI companies have already scraped a sizable chunk of their properties without payment, but also acutely aware that there might be a profit opportunity here somewhere (who wouldn’t like robot authors and actors that you don’t need to pay?).

As for us poor authors, artists, actors and other creators, we can only watch aghast, while some members of the general public perhaps see possibilities to gain quick entrance to the so-called creative world without the bother of arduously learning a skill (hence all those Amazon three-a-day novels). How this will all pan out is beyond anyone’s (even GPT-4.5’s) ability to predict. Too many variables, like who becomes US president in November, and too many hero-villains like Trump, Musk and Altman with hidden and volatile agendas. My guess is that the stock market might just call a halt to hostilities, and quite soon…

[Dick Pountain only uses ChatGPT as a party-trick to horrify arty friends]


Wednesday, 5 February 2025

FULL CIRCLE

Dick Pountain /Idealog 360/ 07 Jul 2024 11:12

Astute readers <aside class="smarm"> which of course to me means all of you </aside> will have noticed from various other features that this is the 30th Anniversary Issue of PC Pro, and since this is a monthly magazine, and since there are 12 months in a year, and since this is Idealog 360, the corollary is that I’ve been writing here since the beginning. I used the word ‘corollary’ there because it suggests a mathematical proof, and that is a ham-fisted way of introducing my theme for this month, which is mathematics.

360 is a special number to me not simply because it represents 30 years, but because when expressed as an angle in degrees it represents a full circle, a return to the beginning. Another way to look at a full circle is in radians as an angle of 2π which I find more congenial because π is an irrational, even transcendental, number and I like to think of this column as being sometimes irrational and occasionally even transcendental (which you astute readers may have noticed). 

What I’m tiptoeing around here (in this nauseatingly arch manner) is a confession, namely that I’m only posing as a computer nerd, that I’m actually a mathematician manqué, a maths sheep in hacker/wolf’s clothing. At school, way back in the early 1960s, maths was my top subject, in which I got a distinction at S level. I had to choose between reading chemistry and maths at uni but was seduced into the former by the lure of stinks and bangs over pencil and paper. My introduction to computing did come very early, in 1962, as part of a school team that built a prize-winning computer out of ex-RAF radar set parts, but that computer was analog, not digital, and all it could do was solve sixth-order differential equations and display the result as green squiggles on a cathode ray tube (which only real maths nerds could appreciate).

Math-nerdship never left me even once I discovered ‘real’ digital computers. At college I only ‘used’ London University’s Atlas to process the statistics for my biochemistry experiments. After Dennis Publishing (or H. Bunch Associates as we were called then) bought Personal Computer World in 1979, as the only maths nerd in the room I was delegated to take home a Commodore PET and learn how to program. I discovered that I loved it, but math-nerdship continued to steer my journey because after Basic I learned Pascal, Forth and Lisp, rather than C which would have been the obvious choice were I to want to make a living from coding (which I didn’t and don’t).   

Elsewhere in this issue you’ll find our nominations for the most important milestones in computing over the last 30 years, so rather than recap those here I’ll instead name a few of my favourite milestones in computer-related math-nerdship. Thanks to the internet everything is computer-related now, so I follow developments in maths through YouTube videos, Wikipedia articles, Royal Institution and TED talks, but most of all through the excellent, non-profit, Pulitzer-Prize-winning online magazine Quanta. 

Launched in 2012 to promote public understanding of mathematics, theoretical physics, theoretical computer science and the basic life sciences, Quanta is funded by, but editorially independent of, the Simons Foundation. James Simons was a mathematician, educated at MIT and Berkeley, who started out working on pattern recognition, string and quantum field theory, then went to Wall Street and used his maths as a ‘quant’ investor to become the 51st richest person in the world.

My favourite recent Quanta pieces have been one by Philip Ball [The New Math of How Large-Scale Order Emerges | Quanta Magazine] and one on Dedekind [How the Square Root of 2 Became a Number | Quanta Magazine]. Number theory is the part of maths that still entrances me. Irrational and transcendental numbers like π have infinitely many non-repeating digits after their decimal point, which makes them a little awkward to handle. Dedekind found a stunningly elegant way to pin them down, by splitting the number line into everything below and everything above the one you want.
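
The trick, for anyone who wants to see it work rather than just admire it, is that the lower half of the cut for √2 is simply every rational q with q < 0 or q² < 2 – no mention of √2 itself anywhere. A short Python sketch of my own (not from either article) that homes in on the cut using exact rational arithmetic:

# A Dedekind cut pins down sqrt(2) using only rationals: the lower set is
# every rational q with q < 0 or q*q < 2, the upper set is everything else.
# Bisecting with exact fractions squeezes the cut from both sides.
from fractions import Fraction

def in_lower_cut(q: Fraction) -> bool:
    return q < 0 or q * q < 2

lo, hi = Fraction(1), Fraction(2)      # 1 lies below the cut, 2 above it
for _ in range(30):
    mid = (lo + hi) / 2
    if in_lower_cut(mid):
        lo = mid
    else:
        hi = mid
print(float(lo), float(hi))            # both print 1.4142135...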

I still write programs – in QPython on my Chromebook – though nowadays they’re almost always about maths, playing with palindromic numbers or fiddling hopelessly with the Collatz Conjecture, or just solving a puzzle from Quanta magazine. I watch tons of YouTube videos that use clever visualisation tricks to explain p-adic numbers and their relation to the Riemann Hypothesis. The great thing about maths is that it doesn’t require a lot of apparatus, just a brain plus some sand and a stick, chalk and a blackboard or pencil and paper (or Python and a Chromebook). And there’s always a chance of being that amateur who makes a significant discovery…   
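
For the record, the Collatz rule itself fits in a couple of lines – which is precisely what makes fiddling with it so hopeless and so addictive. A throwaway sketch of the kind of thing I mean:

# The Collatz rule: halve an even number, send an odd n to 3n + 1; the
# conjecture is that every starting value eventually reaches 1.
def collatz_trajectory(n: int) -> list[int]:
    path = [n]
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        path.append(n)
    return path

print(len(collatz_trajectory(27)) - 1)   # 27 takes 111 steps to reach 1
print(max(collatz_trajectory(27)))       # peaking at 9232 along the way
# and the starting value below 10,000 with the longest trajectory:
print(max(range(2, 10_000), key=lambda n: len(collatz_trajectory(n))))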

[ Dick Pountain is quite satisfied with his slice of the π ]

AI EVERYWHERE?

Dick Pountain /Idealog 359/ 11 Jun 2024 09:48


A few weeks ago I ‘attended’ an interesting webinar organised by IT security firm Sophos in which one of their researchers, Ben Gelman, demonstrated how someone with no knowledge of programming or web design can construct a convincing e-commerce site complete with product pictures, audio, even video for around £4.23 in 8 minutes. He did this live on screen using GPT-4, Stable Diffusion and a couple of other free tools. Then he dropped the bombshell: by adding just a tiny bit of HTML, a couple of lines, he turned this into an agent-based framework capable of churning out hundreds or thousands of similar sites from simple text templates.

The web will soon be swamped by such sites, since it takes a lot longer than 8 minutes to take one down – or they may even disappear of their own accord once a quota of suckers has been hooked. It’s happening already: we’ve recently been ‘fleeced’ by items of clothing and pottery that bore little resemblance to the AI-augmented pictures on certain sites.

Last week we went to a restaurant we’ve enjoyed for years and suffered a revolting, shoddily prepared meal. I went to Tripadvisor and found hundreds of still-gushing five-star reviews, so on a hunch I filtered for one- and two-star reviews and found a few dozen accurate and mostly temperate complaints about drastic decline (‘What have they done to….’). Do all those five-stars mean most customers have no taste buds? Or were they paid for? Or are they AI-generated? The fact that I even ask that question speaks volumes…. Once it becomes known that you can use AI to prop up dodgy businesses, the technology has arrived in the mainstream.

On a completely different note, Apple just announced its deal with OpenAI to integrate ChatGPT into the operating system on all its devices, including Siri. Add to this similar projects from Meta and Google and it’s clear that it won’t be long before ‘AI-query’ becomes a commodity service on the same level as internet access and telecommunications, with transactions in the billions and trillions. Ideally they’d all merge into a single ‘information utility’, but of course that ideal is quite impossible to realise, for several very serious reasons.

Reason one is intellectual property. All those streams of content are owned by different, competing corporations whose only rationale is profit rather than public education. That hare has started running already with Scarlett Johansson’s legal complaint against OpenAI for illicit use of her voice. Reason two is that even were the AI vendors to get all the necessary permissions to use other people’s content to train their GPT models, that content is going to become polluted at an exponentially increasing rate by the gibberings of billions of dodgy websites created by their own customers.

Reason three is the killer, though. We’re all now aware of the colossal amounts of compute power needed to train and deploy GPT systems. Given present technology it’s quite impossible to train or fully query such a system on your local computer, phone or tablet (at the so-called ‘edge’), so these services will remain mostly cloud-based for the foreseeable future. Advances in analog AI processors and similar technologies can reduce telecommunication bandwidth requirements to some extent by enabling more processing at the edge, but cloud AI servers will still consume as much of that old-fashioned utility, electricity, as a medium-sized African nation.

This is all happening during a world-threatening climate crisis which most sane commentators agree requires us to find cleaner ways to generate electricity and equally importantly to use far less of it. AI companies are already starting to worry about  where all that electricity is going to come from and the Wall Street Journal recently reported that OpenAI is in talks to buy “vast quantities” of energy from the startup nuclear fusion company Helion, in which CEO Sam Altman has invested $375 million. Fusion power occupies an ontological niche rather like that of quantum computing, somewhere between hope and reality, real-soon-now-perhaps… 

And then there’s the question of how to pay for all this juice, which leads into the realm of blogger Cory Doctorow’s concept of ‘enshittification’. He summarises his caustic take on tech evolution thus:

“Here is how platforms die: First, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die.”

Read his argument at The ‘Enshittification’ of TikTok. Corporations like Apple and Amazon didn’t spend the original AI research money and hence have to pay for it now, either by buying the AI companies or by paying some kind of rent. The cost is so substantial that they must get it back from their customers. The days of freebie services are numbered.

[ Dick Pountain thinks it will all end in tears] 


