Thursday, 2 April 2026

AUDIO DENTIST

Dick Pountain /Idealog 375/ 5th October 2025 : 09:53am 

I love music and I love soldering. Here is a story that has both. I’ve explained my opinions about hi-fi here several times: I like good sound quality but I’m not a hi-fi nut; I don’t buy oxygen-free cables or onyx cartridges or gold-plated anything (and definitely didn’t colour in the edges of my CDs with a green marker pen). I won’t spend thousands on any component, and I wrote here how my listening was transformed by connecting a cheap Fosi Class D Bluetooth amplifier to my vintage Castle speakers (legacy items from when Dennis Publishing used to publish Hi-Fi Choice magazine). 

All sound from my Chromebook Plus gets routed via Bluetooth through these speakers and I remain delighted by its quality, but a problem arose. I have to attend regular Zoom committee meetings, where we discovered that my Bluetooth remote arrangement was causing irritating echo effects for the other members, so I decided to resort to headphones. I purchased a set of moderately-priced JBL wireless headphones and that solved it, but it also created a slight annoyance for me because the dreaded Bluetooth wouldn’t automatically switch itself, so I had to manually disconnect the Fosi amp in Settings and select the JBLs.

Now with my other, sociological, hat on I’m a keen observer of the doings of GenZ youth, whose recent discoveries include a preference for vintage digital compact cameras over smartphones and for wired headphones over wireless ones. Aha, I thought, in my Santa’s-Grotto-of-retired-digital-artefacts upstairs (covered in a recent column) there must be some of those, and there were indeed two pairs – Sony MDR-V100s and Sennheiser HD 201s that I hadn’t used for decades. I tried them; both worked and sounded surprisingly good, but the Sennheisers sounded terrific. So terrific that I decided to do the pseudo-scientific thing, an A/B/C comparison with the Sony and the Bluetooth JBLs. The result shocked me: I far preferred the Sennheisers. The JBLs were louder and had more bass, but somehow they, and the Sony, were less ‘engaging’. 

Engagement is difficult to discuss without descending into woo-woo. The Sennheisers are very light (plastic and aluminium pressings) and oval in shape, so they completely encase my ears rather than resting on them. Their sound-stage is better balanced, with bass that’s not so pronounced but ‘right’. (As an aside, I personally judge sound reproduction by just two instruments, acoustic piano and double bass: for my A/B/C test I used the tracks ‘Mademoiselle Mabry’ by Miles Davis and Janáček’s ‘On An Overgrown Path’ played by Josef Páleníček). The Sennheisers induced what we used to call a ‘drugless trip’, where you engage so deeply that you feel you’re there.  

Problem was, those Sennheisers were made in 2005, the era of plastic-from-hell that turns into chewing gum with age. The outer insulation on their far-too-long cable was rotting and peeling, to reveal not the expected red or black plastic-covered conductors but a gleam of bare copper! I tried insulating tape, Gorilla tape, even silicon sealer, but ended up with an ugly, sticky mess. Wanting them so much emboldened me to replace the cable myself. Gemini taught me that those inner conductors are called ‘tinsel wire’, microscopically thin strips of copper coated with an insulating lacquer that you can’t just scrape off, spirally wound like guitar-strings around a central textile cord. Soldering them is a real challenge and reader, I took it! 

From Amazon I bought two short cables with female RCA plugs going to bare wire, along with a longer cable from 3.5mm jack to twin male RCA plugs. I snipped off the old cable and took out four tiny screws to disassemble each earpiece, only to find a single black box holding all the gubbins, from which emerge two horse-hair-thin tinsel wires lacquered red and blue. YouTube explained how you create a large blob of molten solder and plunge the wire into it: if the temperature is just right, that burns off the lacquer (with a puff of smoke) and adheres to and tins the copper; if the temperature is too high it burns off the copper too; too low and it doesn’t adhere. I chopped up the old cable to practise on, again and again, suffering only one painful burn, and then did the deed. 

Reassembling and finding they worked, I punched the air in triumph. I completed the job by discovering that my Chromebook’s two USB-C ports support high-def audio. Back to Amazon for a UGREEN adapter from USB-C to 3.5mm jack that contains a 32-bit/384kHz DAC, giving more volume and (perhaps) slightly better definition than the audio jack socket. The signal path from recording studio to my eardrums is long and convoluted, via TCP/IP, Wi-Fi and various ADC-DAC-and-backs, but at least I’ve managed to extract one irritating Bluetooth…
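
For the curious, the raw data rate implied by that DAC’s headline spec is easy to work out – back-of-envelope arithmetic, assuming plain uncompressed stereo PCM (spec sheets usually quote the per-channel figure):

```python
# Raw PCM data rate at the dongle's quoted 32-bit/384kHz spec.
# Stereo is an assumption here, not something the spec sheet states.
bits_per_sample = 32
sample_rate_hz = 384_000
channels = 2

bits_per_second = bits_per_sample * sample_rate_hz * channels
print(bits_per_second)  # 24576000, i.e. roughly 24.6 Mbit/s
```

Comfortably within even USB 2.0’s 480 Mbit/s, which is presumably how such dongles stay so small.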

[Dick Pountain doesn’t have ‘golden ears’ but he does have dry earwax]

LOVE AND HAIDT

Dick Pountain /Idealog 374/ 3rd September 2025 : 10:36am 

It becomes harder and harder to scrabble for grains of online pleasure, amusement or edification, but it remains possible (just) on YouTube. I particularly enjoy two grizzled performers: Rick Beato (whose interviews with musicians like Rick Rubin, Guthrie Trapp and Tom Bukovac are priceless) and Jon Stewart, the satirical political commentator whose late-night Daily Show kept many of us in stitches during the GW Bush presidency. Stewart attempted to retire in 2015 but he’s back, presumably lured by the grim shenanigans in the White House, presenting a new YouTube version of the Daily Show on Mondays and, on Thursdays, an in-depth podcast called The Weekly Show, where a recent guest was the social psychologist Jonathan Haidt.

Haidt is currently controversial for his best-selling book alleging that smartphone overuse is damaging young people’s mental health, but I know and admire his work from reviewing an earlier book called ‘The Righteous Mind’ (2012), an experimental study of the way people’s moral outlook affects their political behaviour. Haidt is a leading light of the ‘Intuitionist’ school of psychology, which holds that not all our behaviour is rational, and in particular that moral judgements like disgust are hard-wired to bypass the reasoning parts of the brain. (His thought-experiments to test this are highly amusing but unsuitable for a family magazine like this, involving incest and molesting chicken dinners). The reason I raise his work in this column is another book I just reviewed, Karen Hao’s ‘Empire Of AI’, an inside glimpse into the rise of OpenAI and ChatGPT. 

Hao documents three important facts about the company: unanimous agreement that AGI (Artificial General Intelligence) is possible and the only worthwhile goal; belief that AGI will be achieved by endless ‘scaling’, cramming thousands, then millions, of Nvidia GPUs into their servers; and a split, right from the very start, between those who think AGI will be great and those who think it will be deadly (many of whom left to set up Anthropic and Claude). Now I’ve stated here several times that I believe AGI is neither desirable nor possible, for reasons that depend upon the work of Haidt among others. If it’s not achievable we won’t face the worst of the many imagined harms, like enslavement by robots, but it does make the current monomaniacal hyperscaling futile, dangerous and horribly wasteful. 

Impressive, amusing and addictive as current LLMs and GPTs are, they fall far short of general intelligence because they’re not alive. Unlike Nvidia chips, living beings need food, safety and to reproduce themselves, and these imperatives structure our thought and behaviour profoundly. Billions of years of evolution equipped us with ‘emotions’, chemical computational sub-systems that detect and seek to satisfy needs. The US/Portuguese neuroscientist Antonio Damasio postulates that when we store a memory of an event it somehow gets imprinted with the emotional/hormonal state at the time (via biochemistry that is barely yet understood). When we retrieve it later to help understand some future event, these emotional markers act as weights (like parameters in an LLM) and contribute to the outcome of our decision. So images and words can never be entirely neutral: they carry subconscious emotional connotations of varying strengths. AI models lack needs and fragile bodies, and hence purpose. Actually smartphones, which can ‘see’ and ‘hear’, know their location and orientation and can travel the world in our pockets (so long as we remember to charge them), are way closer to human experience than ChatGPT is. Equipping one of those highly-capable Boston Dynamics robots with a fully autonomous AGI must remain science fiction so long as GPTs require aircraft-hangar-sized supercomputers and consume megawatts of electricity. Our own bodies have a mitochondrial ‘battery’ in every cell, enabling us to think and/or reproduce ourselves on around 2000 calories a day…

Cognitive psychologists and economists like Haidt and Kahneman have revealed that emotional modes of intuitive thought aren’t reducible either to symbolic logic or Turing computability, and that these mechanisms drive attraction and enmity, friendship or bias and prejudice. They underpin crucial affective human virtues like empathy, wisdom, justice, courage, honesty, compassion and generosity without which any aspiring AGI would merely be a sociopathic silicon solipsist. And most importantly, intuition is vital for creative reasoning, causing those unprecedented leaps between vastly differing conceptual spaces that make up the mind of a Newton, a Mendeleev or an Einstein. The training data for connectionist AI models contains only representations of mental states – text and pictures scraped from the internet – and what emotional weight it does aggregate is mostly bad news, a swamp of hateful and obscene human communication that costs the AI corporations big bucks to hire human beings to painstakingly disinfect, in procedures they call ‘alignment’ and RLHF (reinforcement learning from human feedback)...

[Dick Pountain likes ChatGPT, but in a purely platonic way]

Friday, 6 February 2026

COLLECTABLES

Dick Pountain /Idealog 373/ 08 Aug 2025 10:52  

I don’t really have the collector’s instinct. When I was a kid my father was a serious stamp collector and I briefly made a feeble effort to be one too. I was slightly more interested in my album of labels from exotic canned goods, but that petered out pretty soon too. I find that in adulthood I’ve accumulated nine guitars, but each of those was bought to play, then superseded but not sold, so it doesn’t really count as a collection. I have owned ten motorcycles over sixty or so years, but only ever one at a time. Books don’t count: I started accumulating those as a student and continued as a book reviewer, but all were obtained to read and never sold (there are around a thousand of them, none rare).

When I’m not writing about computers here I review books for a political journal, mainly ones about political economy and sociology. An author who had a big influence on me was the French sociologist Pierre Bourdieu, whose best-known book ‘Distinction: A Social Critique of the Judgment of Taste’ examined taste as an act of social status building, drawing on huge amounts of data gleaned from quantitative surveys, photographs and interviews. Two former associates of his, Luc Boltanski and Arnaud Esquerre, in 2014 published a paper called ‘The Economic Life of Things: Commodities, Collectibles, Assets’ which is the best, most interesting account of collectability I’ve seen. B&E describe the way that all the things we manufacture, purchase and use pass through three phases of ownership, which they label the ‘standard form’, the ‘collectible form’ and the ‘asset form’.

Consider for example a lemonade bottle from before WW1 with one of those stoppers that’s a caged glass marble. A mass-produced item, made as cheaply as possible, reusable and returnable for a penny. It served its purpose of containing and dispensing lemonade, then eventually got thrown in the bin (standard form). Years later someone found it on a tip and it ended up in an antique shop, sold for £15 to a middle-class couple as a kitchen ornament (collectible form). Their friend was a famous film director and it became a key prop in a very successful movie, and ended up sold for £5000 in a sale of memorabilia at Christie’s (asset form). An ancient Roman olive-oil jar might follow a comparable trajectory, but with prices several orders of magnitude higher. A drawing done on a napkin to pay for supper by a famous painter ditto, but its asset form might be in millions. In the asset form, things are no longer used, may often not even be displayed but stored in a vault, a hedge against inflation or financial crisis, a store of value.

What has this to do with computers, you may be wondering. Well nothing actually, and that’s the point. Computers, along with much of the rest of the merchandise of the digital world, seem to defy B&E’s classification scheme by being stuck forever in the standard form: they end up in a skip, then get ripped apart to recycle a few chips and some gold-plating. The aesthetic appearance and quality of workmanship of such goods is so low that very, very few people want to collect them, and what’s more those few who do face insurmountable problems in keeping them working, due to the rapid and haphazard evolution of firmware, software, ports and cables, storage media, the lack of effective documentation, and the rapid disappearance of smaller manufacturers prior to the monopoly era we now inhabit. 

If you detect a faint tinge of animus in that last paragraph you’re correct: it’s because I have a room upstairs full of digital junk accumulated over my 40+ years of computer journalism that I can’t get rid of (and which those nearest and dearest to me would love to see dumped in a skip to reclaim the room). I can’t bring myself to do that. Along with some quite notable historic hardware – first-gen IBM PC and Macintosh, Acorn Archimedes, NewBrain, Epson HX-20 – there are shelves full of software both famous and obscure that I have a hunch I may be the only person still to have, given my privileged status as recipient of review materials. The early history of the UK personal computer scene is sitting up there, and no-one appears to want it. I’ve tried all the various computer museums people have recommended, and none are interested in collecting the lot (I no longer have a car). All the reasons I mentioned above render it enormously hard for them to get this stuff working, and once they do it’s hardly entertaining. I do feel though that someone ought to document this history before skips claim it all. 

[Dick Pountain will hold onto the drawer containing every Psion Organiser]

TOO DARNED HOT

Dick Pountain /Idealog 372/ 07 Jul 2025 01:15

I’ve been watching the rebellious mood that’s growing among Microsoft Windows 11 users with a degree of (not very nice) complacent amusement, as someone who dumped Windows in favour of a Chromebook more than eight years ago. Actually my defection was as much an accident as an example of prescient wisdom. When Dennis Publishing moved to new offices, then-CEO James Tye took the quixotic decision to deploy Chromebooks to all; I took the opportunistic decision to borrow one and, being totally brassed-off with Windows 8.1, immediately became hooked. The Asus I bought for myself still works well and has been a source of great pleasure but for one problem – Google stopped supporting its version of ChromeOS with automatic updates about a year ago, and I started encountering apps that demanded an OS update I couldn’t procure. So I splashed out £229.99 on a new Asus CX3402 Chromebook Plus, which has a better screen, twice the memory, an 8-core Intel CPU and 10 years of guaranteed updates.

Migrating to the new machine was as arduous as usual: charge the battery, switch it on and wait 10 minutes for my online life to come down from the cloud. There were two extra chores though. Because in all matters digital I’m very far from being a trusting person, I also keep data I consider crucial on a local 128GB memory stick that lives permanently in one USB port, so I had to unplug that and plug it into the new machine. Another quirk of mine is that I don’t really like touchpad cursor control, and so use a Logitech wireless mouse whose dongle lives in another USB port.

I periodically back up the contents of the USB stick, a tiny metal-cased one from Integral, for which purpose I swap the Logitech dongle for another backup memory stick. A few weeks later I was doing such a backup when I noticed that the sticks had become very hot. Not warm but hot, hot enough to make me flinch on touching them, hot enough to worry. So I went online to the source of all wisdom which is Reddit, where I discovered that hundreds of people were reporting the same experience, not only with Chromebooks but with a variety of brands of stick, in all cases when using them in USB-C ports. The consensus was that it’s OK: it’s down to the huge capacities of the current generation of sticks and the poor ventilation of the smaller cases. I pretended to believe that and lived with it for a few more months, until I started to get unreliable behaviour from the toasting stick. First it started unmounting at random times, though it always came back after unplugging and replacing. Then during a backup session came ‘copy failed’ messages and directories going missing from listings, at which point I panicked. I dug out my original old Asus, where I confirmed that the contents of the stick were in fact intact, and that it didn’t get hot, and I carried out the backup there on the cool older USB ports. 

Something clearly had to be done because it’s become part of my work practice to keep this tiny, unknockoutable, USB stick permanently in place, and changing to some huge protruding one wasn’t acceptable. My first recourse was a heat-sink, cobbled together by wrapping the Integral stick tightly in aluminium cooking foil held in place with a 15mm binder clip with the handles detached. This worked, dissipating enough heat to reduce it to just warm to the touch, but it was too inelegant for me to live with. I couldn’t find any hard info online about which brands were most liable to overheat, but my own collection accumulated over the years revealed Sandisk, Patriot and Tab all got just as hot. As I was dolefully scrolling down the endless Amazon list of sticks, one from Samsung caught my eye because it looked nicer: the same shade of grey as my computer, short, fat and shiny. I ordered one and discovered that it’s just as fast, and barely gets warm… 

What moral to draw from this story I’m not really sure. I’m not a semiconductor engineer and can’t find an adequate explanation online from anyone who is, as to why such a huge discrepancy in performance exists between brands. Does the difference lie in the chips themselves, the design of the cases, the electrical interfaces or a problem in USB-C sockets? Such silence is deafening and disturbing when data loss is a distinct possibility. But of course the computer is becoming very much the poor cousin to the smartphone, for which such sticks are not relevant and for which professional standards of data hygiene are barely relevant either.    

[Dick Pountain still occasionally dreams that he’s trapped inside the Windows Registry]

Sunday, 30 November 2025

COLOR ME OLO

Dick Pountain /Idealog 371/ 10 Jun 2025 12:15

I’ve expressed my feelings about science fiction before many, many, probably too many, times in this column. A big fan in my 1960s teens, a bout of illness in the ‘70s let me binge all the greats – Vonnegut, Ballard, Le Guin, Dick, Pohl, Bester etc – overdosing so badly that I never wanted to read sci-fi again. Looking back now, as emotionally-retarded pseudo-intellectual sci-fi fans appear to be taking over the world, I think perhaps my immune system was telling me something. However this allergic reaction doesn’t apply to the closely related genre of fantasy (or gothic, or cosmic) horror. I can still cringe a little at M.P. Shiel’s ‘The Purple Cloud’, William Hope Hodgson’s ‘The House on the Borderland’, or the entire oeuvre of H.P. Lovecraft. 

One of Lovecraft’s stories, ‘The Colour Out Of Space’, struck me particularly hard. A meteorite lands in a tiny community in the New England woods, containing globules of a weird colour that isn’t in the solar spectrum: it has unpleasant effects that consume all living animals, plants and humans and turn them into grey ash. Several unsuccessful attempts have been made to film this story, hard work given that it’s not in the technicolor spectrum either: perhaps the nearest anyone has come is Alex Garland’s 2018 ‘Annihilation’ which clearly shows Lovecraftian influence and employs a digitally-produced shimmer in place of a new colour. I suppose Lovecraft’s story is a sort of parable about environmental destruction, but that’s not what explains its hold on me – I’ve always been fascinated by colour, studying its chemistry, reading up on all the various systems, appreciating great paintings and creating my own digital art as a favourite hobby.   

All of which explains my enormous excitement on reading in the Science Adviser newsletter about new research at the University of California, Berkeley, which ‘creates’ a new colour that lies outside the gamut humans can perceive, but which can be seen by shining a laser onto one’s retina (don’t try this at home, kids). First some background is required. You already know that our eyes, and hence also digital display devices, perceive or present colours as mixtures of the three components red (R), green (G) and blue (B). That’s because the retinas of our eyes contain three types of colour-sensitive cell called ‘cones’, sensitive to different wavelengths: long (L), medium (M) and short (S). Objects illuminated by natural sunlight stimulate L, M and S cones to different extents, which gives us the experience of different colours. Red light primarily stimulates L cones and blue light S cones, but since M cones respond to the middle of the range, overlapping with both L and S, there’s no component of sunlight that stimulates M alone. 
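
That overlap argument can be sketched numerically. The toy model below uses Gaussian curves as stand-ins for the real cone sensitivity functions – the peak wavelengths are roughly right, but the shapes and widths are invented for illustration – and shows that no single wavelength of ordinary light can excite the M cones in isolation:

```python
import math

# Toy Gaussian stand-ins for human cone sensitivities: (peak nm, width nm).
# Peaks are approximately right; the Gaussian shapes and widths are
# simplifying assumptions, not real colorimetric data.
CONES = {"S": (420.0, 40.0), "M": (534.0, 40.0), "L": (564.0, 40.0)}

def response(cone, wavelength_nm):
    peak, width = CONES[cone]
    return math.exp(-((wavelength_nm - peak) ** 2) / (2 * width ** 2))

# Scan the visible spectrum for the most 'M-only' single wavelength.
best_purity = 0.0
for nm in range(380, 701):
    l, m, s = (response(c, nm) for c in "LMS")
    best_purity = max(best_purity, m / (l + m + s))
```

In this toy model M’s share of the total response never rises much above 60%: the overlapping L and S curves always claim the rest, which is exactly the constraint the Berkeley cell-by-cell laser technique bypasses.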

The authors of the Berkeley paper (https://www.science.org/doi/10.1126/sciadv.adu1052) set out to investigate a new system for describing colour perception, by using a laser to illuminate retinal cells one at a time: 

“We introduce a principle, Oz, for displaying color imagery: directly controlling the human eye’s photoreceptor activity via cell-by-cell light delivery. Theoretically, novel colors are possible through bypassing the constraints set by the cone spectral sensitivities and activating M cone cells exclusively. In practice, we confirm a partial expansion of colorspace toward that theoretical ideal. Attempting to activate M cones exclusively is shown to elicit a color beyond the natural human gamut.”

Calling their new system Oz was of course a trigger for me, having cut my journalistic teeth on that notorious hippie journal where my articles were printed in every colour of the rainbow on a background of every other colour of the rainbow. But I digress. Their new colour, produced by stimulating M cells alone with a laser, they named ‘olo’. It can’t be reproduced in paint, ink or on screen, so you’ll only ever see it sitting in a dentist’s chair with a laser strapped to your head: 

“Subjects report that olo in our prototype system appears blue-green of unprecedented saturation, when viewed relative to a neutral gray background. Subjects find that they must desaturate olo by adding white light before they can achieve a color match with the closest monochromatic light.”

Thankfully olo has not so far shown any inclination to suck the life force out of living beings and reduce them to grey ash. The best approximation is a light turquoise, a colour you might glimpse fleetingly when watching a big wave break on a rocky headland in bright sunshine. Astronomers for a while believed that the whole universe, with all its light mixed together, would average out to a light turquoise, but that turned out to be a bug in their software: the accepted answer is now ‘cosmic latte’, a light beige (hex triplet #FFF8E7 in standard sRGB). 
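
As a footnote, that hex triplet decodes like any other sRGB colour (a trivial sketch):

```python
# Unpack the 'cosmic latte' hex triplet into its 8-bit sRGB components.
cosmic_latte = "#FFF8E7"
r, g, b = (int(cosmic_latte[i:i + 2], 16) for i in (1, 3, 5))
print(r, g, b)  # 255 248 231 - red and green near maximum, blue pulled down to beige
```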


[Dick Pountain wonders whether Olo could become ‘the new Pistachio’]


GOING NEURO

Dick Pountain /Idealog 370/ 05 May 2025 01:31

I’ve written many, many sceptical words about AI in this column over the years, railing against overconfidence and hype, the hubristic pursuit of AGI, deepfakery and content pillage, but nevertheless I do believe AI – once we’ve civilised it – is going to be hugely important to science, economics, robotics, control systems, transport and everyday life itself. Given the political will, misinformation, invasion of privacy and theft of artistic data can be regulated away, but there would remain one colossal stumbling block, namely energy consumption. 

When AI corporations consider purchasing mothballed nuclear reactors to power their compute-servers, the absurdity of AI’s current direction ought to be visible to everyone. The current generation of GPT-based AI systems depend on supercomputers that can execute quintillions of simple tensor arithmetic operations per second, to compare and combine multiple layers of vast matrices holding encoded parameters. Currently all this grunt is supplied using the same CMOS semiconductor process technologies that gave us the personal computer, the smartphone and especially the computer game – the Nvidia chips that drive most AI servers are descendants of ones originally developed for rendering real-time 3D games. The latest state-of-the-art GPUs have a watts/cm² power density around the same as an electric cooking hob, and the power consumption of AI server farms scales steeply, as the square of the number of devices employed (order O(N²) in the jargon of complexity theory). 
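
One plausible source of that superlinear scaling is communication: if every GPU must exchange data with every other one, the link count grows quadratically. A toy illustration – the cluster sizes are invented, not measurements of any real installation:

```python
# All-to-all communication: N devices need N*(N-1)/2 distinct links,
# so interconnect cost (and its share of the power bill) grows as O(N^2).
def all_to_all_links(n_gpus: int) -> int:
    return n_gpus * (n_gpus - 1) // 2

for n in (1_000, 2_000, 4_000):
    print(n, all_to_all_links(n))  # doubling N roughly quadruples the links
```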

In their 1980 bible of the CMOS revolution, ‘Introduction to VLSI Systems’, Mead and Conway devoted a final chapter to the thermodynamics of computation: we’ve long known that logic operations and memory accesses always consume energy, whether in silicon or in protein-and-salt-water like the human brain. However the human brain has far, far more neurons and synapses than even the largest current AI server farms have GPUs, yet consumes around 20 Watts as opposed to AI’s 50+ Megawatts. Understanding what’s responsible for this immense efficiency gap is crucial for creating a more sustainable next generation of AI, and the answer may lie in new architectures called ‘neuromorphic’ because they mimic biological neurons.
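
The gap is worth putting a number on, using the round figures above:

```python
# The brain-versus-server-farm efficiency gap, in round numbers.
brain_watts = 20
ai_farm_watts = 50_000_000  # the "50+ Megawatts" figure

ratio = ai_farm_watts // brain_watts
print(ratio)  # 2500000 - the farm burns 2.5 million brains' worth of power
```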

Individual metal-oxide-semiconductor transistors aren’t six orders of magnitude more power-hungry than biological neurons, so other factors must be responsible for the huge difference. One factor is that biological neurons are analog rather than digital; another is that they act upon data in the same place that they store it. In contrast the CMOS GPUs in AI servers are examples of von Neumann architecture, with processing logic separated from memory, and program code from data. But the MOSFET transistors they’re made from are inherently analog, operated by varying voltages and currents, so the digital data they manipulate gets continuously converted back and forth between the domains, at great energy cost. 

Neuromorphic AI hardware designers try to bring data and processing closer together. Intel has introduced its Loihi 2 research chip, with 128 neuromorphic cores and 33MB of on-chip SRAM, which communicates via trains of asynchronous voltage ‘spikes’ like those in biological neurons. Steve Furber (of ARM fame) works at Manchester University on a neuromorphic system called SpiNNaker that has tens of thousands of nodes, each with 18 ARM cores and memory, also using spike-based communication. These schemes do reduce data access overhead, but they remain digital devices, and to approach biological levels of energy economy will require a still more radical step into purely analog computation that exploits the physics of the chip material itself. 

The US firm Mythic’s AMP (Analog Matrix Processor) chip employs a 2D grid of tunable resistors whose values encode the weights of an AI model, then relies on Kirchhoff’s laws to, in effect, multiply-and-add the analog input voltages and perform convolutions. However AMP is still fabricated in CMOS. A more radical next step would be to implement this resistive analog computation using low-power ‘spintronic’ memristors – devices in which the orientation of magnetic spins represents bits, as in modern hard disks. One way to implement non-volatile memristors is with FTJs (Ferroelectric Tunnel Junctions), formed by sandwiching nano-thin magnet/insulator/magnet layers, which can be fabricated using existing semiconductor processing. These devices can be written to and switched cumulatively like real neurons, and read out non-destructively using very little power. 
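
The Kirchhoff trick is simple enough to simulate: by Ohm’s law each crossbar cell passes a current proportional to voltage times conductance, and currents meeting on a shared wire just add, so one output line computes a dot product ‘for free’. A minimal sketch with invented values (not any real Mythic AMP parameter; real hardware represents negative weights with differential resistor pairs):

```python
# One column of a resistive crossbar: conductances hold the weights,
# input voltages are the activations, and Kirchhoff's current law
# sums the per-cell currents I = V * G into a single output current.
def crossbar_column_current(voltages, conductances):
    return sum(v * g for v, g in zip(voltages, conductances))

weights = [0.5, -0.25, 1.0]  # stored as (signed, idealised) conductances
inputs = [1.0, 2.0, 3.0]     # applied as voltages

print(crossbar_column_current(inputs, weights))  # 3.0 == dot(inputs, weights)
```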

The Dutch physicist Johan Mentink used a recent Royal Institution lecture (https://youtu.be/VTKcsNrqdqA?si=ZRdxeyP4B-hfUw3X) to announce neuromorphic computing experiments in Holland that employ two-dimensional cross-bar grids of memristors, organised into a network of ‘Stochastic Ising Machines’ that propagates waves of asynchronous random noise whose interference yields the spike trains that transmit information. The Dutch researchers claim such devices can potentially be scaled linearly with the number of synaptic connections, reducing power consumption by factors of 1000s. I love the idea of working with rather than against noise, which certainly feels like what our brains might be doing… 
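
What a stochastic Ising machine does can be caricatured in a few lines: each spin is a probabilistic bit, flipped according to its local field plus thermal noise, and useful order emerges from the noise rather than despite it. Everything below (size, coupling, temperature) is invented for illustration, not taken from the Dutch experiments:

```python
import math
import random

random.seed(42)

N = 20       # number of spins (toy size)
J = 1.0      # ferromagnetic all-to-all coupling: spins 'want' to agree
beta = 5.0   # inverse temperature: fairly low noise

spins = [random.choice((-1, 1)) for _ in range(N)]

for _ in range(200):  # sweeps of noisy, local updates
    for i in range(N):
        # Local field from every other spin, normalised by system size.
        field = J * sum(spins[j] for j in range(N) if j != i) / N
        # Gibbs update: flip probability set by field strength versus noise.
        p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
        spins[i] = 1 if random.random() < p_up else -1

magnetization = abs(sum(spins)) / N
```

After a couple of hundred noisy sweeps the spins settle into near-unanimous agreement: random flips supply the exploration, and the couplings turn that noise into a stable collective answer.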


[Dick Pountain’s neurons are more spiky than most]


Tuesday, 14 October 2025

INTERESTING TIMES?

Dick Pountain /Idealog 369/ 07 Apr 2025 12:06 

Some 26 years ago (May 1999/ Idealog 58) I opened this column thus: “It's said there's an ancient Chinese curse, "May you live in interesting times!" Actually it's been said by me at least twice before….”  

Well, it looks like it’s time to trot it out for the fourth time, though nowadays I can check its truth using my old friend ChatGPT: “No record of this phrase (or anything close to it) exists in classical Chinese literature: the earliest known attribution is from British sources in the 20th century, a 1936 speech by Sir Austen Chamberlain (brother of Neville) who claimed to have heard it as a Chinese curse, but it seems more like a Western invention made to sound exotic or wise.” So, I’ve been peddling fake news for a quarter of a century, but I don’t feel too guilty as everyone seems to be doing it. 

The financial turmoil created by President Trump’s tariff barrage purports to be against a whole world ripping off the USA through unfair trade, but we all know it’s really about China. Many of us believe Trump has made a monumental blunder that will ultimately help China to economic dominance: The Wall Street Journal expressed it thus “What a fabulous change in fortunes for the Chinese leader. Mr. Trump has taken an ax to the economic cords that were binding the rest of the world into an economic and strategic bloc to rival Beijing – and at precisely the moment many countries finally were starting to re-evaluate their economic relationships with China.” 

I’ve covered semiconductor fab, Taiwan invasions and DeepSeek in recent columns so let’s not go there again, except to guess that Trump’s shenanigans could cause interest rate, futures and bond market chaos that may bring down the intricate house-of-cards finance of the AI bubble corporations (already under siege from IP lawyers). Instead I’d rather talk about how the ‘interesting’ times are affecting my own everyday activities. 

For many years my online presence, apart from my own website, consisted of Flickr for posting photos and Facebook for chatting/arguing/posing with friends. No longer, as my online self has been shattered into a dozen fragments, none of which have quite the same scope or satisfaction as before. Facebook started to deteriorate for me a couple of years ago as old friends left and un-asked-for content increased, but since Zuck did a Musk on it by removing moderation it’s becoming intolerable, and as a result I’ve begun to work on building a following on both BlueSky and SubStack. 

BlueSky is full of left-leaning refugees from the steaming pit that X has become: lots of excellent, sympa content, in fact too much to read it all and unanimous enough to risk boredom. I joined SubStack years ago hoping to get paid for some of my stuff but that didn’t work out so I forgot it until now, when it appears to be changing into something different. It’s becoming a social platform to rival Facebook, an alternative refuge for X-iters and I actually find it more interesting than BlueSky, but with one huge reservation - its structure and user interface remain totally baffling. Is it a mailing list or a website, a forum or… what?  Do I add posts or notes, and where will the comments arrive? My efforts in computer-generated music are now scattered among a host of platforms including SoundCloud, YouTube, NoisePit, BandCamp and GrungeDump (I may have invented one of those) and it remains stubbornly antiviral on all of them. YouTube is still my main source of entertainment, from genial luthiers to hilarious espresso gurus, Rick Beato’s music interviews to Jon Stewart’s Weekly Show. I even watch some movies there cheaper than other paid platforms (recently found ‘Lonesome Dove’ for free). 

The ‘interestingness’ seems to be spreading from online matters to offline. In recent months my Chromebook finally ran out of support (bought a new Asus CX340, cheap, way faster and nicer). BT announced that it was killing off my analog landline early, meaning a new hub, and that my mobile account should be moved to EE. While trying to surf such unwelcome disruptions several websites started playing up – I became adept at navigating poorly-implemented two-factor authentication schemes that trap you into endless loops of passcode tennis, and discovered a new game called ‘hunt the human’ while traversing the maze of AI chatbots that firms now erect in the name of Help…

Shall I end on a cheerful note, that things can only get better? It’s getting ever harder to believe that. Once the DOGE-days are over, assuming some kind of sanity is restored, then the craven way the big Silicon Valley corporations crowded onto Trump’s rattling gravy-train will haunt and taint them for years to come. 

[Dick Pountain pronounces DOGE as ‘doggie’, like that creepy Shiba Inu dog meme]