Friday, 23 July 2021

BOOKISH MATTERS

Dick Pountain/Idealog 315/11:48 Monday, October 5th, 2020

One of my lockdown activities has been extending my reading of the great Italian writer Italo Calvino, which included his If On A Winter's Night A Traveller, a serio-comic, postmodern novel about books. The narrator (perhaps also the reader) begins a novel that ends abruptly because it's been wrongly bound with duplicated sections. Returning it to the bookshop he meets a girl with a similarly defective copy, and together they set out to finish the story, which becomes ever more complicated. A bewildering sequence of partial novels by different (or not) authors about sex, war, spies, all totally gripping, none ever proceeding to an ending. It's not another book about the dreaded abstraction 'narrative', but about the material existence of books, which had been on my mind for two reasons.

Back in Idealog 227, 2013, I recounted my adventures trying to publish my book Sampling Reality – about the intersection of philosophy, information theory and neuroscience – as a Kindle book, using a then state-of-the-art conversion tool called Calibre. I was beset by problems, like all special characters displaying as '⎽', and suffered agonies trying to get the table of contents to work, trying every conceivable input format: docx, pdf, rtf, odf and more. I eventually got it accepted, and it’s since failed to disturb the best-sellers list, but at least it’s there.

Reflecting on this a few weeks ago, I’d decided perhaps to give the book away as a PDF when, almost immediately after this thought occurred to me, Kindle Direct Publishing emailed to announce that they’ve just launched a paperback print service in addition to ebooks. Too good to ignore: I set about downloading the new version of their production tool, Kindle Create. It’s available for PC and Mac only, so I installed it onto my Windows laptop, which has lain unused except as a print server for several years and has developed a sporadic problem involving a locust-like system process called ‘Runtime Broker’ that gobbles CPU and memory until I squash it in Task Manager.

I’d far rather have done the conversion on my Chromebook, but I soldiered on in semi-crippled Windows and it went better than Calibre had, give or take fixing the odd missing contents item. I uploaded the resulting .kcb file to KDP, only to be curtly told it wouldn’t fit into the default 6" x 9" page size. I’d never considered page size with my ebook, of course. Having seen enough of Task Manager, I went downstairs to my Chromebook, reformatted the original .docx in Google Docs, emailed it to myself and went back upstairs to re-do the Kindle Create conversion, which was now accepted. I made a new cover using Kindle Cover Creator, which went well, submitted it all, and 72 hours later was given a PDF proof and told that my paperback, now 191 pages long rather than 155, had been published. Result.

The second bookish adventure involved reviewing Thomas Piketty’s 1,093-page Capital And Ideology for The Political Quarterly. The review copy arrived: it’s the size of two house-bricks and, at 1.7kg, almost as heavy. My normal mode of reading, flat on my back on the sofa, was entirely out of the question, and even propping it up on the table was problematic – laid flat it won’t stay open at your page, propped up it slides downwards. I purchased a neat little lectern on Amazon, made in Germany from bamboo and chromed wire.

The book’s spine width and weight were too much for it until I bent the wires into a different shape, and then constructed an ingenious – if I say so myself – system of rubber bands and grommets to hold the pages open. After a week or so of this I contacted Belknap and asked whether they had a Kindle edition. They didn’t but their charming UK PR sent me a PDF instead. It’s also big, at 25Mb, and unusably slow in Adobe Reader, but fortunately I have a better PDF viewer, the marvellous Chrome extension PDF.js. A community-driven GitHub project, built with HTML5 and available free from the Web Store, it’s maintained by Rob Wu (PDF Viewer Chrome extension) who describes it thus: “Our goal is to create a general-purpose, web standards-based platform for parsing and rendering PDFs.”

And oh boy does it parse, searching the huge book fast enough to be my principal means of navigation, and its page zoom is far nicer than Adobe’s too. Over the months my review took I barely touched the printed book again. So what exactly is the postmodern moral of this story? Are e-books finished, are they still the future, or are they just yet another tool that has its place when applied to the right problems?

Tuesday, 9 March 2021

JUST A PLACEBO?

Dick Pountain /Idealog 314/ 06 Sep 2020 10:37


It feels as though the Covid pandemic has made virologists and epidemiologists of us all (except those who deny it’s happening, of course). Suddenly people are following vaccine development with sport-like attention and discussing the difference between cytokine and bradykinin storms at the bus-stop. Or perhaps that’s just my circle of hypervigilant friends…

This interest is of course highly practical, since returning to any semblance of normal life depends to a great extent on success in a vaccination program. But I sense there’s another dimension to it: it makes us feel better when we understand what’s happening to us, gives us a sense (perhaps illusory) of control. This effect of increased confidence can in some cases affect our real bodily functions, when it’s referred to as ‘the placebo effect’.

Medical science has only fairly recently started to take the placebo effect seriously, and its power appears more remarkable the more is discovered. The effect was dismissed for a long time for the very good reason that it appears to conflict with the central dogma that separates science from magic, namely that the mind cannot *directly* affect matter (without which we’d still be using Eye Of Newt instead of dexamethasone). That’s changing as we learn about the real material pathways that exist between software processes in the brain (that is, thoughts) and bodily processes. These pathways are mostly chemical rather than electrical, depending upon hormones and neurotransmitters distributed via the bloodstream. Incidentally, this is one more reason why the pursuit of Artificial Intelligence will remain stunted so long as it treats intelligence solely as a computational function of the brain, ignoring the intimate two-way communication between brain and the rest of the body’s organs.

Acceptance of the placebo effect grew with the pharmaceutical industry as it introduced drug trials and discovered that a placebo (from the Latin ‘to please’) – that is a fake pill, often just sugar – could sometimes produce an effect similar to the real drug. At first explanations were purely psychological, concerned with expectation: if you *expect* a pill to cure your headache then it might. The placebo effect was and remains a problem for drug trials, since untangling it from the real drug effect is difficult. There’s also an opposite, ‘nocebo effect’ where patients who are informed of possible side effects of a drug can experience or intensify them.

Psychological explanations raise the difficult question over whether such effects are ‘merely’ imaginary, or are physical, a question that also arises about illnesses that medicine suspects may be ‘psychosomatic’. The emphasis nowadays is shifting, not to discount psychological explanation entirely but to reveal how mind actually affects bodily systems. Placebo analgesia occurs when the mental expectation of relief stimulates the limbic system to release hormones called endorphins which behave like opiates. Expecting an antidepressant effect can release dopamine and thus improve mood, ditto for insulin and blood sugar or vasopressin and blood pressure. It’s even been demonstrated that patients can be conditioned, Pavlov-style: administer a drug in a drink with a distinctive taste, and if the drug is removed merely tasting the drink may produce its effect. In short, the placebo effect reveals yet another bodily system – like the immune, muscular and gastro-intestinal systems – that functions as a computational control system separate from and in parallel with the brain.

These placebo pathways can be trained and nurtured to an extraordinary extent, so that they become the basis of practices like yoga and acupuncture, or even the source of what would once have been called ‘miracles’. Wim Hof, a 60-year-old Dutchman, can immerse himself in ice water for 45 minutes and swim 50 yards beneath the ice of a frozen lake, by conditioning his breath control. Free divers train themselves into feats of breath-holding approaching 20 minutes. Similar feats of ‘mental’ pain relief are routinely reported both in wartime and among marathon runners. The Italian placebo scientist Fabrizio Benedetti gave weightlifters what he told them was a performance-enhancing drug, actually a placebo. He also secretly gave them lighter weights, which convinced them that the drug was working. When he surreptitiously restored the normal weights, the muscular force they were able to exert increased while their perceived fatigue remained the same.

I suspect there’s a lot more yet to discover about the placebo effect and its pathways, with enormous consequences not just for medicine but for sport and everyday life. There’s already evidence that it can even stimulate the immune system, though whether it could ever be trained to resist infections like coronavirus seems pretty unlikely at the moment. Such slim hopes are certainly no excuse either to slacken the effort to develop vaccines or to stop persuading anti-vaxxers that they need to accept them.

SHOW MUST GO ON

Dick Pountain /Idealog 313/ 17 Aug 2020 10:32


Last night I ‘attended’ a superb jazz session by the world-class musicians Wayne Shorter, Herbie Hancock, Dave Holland and Brian Blade. Admittedly it happened 16 years ago – in the Große Konzertscheune, Salzau, Germany – but I heard it almost as well, and saw it far better, than if I’d been there. My sofa is easier on the bum than most hall seats. It was a free YouTube video, in HD quality, which I Chromecast to my LG smart TV, which pushed the sound through my vintage hifi system (excellent though hardly audiophile: Denon amp, Castle speakers). The camerawork was exceptional, in the German manner, so I saw the players’ fingers on their instruments and facial expressions in a close-up never experienced at a live gig. I also avoided queuing for the cloakroom, and being surrounded by people eating steak and chips and chattering instead of listening. So was this virtual concert a satisfactory replacement for the real thing?


I won’t go all gushy about the excitement of travelling in anticipation, or about sharing an enthusiasm with other warm, breathing human beings (which was once true), but will instead focus on more pragmatic considerations. First off, those musicians got as good as they are through a lifetime of playing to live audiences in clubs all over America and Britain (Holland is English), being paid a pittance by tight-fisted promoters. Are kids coming up today via Logic-Pro-on-bedroom-laptop and social media going to develop similar or equivalent skills? Only time will tell, but many YouTube channels suggest perhaps not.


Secondly, can a viable music scene be maintained through payment for online performance? I didn’t pay for that Salzau video, and had YouTube been charging I probably wouldn’t have watched it, not knowing how good it was going to be. On the other hand I frequently pay £40+ a ticket to see acts at the South Bank, Jazz Cafe or Ronnie Scott’s. I don’t know what percentage of that gets to the musicians, but nor do I know what slice (if any) of YouTube’s ad revenue went to them for that video.


This applies even more so in the world of classical music. During lockdown in June I watched a week of excellent lunchtime concerts streamed from the Wigmore Hall, including a staggeringly fine ‘Winterreise’ by Mark Padmore and Mitsuko Uchida. The visibly empty seats brought home frighteningly just what the virus is doing to us. As regular attendees at the Wigmore we like to sit stage-side, for which we pay £12 to £20 a head. I didn’t pay that for all those streamed concerts, though I did make a one-off donation.

The brutal truth is that the psychology of paying for streamed entertainment is very different from paying for live entertainment. Rightly or wrongly, you are unlikely to pay as much to watch from your own sofa, providing your own refreshments, as you would to travel to a special event at a concert hall or club. And even the alternative ways to pay for online entertainment can be fraught, because of the distinction between pay-per-view and subscription.

Streaming has two huge advantages: instant access without travelling, and a vast repository of past performances. Instant access makes it possible to sample performances that you wouldn’t normally consider, and hence be educated and change your tastes – but only if it’s cheap enough that quitting ones you dislike after a minute or two doesn’t hurt too much. That was the difference between Spotify and Apple’s now defunct iTunes. I’m happy to pay £10/month for Spotify Premium, which I use every day, not just for listening to favourites while walking but for finding new music or researching all versions of some tune. I wouldn’t do any of that were I paying per track.


Movies and TV are more complicated. I don’t subscribe to Netflix, Prime, Hulu or the like, because they don’t have enough of what I like to justify another monthly bill. But I do buy or rent one-off showings of movies – for example hard-to-find oldies like ‘Tampopo’ or ‘Babette’s Feast’ – and I use BBC iPlayer and All4 to binge watch series (I’ve pigged out on all seasons of ‘Line Of Duty’, and all seven years of ‘30 Rock’).

If coronavirus changes the way we consume entertainment forever, market forces alone are unlikely to save ‘the talent’. The print publishing industry faced this problem for years over library lending, and they came up with PLR (Public Lending Rights) and ALCS (Authors' Licensing and Collecting Society) which collect royalties on behalf of authors fairly efficiently. I suspect similar institutions will need to be cobbled together to collect revenues from online service providers on behalf of musicians, and even perhaps starving Hollywood moguls (joke alert)...





Tuesday, 19 January 2021

END OF MOORE’S LAW (AGAIN)

 Dick Pountain/Idealog 312/15 Jul 2020 01:58

It’s now a good 11 years since I last wrote a column (Idealog 180) about the end of Moore’s Law, in which time the number of transistors on a chip must have grown at least 100-fold. As I said in that column (just as in the one four years earlier), predicting the end of Moore’s Law is a mug’s game. It’s a bit like predicting The Second Coming or the arrival of a Covid-19 vaccine. In a 1997 article for Byte I predicted that lithographic limits and quantum effects would flatten the curve below a 100-nanometre feature size, and I was only off by one order of magnitude, which almost counts as a win in this futile race. Intel’s latest fabrication plant, built to produce chips with a minimum 10-nanometre feature size, was very late indeed and only started delivering chips in 2019, five years after the previous 14nm generation of chips.
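For what it’s worth, that 100-fold figure is easy to sanity-check: the growth multiple is just 2 raised to (elapsed months ÷ doubling period), under either of the two common statements of Moore’s Law. A back-of-envelope sketch in Python:

```python
# Sanity-check the '100-fold in 11 years' claim under the two common
# statements of Moore's Law: transistor counts doubling every 18 months
# (the popular version) or every 24 months (Moore's own later revision).
years = 11

def growth(doubling_months):
    """Growth multiple after `years`, given a doubling period in months."""
    return 2 ** (years * 12 / doubling_months)

print(f"18-month doubling: {growth(18):.0f}x")   # about 161x
print(f"24-month doubling: {growth(24):.0f}x")   # about 45x
```

So "at least 100-fold" holds on the 18-month reading, though the slower 24-month version falls well short of it.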

However in the last few months a chorus of highly-qualified commentators have been declaring that this time it’s for real: high performance computing pioneer Charles Leiserson of MIT has remarked that “Moore’s Law was always about the rate of progress, and we’re no longer on that rate.” It’s not just those physical limits on feature size I was writing about, but economics too. The cost of building a new fab has been rising 13% year-on-year and is headed north of $16 billion, at precisely the time that Donald Trump, great tech entrepreneur that he is, is calling for US companies to bring chip fabrication back home as part of his trade-war on the Far East. Only Intel, AMD and Nvidia can even contemplate a next lower level of feature size (and Nvidia’s not all that sure).

Of course reaching bottom in feature size doesn’t mean the end of all progress in computing power. The effect of Moore’s Law itself encourages software bloat – why bother writing efficient code if next year’s chip will speed up today’s crappy code? This is a problem waiting to be tackled: most of today’s commercial software could probably be sped up 100-fold by a decent rewrite. But another problem is that rewriting code is almost as expensive as fab-building.

Parallelism looked like the solution for a long time, and it sort of was: even the cheapest mobile phones today use multi-core processors, and AMD is selling 16-core chips right now. The thing is, the more cores you add to a chip, the more of the silicon real estate gets eaten up by interconnect, and what’s worse, the model of parallelism employed for x86 family processors isn’t automatically exploitable by old software without a rewrite.

There are two hugely different groups of people who need the extra power of multiple cores: games vendors and AI developers. The former have the cash to rewrite their flagship games for each CPU generation. The latter need far more, and far more flexible, parallelism than these chips offer, and so are headed off along a path toward special-purpose processors (I wrote about some of their novel architectures in Idealog 301 after the 2019 CogX show). Such ‘IPUs’ can speed up the kind of massive matrix and convolution calculations performed during deep-learning by several orders of magnitude - the problem is they can’t run Animal Crossing, or Google Chrome, or Microsoft Word. They are not general-purpose processors. They are however potentially incredibly lucrative, since they will eventually end up in every mobile phone, Alexa-style interface or self-driving vehicle. Venture capital is queuing up to invest in them just as mainstream processors begin to look less like a hot tip. Neil Thompson, an economist at MIT’s AI centre, has just written a paper called “The Decline of Computers as a General Purpose Technology” which gives you an idea of the drift.

Moore’s Law is underpinned by the scaling behaviour of CMOS fabrication technology, and it’s that we’re approaching the end of. Professor Erica Fuchs, of the engineering and public policy department at Carnegie Mellon University, worries that a successor technology with equally benign scaling properties, one that could maintain Moore’s Law for general-purpose chips, is as yet unknown and may take years of basic research and development to find, with no guarantee of success. Candidates might include carbon nanotubes, graphene transistors, spintronics or even the dreaded qubits, but none of these is an obvious replacement for CMOS scaling. She calls for a huge boost in public research funding to replace all the venture capital that’s being diverted into special-purpose AI chips. Unfortunately the colossal cost of the Covid-19 pandemic is likely to make that a very hard sell indeed, given that most politicians have little idea of what chips do at all, let alone the subtle distinction between special- and general-purpose ones.

[Dick Pountain plans to leave it for 16 years before he writes another Moore’s Law column]

BAND OF BIG BROTHERS

 Dick Pountain/ Idealog 311/ 9th June 2020 16:14:55

Have you spent time during lockdown worrying about what life will be like after this pandemic ends? For me personally lockdown itself hasn’t been a huge deal, as I’ve been working from home for the best part of 30 years, but I’m noticing other changes already. One example: this time last year I devoted two of these columns to an AI conference, CogX, which was held a short stroll down the canal from my home in King’s Cross. Well, I’m there again this year as I write this, but virtually, via live streaming. That works well enough; in fact it’s easier to hear the speakers and read their slides than it was sitting in a tent in the rain. But I can’t help noticing that there aren’t so many big hitters from US labs on the roster this time. Surely that couldn’t be because an invitation to sit on your sofa with a laptop isn’t as attractive as a free air trip across the Atlantic?


Actually the loss of air travel worries me less than it does most people too, because 12 years of to-and-fro from Italy had already squelched much of the romance of flying for me. What worries me more is our increasing reliance on online purchasing, while the only shops open were food shops. And it’s not just the way Amazon is taking over ever larger chunks of the retail sector; it’s that all of the US digital giant corporations are garnering profit and power from the global lockdown.


I’ve never been one to promote conspiracy theories myself, and as I said in last month’s column I’ve willingly placed myself under the tutelage of Google to a degree that would make some people nervous. But there’s no escaping the fact that Amazon, Google, Microsoft, Facebook, Uber and the like all have ambitions to provide services which in many parts of the world are either monopolised or heavily regulated by the nation-state. These corporations are all working hard to penetrate the healthcare, education, security and transport sectors, by providing innovative and ‘disruptive’ services that parallel what they’ve achieved in the retail, entertainment and social media sectors – and there’s no doubting that the extraordinary infrastructures and AI capabilities they’ve all invested in are indeed more efficient than many, even most state equivalents. I’ve often wondered how the NHS would look if it had a platform of the efficiency of Amazon’s to connect patients, GPs and hospitals.


Trouble is, they remain unelected commercial enterprises that have no commitment other than to their shareholders, and it’s unthinkable – even to extreme free-market libertarians like our current government – to give them such massive control over our economy. Another problem is that they notoriously evade paying fair taxes in the territories where they operate, thus depriving the public coffers of funds needed to compete with their services. There’s a potential solution to that, short of nationalising them or taxing them into retreat: governments with sufficient resolve could strike deals whereby these corporations pay a fairer tax whack, in part through partnerships that offer their platforms free to improve existing public services.


That however would require the state itself not to be evil, and seeing how some are already applying digital tech to integrate welfare, security and taxation systems isn’t encouraging. India’s Aadhaar system, Singapore’s ‘Smart Nation’ and Alibaba’s cooperation with Chinese local government to run their Social Credit system all present potential threats to personal liberty. There are questions over how well these systems actually work, but they certainly grant the state a scary degree of extra knowledge and power over citizens: participating in a demonstration, perhaps even voting for an opposition party, can be punished by loss of benefits.


While George Orwell was writing ‘1984’ the first modern 625-line television system was being introduced (in the Soviet Union, as it happens), and he foresaw the effect this new electronic medium would have on authoritarian societies. However, because TV is only a one-way channel, his picture of Big Brother’s regime was somewhat stunted. Perhaps had Orwell lived to see our two-way social-media internet, he might have concluded it had democratic potential? Perhaps he would, but we know better...


Deepfakes and disinformation are neutralising whatever democratic potential the net might have had (pioneered as it happens by those same Russians: Putin’s KGB background gives him a tech savvy that’s notably lacking among Western leaders). The virus keeps us all at home, staring into our screens, shopping, and wallowing in streams of false information. In 1988, Guy Debord nailed it in ‘Comments on the Society of the Spectacle’:

“Networks of promotion/control slide imperceptibly into networks of surveillance/disinformation. Formerly one only conspired against an established order. Today, conspiring in its favor is a new and flourishing profession.”


[Dick Pountain oft times quests through the deep, dark labyrinth of Amazon’s menus to slay a free Prime membership on its last day]



Sunday, 4 October 2020

HEAD IN THE CLOUDS

 Dick Pountain/Idealog 310/13:53 9 May 2020

When a couple of issues ago Editor Tim asked for tips for newly-working-at-home readers, mine was 'buy a Chromebook', which forced me to face up to how far I've drifted from the original Personal Computer Revolution. That was about everyone having their own CPU and their own data, but I've sold my soul to Google and I can't say I miss it. When first turned on, my Asus C301 Chromebook sucked all my personal data down automatically within five minutes, because it was all in Google Keep or on Google Drive. I do still have a Lenovo laptop but rarely use it, except via those same Google apps, and I don't miss the excitement of Windows updates one bit.  

My love for Google Keep isn't a secret to readers of this column, and it only grows stronger as new features like flawless voice dictation and pen annotations get added. Remember, I'm someone who spent 30+ years looking for a viable free-form database to hold all the research data - magazine articles, pictures, diagrams, books, papers, web pages, links - that my work makes me accumulate. The task proved beyond any of the database products I tried, with Idealist, AskSam and the Firefox add-on Scrapbook lasting longer than most. Those with long memories might remember how Microsoft promised to put the retrieval abilities I need right into Windows itself, via an object-oriented file system that they eventually chickened out of.

Keep's combination of categories, labels, colour coding and free-text search gives me the flexible retrieval system I've been seeking, though it still isn't quite enough on its own: while it can hold pictures and clickable links, they're not as convenient as actual web pages. For a couple of decades I religiously bookmarked web pages, until my bookmarks tree structure became just as unwieldy as my on-disk folders. Nowadays I just save pages to Pocket, which is by far the most useful gadget I have after Keep. A single click on Pocket's icon on the Chrome toolbar grabs a page, fully formatted, complete with pictures and a button to go to the original if needed, making bookmarks redundant. I use the free version, which supports tags similar to Keep's labels, but there's a paid-for Premium version with a raft of extra archival features for professional use. And like Keep, Pocket is cross-platform, so I can see my page library from Windows or a phone.

Does the cloud make stuff easier to find? Within reason, yes. Save too many pages to Pocket and, as with bookmarks, you've merely shifted the complexity rather than removing it. Sometimes I fail to save something that didn't feel important at the time, then discover months later that it was, and Chrome's history function comes in handy then. I use it most to re-open recent tabs closed by mistake (I have an itchy trigger-finger), but by going to https://myactivity.google.com/ I can review searches years into the past, if I can remember at least one key word. Failing that, it's plain Google Search or the Internet Archive's Wayback Machine, recently released as a Chrome extension.

My music nowadays comes entirely from Spotify, end of. My own photographs remain the main problem. I take thousands and store them both in the cloud and on local hard disk, organised by camera (eg. Sony A58, Minolta, Lumix), then location (eg. Park, Italy, Scotland). I've tried those dedicated photo databases that organise by date, but find them of very little help: place reminds me far more effectively than time. My best pictures still go onto Flickr, tagged very thoroughly to exploit its rather superior search functions (it can even search by dominant colour!). Pictures I rate less Flickr-worthy I sometimes put on Facebook in themed albums, which also helps to find them. The technology does now exist to search by image-matching, but that's mostly used by pros who need to spot theft or plagiarism. I can only express what I'm looking for in words, like 'Pip fixing the Gardner diesel engine'.

What's required is a deep-AI analysis tool that can facially identify humans from their mugshots in my Contacts, recognise objects like tables, chairs or engines, OCR any text in a picture (like 'Gardner' embossed on a cylinder block) and then output its findings as searchable text tags. It wouldn't surprise me if some Google lab is working on it. I do realise that were Google to go bust, or the internet to shut down, I'd be stuck with local data again, but if things get that bad then foraging for rats and cats to eat will probably be a higher priority. Right now my tip would still be: keep your feet on the ground and your data in the cloud(s)...

Saturday, 12 September 2020

A BATTLE OF OLOGIES

 Dick Pountain/ Idealog 309/ 4th April 2020 08:01:58


Stewart Brand’s famous epigram “Information wants to be free” has been our collective motto for the last three decades, but few of us remember that it wasn’t an isolated phrase and was accompanied by, for example, “The right information in the right place just changes your life”. During our current nightmare we’re learning that here in the world of matter many other things want to be free that shouldn’t, like serial killers and mosquitoes and viruses, and that controlling information about them has become critical.


Across the world governments and their health organisations are trying to cope with the COVID-19 pandemic but discovering that in the age of TV news and social media it’s impossible to hide anything. We’re witnessing a war develop between two ‘ologies’, epidemiology and social psychology. The coronavirus has very particular characteristics that seem designed by some malevolent deity to test the mettle of pampered citizens of post-modern information societies. There’s a malign cascade of statistical percentages. Some experts estimate that if we do nothing, around 60% of the world population would catch it – roughly the odds of a coin toss, Bad News. But if you do catch it 80% will experience nothing worse than a cold – Good News. But of the 20% who do have worse symptoms, anywhere from 1 to 5% will die, roughly the odds of Russian Roulette – Bad, Bad News. The virus is highly contagious and thus prone to spread exponentially, but not so lethal as to be self-limiting like Ebola or Marburg.
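Multiplying that cascade out makes the compound odds concrete. This is a back-of-envelope sketch using only the illustrative percentages above, not real epidemiological estimates:

```python
# Compound the column's illustrative percentages into an unconditional risk.
# All three figures are the article's rough numbers, not measured data.
p_catch = 0.60                        # chance of catching it if nothing is done
p_severe = 0.20                       # chance of worse-than-a-cold symptoms, given infection
p_die_low, p_die_high = 0.01, 0.05    # fatality range, given severe symptoms

def overall_risk(p_fatal):
    """Unconditional chance of dying: catch it AND get severe AND die."""
    return p_catch * p_severe * p_fatal

low, high = overall_risk(p_die_low), overall_risk(p_die_high)
print(f"overall risk of death: {low:.2%} to {high:.2%}")
```

On those figures the unconditional risk works out at between roughly 0.1% and 0.6% – small for any individual, yet catastrophic multiplied across a whole population.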


These facts have turned two issues into political dynamite, namely ‘herd immunity’ and virus testing. An exponential virus epidemic spreads rather like a nuclear fission chain reaction, and the way to control it is the same – by introducing a moderator that can reduce the flow of neutrons or virus particles between successive targets. In a viral pandemic herd immunity – that is, many people getting it, surviving and becoming immune – is the best such moderator, and is what happens most often. An effective vaccine is a catalyst that spreads such immunity more quickly and reliably. The problem is that unlike uranium atoms, human beings have minds, and the effect on those minds is nowadays more important than cold percentages.
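The moderator analogy can be sketched numerically: immunity shrinks the pool of susceptible targets, scaling down the number of new infections each case produces, and spread stalls once that effective number drops below one. The reproduction number of 3 here is an assumed illustrative value, not a measured figure for this virus:

```python
# Sketch of the 'moderator' analogy: an outbreak grows while each case
# infects more than one new person on average. Immunity scales the basic
# reproduction number R0 by the fraction still susceptible.
R0 = 3.0   # assumed illustrative value

def r_effective(immune_fraction):
    """New infections per case, given the fraction already immune."""
    return R0 * (1 - immune_fraction)

# Herd-immunity threshold: the immune fraction at which R_eff falls to 1.
threshold = 1 - 1 / R0
print(f"spread stalls once {threshold:.0%} are immune "
      f"(R_eff = {r_effective(threshold):.2f})")
```

With R0 = 3 the threshold is two-thirds of the population, which is why slowing transmission (reducing the effective R directly) matters so much while immunity accumulates.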


The measures that are being taken by most governments are self-quarantine and social distancing (avoiding contact with other people) which act to moderate the rate of spread, in order to avoid swamping health systems with too many critical cases at once. In most countries these measures depend upon the voluntary cooperation of citizens. There are already tests for the presence of live virus, and there will soon be reliable tests for antibodies in survivors, but there’s controversy over how widely to apply them.


Epidemiologists need good data to check whether isolation measures are working, to get an accurate picture of the lethality rate, to model the spread and calculate how best to allocate finite palliative care resources. And, as American professor Zeynep Tufekci points out in The Atlantic magazine (https://www.theatlantic.com/technology/archive/2020/04/coronavirus-models-arent-supposed-be-right/609271/), whenever a government acts upon the recommendations of the modellers, those actions change the model.


But citizens would very much like to know whether they have caught the virus and need urgent treatment, or have had the mild form and are immune. It would technically be possible to test the whole population, but it’s neither economically nor politically sensible. The cost would be enormous; it would conflict with the principle of isolation if people had to travel to test centres, yet be impractical if testing vans had to visit every cottage in the Hebrides. Also, no tests are perfect, and both false positives and false negatives could have unpleasant consequences. So mass testing isn’t feasible and would be a poor use of scarce resources – but even if it were possible it might still be counterproductive. Once everyone who’s had mild COVID-19 and achieved ‘herd immunity’ knows that for sure, their incentive to continue with isolation might fade away, and worse still they might come to resent those who require it to be continued.
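Why imperfect tests bite so hard can be shown with a quick Bayes'-rule sketch. The sensitivity, specificity and prevalence figures below are illustrative assumptions, not properties of any real test:

```python
# Bayes'-rule sketch of why imperfect tests mislead at low prevalence.
# All three parameters are illustrative assumptions.
sensitivity = 0.95    # P(test positive | infected)
specificity = 0.95    # P(test negative | not infected)
prevalence = 0.02     # fraction of the tested population actually infected

def positive_predictive_value(sens, spec, prev):
    """P(infected | positive result): true positives over all positives."""
    true_pos = sens * prev
    false_pos = (1 - spec) * (1 - prev)
    return true_pos / (true_pos + false_pos)

ppv = positive_predictive_value(sensitivity, specificity, prevalence)
print(f"chance a positive result is a real infection: {ppv:.0%}")
```

With these numbers, fewer than a third of positive results would be true infections: at low prevalence the false positives from the healthy majority swamp the true positives, which is one reason indiscriminate mass testing can do more harm than good.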


Masks are another psychological issue: medical opinion is that cheap ones aren’t effective and that good ones are only needed by those who deal directly with infected patients. But wearing even a cheap, ineffective mask makes a social statement: actually two statements, “I care about me” and “I care about you” (which predominates in each case becomes obvious from other body language).


Perhaps the best we can conclude is that total freedom of information isn’t always a good thing in emergencies like this, but that social media make it hard to avoid. We’re slipping into the realm of Game Theory, not epidemiology.

