Tuesday 19 January 2021

END OF MOORE’S LAW (AGAIN)

 Dick Pountain/Idealog 312/15 Jul 2020 01:58

It’s now a good 11 years since I last wrote a column (Idealog 180) about the end of Moore’s Law, in which time the number of transistors on a chip must have grown at least 100-fold. As I said in that column (just as in the one four years earlier), predicting the end of Moore’s Law is a mug’s game. It’s a bit like predicting the Second Coming or the arrival of a Covid-19 vaccine. In a 1997 article for Byte I predicted that lithographic limits and quantum effects would flatten the curve below 100 nanometre feature size, and I was only off by one order of magnitude, which almost counts as a win in this futile race. Intel’s latest fabrication plant, built to produce chips with a minimum 10 nanometre feature size, was very late indeed and only started delivering chips in 2019, five years after the previous 14nm generation of chips.

However, in the last few months a chorus of highly qualified commentators has been declaring that this time it’s for real: high-performance computing pioneer Charles Leiserson of MIT has remarked that “Moore’s Law was always about the rate of progress, and we’re no longer on that rate.” It’s not just the physical limits on feature size I was writing about, but economics too. The cost of building a new fab has been rising 13% year-on-year and is headed north of $16 billion, at precisely the time that Donald Trump, great tech entrepreneur that he is, is calling for US companies to bring chip fabrication back home as part of his trade war with the Far East. Only Intel, AMD and Nvidia can even contemplate the next lower level of feature size (and Nvidia’s not all that sure).

Of course, reaching bottom in feature size doesn’t mean the end of all progress in computing power. The effect of Moore’s Law itself encourages software bloat - why bother writing efficient code if next year’s chip will speed up today’s crappy code? This is a problem waiting to be tackled: most of today’s commercial software could probably be sped up 100-fold by a decent rewrite. The trouble is that rewriting code is almost as expensive as fab-building.
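As a toy illustration of my own (not from any particular commercial codebase), here’s the kind of rewrite that recovers performance without any new silicon: the same membership count written lazily with a linear list scan, then rewritten around a hash set. On a few thousand items the second version is dramatically faster, and the gap only widens with size.

```python
import timeit

# Hypothetical sketch: the same membership count written two ways.
# The lazy version scans a list on every lookup, O(len(haystack)) per needle;
# the rewrite builds a hash set once, making each lookup O(1) on average.

def count_hits_slow(haystack, needles):
    return sum(1 for n in needles if n in haystack)  # linear scan per needle

def count_hits_fast(haystack, needles):
    lookup = set(haystack)                           # one-off indexing cost
    return sum(1 for n in needles if n in lookup)

data = list(range(5_000))
queries = list(range(0, 10_000, 2))                  # half hit, half miss

slow = timeit.timeit(lambda: count_hits_slow(data, queries), number=1)
fast = timeit.timeit(lambda: count_hits_fast(data, queries), number=1)
print(f"list scan {slow:.4f}s vs set lookup {fast:.4f}s")
```

The point of the sketch is that nothing about the problem changed - only the care taken in writing it - which is exactly the headroom a “decent rewrite” exploits.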

Parallelism looked like the solution for a long time, and it sort of was: even the cheapest mobile phones today use multi-core processors, and AMD is selling 16-core chips right now. The trouble is, the more cores you add to a chip, the more of the silicon real estate gets eaten up by interconnect, and, what’s worse, the model of parallelism employed by x86-family processors can’t be exploited by old software without a rewrite.
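A minimal sketch of that last point (mine, with made-up workloads): serial code gets no benefit at all from extra cores until someone restructures it to farm the work out, here via Python’s standard multiprocessing module.

```python
from multiprocessing import Pool

def work(n):
    # a CPU-bound task: sum of squares below n
    return sum(i * i for i in range(n))

def run_serial(tasks):
    # legacy code: one core does everything, however many cores the chip has
    return [work(n) for n in tasks]

def run_parallel(tasks, cores=4):
    # the rewrite: identical work farmed out across processes
    with Pool(cores) as pool:
        return pool.map(work, tasks)

if __name__ == "__main__":
    jobs = [100_000] * 8
    # same answers either way - but only the rewrite keeps multiple cores busy
    assert run_serial(jobs) == run_parallel(jobs)
```

The answers are identical; the difference is purely in how the work is dispatched - which is why multi-core chips do nothing for software nobody has paid to restructure.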

There are two hugely different groups of people who need the extra power of multiple cores: games vendors and AI developers. The former have the cash to rewrite their flagship games for each CPU generation. The latter need far more, and far more flexible, parallelism than these chips offer, and so are headed off along a path toward special-purpose processors (I wrote about some of their novel architectures in Idealog 301 after the 2019 CogX show). Such ‘IPUs’ can speed up the kind of massive matrix and convolution calculations performed during deep learning by several orders of magnitude - the problem is they can’t run Animal Crossing, or Google Chrome, or Microsoft Word. They are not general-purpose processors. They are however potentially incredibly lucrative, since they will eventually end up in every mobile phone, Alexa-style interface or self-driving vehicle. Venture capital is queuing up to invest in them just as mainstream processors begin to look less like a hot tip. Neil Thompson, an economist at MIT’s AI centre, has just written a paper called “The Decline of Computers as a General Purpose Technology” which gives you an idea of the drift.

Moore’s Law is underpinned by the scaling behaviour of CMOS fabrication technology, and it’s that scaling we’re approaching the end of. Professor Erica Fuchs, of the Department of Engineering and Public Policy at Carnegie Mellon University, worries that a successor technology with equally benign scaling properties - one that could maintain Moore’s Law for general-purpose chips - is as yet unknown, and may take years of basic research and development to find, with no guarantee of success. Candidates might include carbon nanotubes, graphene transistors, spintronics or even the dreaded qubits, but none of these is an obvious replacement for CMOS scaling. She calls for a huge boost in public research funding to replace all the venture capital being diverted into special-purpose AI chips. Unfortunately the colossal cost of the Covid-19 pandemic is likely to make that a very hard sell indeed, given that most politicians have little idea of what chips do at all, let alone the subtle distinctions between special- and general-purpose ones.

[Dick Pountain plans to leave it for 16 years before he writes another Moore’s Law column]

BAND OF BIG BROTHERS

 Dick Pountain/Idealog 311/9 Jun 2020 16:14:55

Have you spent time during lockdown worrying about what life will be like after this pandemic ends? For me personally lockdown itself hasn’t been a huge deal, as I’ve been working from home for the best part of 30 years, but I’m noticing other changes already. One example: this time last year I devoted two of these columns to an AI conference, CogX, which was held a short stroll down the canal from my home in King’s Cross. Well, I’m there again this year as I write this, but virtually, via live streaming. That works well enough - in fact it’s easier to hear the speakers and read their slides than it was sitting in a tent in the rain. But I can’t help noticing that there aren’t so many big hitters from US labs on the roster this time. Surely that couldn’t be because an invitation to sit on your sofa with a laptop isn’t as attractive as a free air trip across the Atlantic?


Actually the loss of air travel worries me less than it does most people too, because 12 years of to-and-fro from Italy had already squelched much of the romance of flying for me. What worries me more is our increasing reliance on online purchasing during a lockdown in which the only shops open were food shops. And it’s not just the way Amazon is taking over ever larger chunks of the retail sector: all of the US digital giant corporations are garnering profit and power from the global lockdown.


I’ve never been one to promote conspiracy theories myself, and as I said in last month’s column I’ve willingly placed myself under the tutelage of Google to a degree that would make some people nervous. But there’s no escaping the fact that Amazon, Google, Microsoft, Facebook, Uber and the like all have ambitions to provide services which in many parts of the world are either monopolised or heavily regulated by the nation-state. These corporations are all working hard to penetrate the healthcare, education, security and transport sectors, by providing innovative and ‘disruptive’ services that parallel what they’ve achieved in the retail, entertainment and social media sectors – and there’s no doubting that the extraordinary infrastructures and AI capabilities they’ve all invested in are indeed more efficient than many, even most, state equivalents. I’ve often wondered how the NHS would look if it had a platform of the efficiency of Amazon’s to connect patients, GPs and hospitals.


Trouble is, they remain unelected commercial enterprises with no commitment other than to their shareholders, and it’s unthinkable – even to extreme free-market libertarians like our current government – to give them such massive control over our economy. Another problem is that they notoriously evade paying fair taxes in the territories where they operate, thus depriving the public coffers of funds needed to compete with their services. There’s a potential solution to that, short of nationalising them or taxing them into retreat: governments with sufficient resolve could strike deals where these corporations pay a fairer tax whack, in part through partnerships that offer their platforms free to improve existing public services.


That however would require the state itself not to be evil, and seeing how some are already applying digital tech to integrate welfare, security and taxation systems isn’t encouraging. India’s Aadhaar system, Singapore’s ‘Smart Nation’ and Alibaba’s cooperation with Chinese local government to run their Social Credit system all present potential threats to personal liberty. There are questions over how well these systems actually work, but they certainly grant the state a scary degree of extra knowledge and power over citizens: participating in a demonstration, perhaps even voting for an opposition party, can be punished by loss of benefits.


While George Orwell was writing ‘1984’, the first modern 625-line television system was being introduced (in the Soviet Union, as it happens), and he foresaw the effect this new electronic medium would have on authoritarian societies. However, because TV is only a one-way channel, his picture of Big Brother’s regime was somewhat stunted. Had Orwell lived to see our two-way, social media internet, perhaps he might have concluded it had democratic potential? Perhaps he would, but we know better...


Deepfakes and disinformation are neutralising whatever democratic potential the net might have had (pioneered, as it happens, by those same Russians: Putin’s KGB background gives him a tech savvy that’s notably lacking among Western leaders). The virus keeps us all at home, staring into our screens, shopping, and wallowing in streams of false information. In 1988, Guy Debord nailed it in ‘Comments on the Society of the Spectacle’:

“Networks of promotion/control slide imperceptibly into networks of surveillance/disinformation. Formerly one only conspired against an established order. Today, conspiring in its favor is a new and flourishing profession.”


[Dick Pountain oft times quests through the deep, dark labyrinth of Amazon’s menus to slay a free Prime membership on its last day]


