Tuesday 31 January 2023

COMMONPLACE

Dick Pountain /Idealog 336/ 05 Jul 2022 12:30


A ‘commonplace book’, according to my dictionary, is “a book into which notable extracts from other works are copied for personal use.” My late friend and long-term associate Felix Dennis used to keep one, a largish leather-bound album that went with him on all his travels, filled with clippings, photos and hand-written notes of whatever happened to capture his attention (often poems). I myself was never inclined to note-taking in the era of paper – I’ve never kept a diary and only started to collect common (and not so common) places after the arrival of the personal computer. When I did start, it was back in the clunky, pre-Windows days, thanks to a nifty utility from Borland called ‘Sidekick’, which popped up a window over any other DOS application so one could dash off a quick note.

Once in Windows-land I sought out ever more capable free-form text databases – Idealist, AskSam, Threadz Organiser, Bonsai and half a dozen more – to soak up my inspirations about work-in-progress and enable me to find them again later. I stuck with each for a couple of years, but as Windows’ hierarchical file system gradually became more capable, especially for indexed search, I found myself relying ever less on databases and their pesky file formats. Then the rise of The Web changed everything, and not merely because I could store stuff in The Cloud but because more and more of my inspirations came from The Cloud.

At first it was search engines like Lycos and AltaVista, then the all-conquering Google, but retaining search results meant links and bookmarks, which had to be stored and organised themselves. Cue a fierce competition between Chrome, Firefox, Explorer and Opera for the best hierarchical bookmark manager. There were apps that stored web pages locally, but only as hideous subdirectories holding scores of irritating HTML files. That all changed with Pocket, which stores links to wanted web pages in its own cloud, almost transparently, at the click of a single icon. Pocket remains the core of my data hoard even after I abandoned Windows for a Chromebook. I still send articles there while I’m reading them if they contain matter I might need later, which means it ends up storing far more than I actually use (I do purge my Pocket list occasionally, and its own tagging and search facilities are good). But nowadays I also download really vital articles immediately into a local folder for the project they’re aimed at (for example these columns). When I say ‘download’, what I actually mean is ‘print to PDF’, which captures an article with all its pictures and formatting intact.

I don’t worry about the space textual ‘commonplaces’ occupy, now that 128 gig USB sticks are so cheap, but pictures are a more worrying matter. Photo organising for pictures that I take myself – whether local, or online like Flickr and Google Photos – isn’t the problem. The problem is pictures I’ve just grabbed from the ‘net because they tickled me, which I’m too idle to categorise, tag or even name properly so they can be easily found again. That said, most of what I do nowadays (including this column) doesn’t involve any pictures.

I review books for a political journal, the sort of books that sometimes require me to quote extensively from the text, and I discovered quite a while ago that asking for a Kindle or PDF edition of the review book is a great advantage, just for the searchability. In a huge tome like Thomas Piketty’s ‘Capital And Ideology’, being able to search and bookmark is a life-saver. At first Amazon was less than helpful in the matter of extracting book content and annotations: for example I found that notes made on my hardware Kindle or Android tablet could only be cut-and-pasted from within the PC version of the Kindle reader, which I no longer use.

Thankfully that’s all changed, and in a most helpful way. When I review books in more recent Kindle readers, in addition to searching and bookmarking pages I can highlight passages in four different colours, attach notes to them, copy them to the clipboard and even report typos and other errors back to the publisher. And this annotating activity gets automatically saved as a ‘Notebook’ which I can export either as plain text or in one of three approved academic footnote formats, and share wirelessly between my devices (Chromebook, phone or Galaxy Tab). Viewing a Notebook filtered by highlight colour can in effect turn it into a full-text database of colour-coded categories, just like those databases I played with in the bad old days. It can even export notes and annotations as a deck of flashcards, which I could one day use to perform acts of mass stupefaction on an unsuspecting audience… 


[Dick Pountain can recite the ‘fair use’ rules in his sleep (and sometimes does)]


AI, PROTEIN AND PASTA

Dick Pountain /Idealog 335/ 01 Jun 2022 01:55


I’ve devoted a lot of space in previous columns to questioning the inflated claims made by the most enthusiastic proponents of AI, the latest of which is that we’re on the brink of achieving ‘artificial general intelligence’ on a par with our own organic variety. I’m not against the attempt itself because I’m just as fascinated by technology as any AI researcher. Rather I’m against a grossly oversimplified view of the complexity of the human brain, and the belief that it’s comparable in any useful sense to a digital computer. And there are encouraging signs in recent years that some leading-edge AI people are coming to similar conclusions.  

There’s no denying the remarkable results that deep neural networks based on layers of silicon ‘neurons’ are now achieving when trained on vast data sets: the most impressive to me is DeepMind’s cracking of the protein folding problem. However, a group at the Hebrew University of Jerusalem recently performed an experiment in which they trained such a deep net to emulate the activity of a single (simulated) biological neuron, and their astonishing conclusion was that a single neuron has the same computational complexity as a whole 5-to-8-layer network. Forget the idea that neurons are like bits, bytes or words: each one performs the work of a whole network. The complexity of the whole brain suddenly explodes exponentially: another research group has estimated that the information capacity of a single human brain roughly equals all the data generated in the world over a year.

On the biological front, recent research hints at how this massive disparity arises. It’s been assumed so far that the structure of the neuron has been largely conserved by evolution for millennia, but this now appears not to be entirely true. Human neurons are found to have an order of magnitude fewer ion channels (the cell components that permit sodium and potassium ions to trigger nerve action) than other animals, including our closest primate relatives. This, along with extra myelin sheathing, enables our brain architectures to be far more energy efficient, employing longer-range connectivity that both localises and shares processing in ways that avoid excessive global levels of activation.

Another remarkable discovery is that the ARC gene (which codes for Activity-Regulated Cytoskeleton-associated protein), present in all nerve synapses, plays a crucial role in learning and memory formation. So there’s another, previously unsuspected, chemically-mediated regulatory and communication network between neurons in addition to the well-known hormonal one. It’s thought that ARC is especially active during infancy when the plastic brain is wiring itself up.

In other experiments, scanning rat brains shows that activity occurs throughout most of the brain during the laying down of a single memory, so memory formation is not confined to any one area like the hippocampus. Other work, on learned fear responses, demonstrates that repeated fearful experiences don’t merely lay down bad memories but permanently up-regulate the activity of the whole amygdala to make the creature temperamentally more fearful. In short, imagining the brain (or even the individual neuron) as a simple computer is hopelessly inadequate: rather it’s an internet-of-internets of sensors, signal processors, calculators and motors, capable not only of programming itself, but also of designing and modifying its own architecture on the fly. And just to rub it in, much of its activity is more like analog than digital computing.

The fatal weakness of digital/silicon deep learning networks is the gargantuan amount of arithmetic, and hence energy, consumed during training, which, as I’ve mentioned in a recent column, leads some AI chip designers toward hybrid digital/analog architectures in which the physical properties of a circuit, like current and resistance, perform the additions and multiplications on data in situ at great speed. However, the real energy hog in deep learning networks is the ‘back-propagation’ algorithm used to teach new examples, which imposes enormous repetition of the computational load.

A more radical line of research is looking outside electronics altogether, toward other physical media whose properties in effect bypass the need for back-propagation. The best known such medium is light: weights are optically encoded as different frequencies of light, and special crystals apply them to the video input stream. This could eventually lead to the smarter, faster vision systems required for self-driving cars and robots. Another, far more unexpected medium is sound: researchers at Cornell are using vibrating titanium plates which automatically integrate learning examples supplied as sound waves by a process called ‘equilibrium propagation’: the complex vibration modes of the plate effectively compute the required transforms while avoiding the energy wastage of back-propagation. Of course the ultimate weird analog medium has to be spaghetti (which appeals to the Italian cook in me): see https://en.wikipedia.org/wiki/Spaghetti_sort
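For the curious, the physical trick collapses into a few lines of (entirely pointless) Python. This is my own toy sketch, not anything from the Wikipedia article: every value becomes a rod of spaghetti of that length, and lowering a palm onto the bundle picks out the tallest rod each time, which in code is just repeatedly taking the maximum.

```python
# A toy simulation of the spaghetti sort: each value becomes a rod of that
# length; lowering a palm onto the bundle picks out the tallest rod first,
# so the 'parallel' physical step collapses to repeatedly taking the max.
def spaghetti_sort(values):
    rods = list(values)            # cut one rod per value
    out = []
    while rods:
        tallest = max(rods)        # the rod your palm touches first
        rods.remove(tallest)
        out.append(tallest)
    return out[::-1]               # tallest-first picks give descending order

print(spaghetti_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # → [1, 1, 2, 3, 4, 5, 6, 9]
```

The joke, of course, is that the physical version does the ‘comparison’ of all rods at once by parallel analog means, while the simulation pays the full quadratic price.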


[Dick Pountain can confirm that the Spaghetti Sort works well with linguine and bucatini, but not with penne or fusilli] 

ANSI FANCY

Dick Pountain /Idealog 334/ 06 May 2022 10:01


During the darkest days of lockdown I kept myself amused mostly by practising my jazz guitar chords and by writing Python programs. I write a lot of small, off-the-cuff programs, for everything from updating phone codes or solving maths puzzles, to playing with simulations to do with Game Theory. Python is just the latest of a long list of languages I’ve used – Basic, Forth, Pascal, Lisp and Ruby are just a few of them – but the one I remember with most affection is Turbo Pascal. My programs are so small and ad hoc that it’s never worth investing too much time writing graphical user interfaces, so Windows was a nuisance rather than a liberation.

Although I played with Visual Basic and Delphi for several years, eventually both metastasized into such baggy monsters that I dumped them. I wrote myself a little toolkit in Turbo Pascal 5 that created simple windows, menus and pick-box widgets from ASCII characters, quite good enough. I shared my code in Byte, and am proud to say I saw it on screens in more than one science lab where they had similar minimalist requirements. Now I use QPython 3.6 on Android, which has no native ability for colour or cursor control, just scrolling teletype output. There are plenty of add-on graphics libraries like Kivy, all way more than I need, and there’s a simple interface to Android dialogs and media components called SL4A, which I do occasionally use, but it’s still not what I had with Turbo.

The impulse to recreate my Turbo widgets was finally triggered by, of all things, Wordle. I do enjoy playing Josh Wardle’s clever and elegant little puzzle (though I deplore the posting of screenshots on Facebook or bragging about stats…) It’s just difficult enough to maintain an interest, but still simple enough to pose real questions over strategy, which is exactly what I love in a game. I never actually cheat but I do use tools that some purists might consider cheating, namely a mobile version of the Oxford Dictionary with wildcard search, and an Anagram Solver, and once I have three or four letters these will finish the job in a couple of minutes. Getting those three or four letters soon enough is what my Python program does, searching for effective first guesses using the known frequency distribution of letters in English at each position in a five-letter word: it outputs delightful combinations like AUDIT SNORE CLAMP and CAMEO UNITS GRIND WHELK. I discovered that my old buddy David Tebbutt, one of the founding editors of Personal Computer World, is also fond of Wordle and it was he who sent me the letter frequencies list in a fine example of nerd-aid. (He also has access to Wordle’s own internal word list, but in a fit of hubristic rectitude I decided that was a step too far for me; I’ll make do with the Oxford.)
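The scoring idea is simple enough to sketch. This toy version is mine, not the actual program: the ten-word list stands in for the Oxford dictionary, and the positional frequencies are counted from the list itself rather than from David’s real data. It scores each word by how common its letters are at each position, then greedily picks guesses that share no letters.

```python
# A minimal sketch of the first-guess search: score five-letter words by
# positional letter frequency, then greedily pick words with disjoint
# letters. Word list and frequencies are illustrative, not the real data.
from collections import Counter

WORDS = ["audit", "snore", "clamp", "cameo", "units", "grind", "whelk",
         "slate", "crane", "pious"]

# One frequency table per letter position, built from the word list itself
freq = [Counter(w[i] for w in WORDS) for i in range(5)]

def score(word):
    """Sum of positional letter frequencies; repeated letters score zero."""
    if len(set(word)) < 5:
        return 0  # prefer words with five distinct letters
    return sum(freq[i][c] for i, c in enumerate(word))

def first_guesses(n=3):
    """Greedily pick up to n high-scoring words sharing no letters."""
    used, picks = set(), []
    for w in sorted(WORDS, key=score, reverse=True):
        if not (set(w) & used):
            picks.append(w)
            used |= set(w)
        if len(picks) == n:
            break
    return picks  # may return fewer than n if the letters run out

print(first_guesses())
```

With real dictionary data and letter frequencies the same greedy loop spits out combinations of the AUDIT SNORE CLAMP variety.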

My Python Wordle Helper program’s scrolling teletype interface became a real bore, and while looking up how to write a clear-screen command using OS calls I discovered how to issue ANSI codes from Python, and hence how to do a 256-colour character-based terminal better than Turbo’s. QPython can in theory access the Linux curses library, but that refuses to work for me (and for quite a few other folk according to the forums) and in any case curses is pretty horrible. I set to and have now written a widget set that does everything I want, enabling single lines of code to invoke a box, a window, pick list, input box, progress bar, table or barchart. 
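The underlying mechanism is nothing more exotic than escape sequences written to standard output. Here is a sketch of the kind of primitives such a widget set builds on: the escape codes themselves are standard ANSI/VT sequences, but the helper functions (including draw_box) are my illustration, not the actual library.

```python
# ANSI escape-code primitives: clear the screen, position the cursor and
# select 256-colour foreground/background. draw_box shows how a windowing
# widget can be built from them with Unicode line-drawing characters.
ESC = "\x1b["

def clear():        return ESC + "2J" + ESC + "H"   # clear screen, home cursor
def goto(row, col): return ESC + f"{row};{col}H"    # 1-based cursor position
def fg(n):          return ESC + f"38;5;{n}m"       # 256-colour foreground
def bg(n):          return ESC + f"48;5;{n}m"       # 256-colour background
def reset():        return ESC + "0m"               # restore default attributes

def draw_box(row, col, width, height, colour=39):
    """Return a string that draws a box outline at the given position."""
    s = fg(colour)
    s += goto(row, col) + "┌" + "─" * (width - 2) + "┐"
    for r in range(row + 1, row + height - 1):
        s += goto(r, col) + "│" + goto(r, col + width - 1) + "│"
    s += goto(row + height - 1, col) + "└" + "─" * (width - 2) + "┘"
    return s + reset()

print(clear() + draw_box(2, 4, 20, 5, colour=208) + goto(8, 1))
```

Single lines of code like that last one are all it takes to pop a coloured window onto any terminal that speaks ANSI, with no curses in sight.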

I very much doubt that my new widgets will prove as popular with nerds in science labs as those Turbo originals were – those were pre-Macintosh days, when IBM-compatible PCs running DOS were still ubiquitous in technical and process-control contexts. I know that there are still far too many public institutions like hospitals and libraries which retain such dinosaurs, but even they will at least be running Windows XP. Of course anyone under 70 is using a smartphone instead of a computer anyway, and an ANSI terminal on a smartphone screen is about as welcome as a turd on sushi. So I’ll be keeping my little ANSI world to myself, going back and rewiring some of my old programs to use the new interface. My 5-Card Draw and 5-Card Stud poker games actually look rather splendid, and it’s a pity I can’t show you them here. But if for whatever bizarre reason you’re looking for a lightweight interface kit for some primitive device with a 160x36 character display, you know where to look: on my website at http://www.dickpountain.co.uk/home/computing/python-projects/ansi-library

[ Dick Pountain is so thankful that he programs as a hobby, not a job ]


HYBRID HYDROGEN

Dick Pountain /Idealog 333/ 08 Apr 2022 10:08


I write about energy fairly often in this column, mainly in the context of CPU efficiency or the foibles of mobile batteries, but now seems like the time for a wider look. Putin’s invasion of Ukraine has thrown the global fossil-fuel markets into chaos, while the UK government has just revealed its long-term (not entirely feeble) strategy for increasing nuclear and wind power over the next 8 years. So this time I’m talking about the long run.

I’ve been a convinced supporter of both nuclear power – fission and ultimately fusion – and the ‘hydrogen economy’ for around 50 years, from a time when neither was at all fashionable. I won’t rehash all the problems both face, which are well-known and debated, but will instead present two indisputable facts. Firstly, the primary mode of energy delivery in advanced Western economies is turning toward electricity (and demand for it will soon exceed supply); and secondly, hydrogen is an impractical fuel for road, and perhaps for air, transport. It’s terribly inflammable, hard to store and has a knack of diffusing out of containers and pipelines.

You only have to remind yourself of the horror of power-cuts, when your phone and laptop batteries have run out, your router and internet connection are down (and maybe your gas central-heating controller is going nuts). Then think about the fact that electric car sales are approaching take-off, thanks to the soaring price of petrol and diesel and concern over air pollution and climate change. You don’t need a degree in economics to figure out what happens to electricity demand if all car users go that route. One solution would be to distribute the generation of electricity more widely, using small-to-medium nuclear and on-shore wind power stations.

But I suggest that the solution we need to be looking at instead is a hybrid one, in which we keep electricity generation centralised – perhaps coastal – using nuclear, solar and wind power to generate hydrogen by the electrolysis of water. Then, instead of trying to distribute that hydrogen as gas or liquid, both fraught with danger, we use it to make metal hydrides that can be distributed in standardised battery-like modules. The national chain of petrol stations could replace its pumps with hydride-pack exchange and recycling, eventually displacing the too-slowly-growing national grid of EV charging points. Such a distribution network of hydride packs would efficiently store and distribute energy produced by intermittent sources like solar, wind and tide, and replace bottled gas in remote rural areas beyond the reach of town gas. Such a system would require bringing two existing research technologies to commercial fruition – metal hydrides and hydrogen fuel cells – and I think both can be achieved with sufficient investment.

A fuel cell turns a fuel like hydrogen into electricity directly, combining it with oxygen from the air within an electrolyte and electrodes rather than burning it for heat. Invented way back in 1838, the fuel cell has been the subject of extensive research – as so often in history the military have led the way, in search of mobile battlefield power sources, and NASA has used fuel cells in spacecraft since the 1960s.

Metal hydride storage has a shorter history and remains more experimental. Hydrogen gas is combined with a metal to form a liquid or a solid powder that requires neither high pressure nor cryogenic low temperatures to store, and the gas can be recovered by heating. Promising candidate metals include lithium, sodium, magnesium and aluminium, while boron combines with ammonia to form a compound that can be turned back into hydrogen via a catalyst (the French company McPhy Energy is working on the first commercial product, using magnesium). The metal hydride fuel cell combines these twin technologies in the same container, which gets recharged with hydrogen gas and outputs electricity directly: these are more experimental still, and once again the chief developers are currently the military.

It’s not yet clear what the best overall architecture for such a hybrid hydrogen economy would be: a fuel cell built into the electric car with exchangeable hydride packs; an electric car with exchangeable metal-hydride-fuel-cell packs; an existing EV with a fuel cell that recharges its conventional battery from hydride packs; or even cars with hydrogen internal-combustion engines fed by hydride packs. All sorts of questions over energy densities, performance and safety remain to be answered.

But the history of the automobile tells us that eventually one architecture will triumph: witness the world-wide network of petrol stations. It’s also likely that an intermediate network delivering liquid or gaseous hydrogen to regional hydride recycling centres would need to evolve. What’s certain is that governments ought to be taking advice from people fluent in these technologies, then stumping up the research funds that might just save both our current way of life and eventually the planet itself.

[Dick Pountain has a picture of the Hindenburg disaster on his spare bedroom wall] 


PC ARCHAEOLOGY

Dick Pountain /Idealog 332/ 03 Mar 2022 09:56

Regular readers of this column will know that I abandoned Windows for a Chromebook back in 2017, and have been extremely happy with my choice ever since (doctors now believe that every Windows upgrade you skip adds five years to your life). However as a columnist for a magazine that has “PC” in its title I can hardly turn my back on Microsoft completely, so I kept my last Windows machine, a Lenovo Yoga touch-screen laptop running Windows 8.1. This still runs my Canon printer and acts as a backup server via a large external Samsung USB drive, accessible from the Chromebook over wi-fi SMB. I once needed it to run a few programs, like Photoshop Elements, but all of those I’ve steadily displaced with Chromebook equivalents. 

But the Lenovo started to show its age: disk accesses got ever slower in that sinister way that suggests excessive error-correction, plus sporadic tussles with an infuriating Windows 8 rogue task called the Runtime Broker that consumes 100% CPU. Eventually it became too much of a chore to boot it up, but I was extremely disinclined to waste good money on a new machine running Windows 11 for the minimal purposes I required. Then I remembered that I still had my last-but-one laptop, a Sony Vaio, sleeping peacefully in a closet. It’s the computer I used for half of those 13 years I spent in Italy, during which it served without a hitch. It’s not a machine I would ever have bought for myself but was a Christmas present from our late chairman Felix Dennis. An ultra-slim Vaio TZ21 with a carbon-fibre case and a very vivid 11" diagonal, 1,366x768 resolution screen, I shrink from imagining what it must have cost him. It’s far, far too good to throw away. I know, I’ll resurrect it!

After eight dormant years in that cupboard, on plugging in and charging, the Vaio booted up straight away, and then the fun began. For starters it’s 32-, not 64-bit, still runs Windows 7 Professional, and its wi-fi adapter is 14 years behind the times. How to make it rejoin the modern world? I became suddenly aware just how far in The Cloud I now live, because when I went into Windows Network Manager, though it could see my BT hub it wouldn’t connect. In Italy our remote valley lay beyond Telecom Italia’s ADSL network, so I used the Vaio via a mobile phone connection from a newly built mast on the nearby mountain top. And of course their proprietary software wouldn’t work with a UK SIM even had I wanted it to. So how to get online?

A quick check in System Config showed there were indeed two network adapters in the Vaio, an Intel Wireless 3945ABG and a Marvell Yukon 88E8055 PCI-E Gigabit Ethernet, but I no longer have an Ethernet cable and the Intel clearly didn’t like my BT Hub 6. I suppose a grown-up would have started trying to diagnose the wi-fi problem, but I passionately loathe fiddling with networks or comms, and so I took the coward’s way out by fleeing to Amazon on the Chromebook. For an eye-watering £5.19 I ordered a TP-Link 150 Mbps Wireless Nano USB Wi-Fi Dongle, and when the thumbnail-sized widget arrived I popped it in a spare USB port. Like some parasitic insect it injected its drivers, flashed its little green light and connected immediately. Ookla showed a 30Mbps download speed, less than half what my Chromebook gets but twice what that costive Lenovo was getting.

Now for a browser. I’d been using Opera in Italy but even after updating it was deeply unhappy about certificates and stuff. Given that my life belongs to Google I had to get Chrome going. It took forever to download, let alone install, and it too was deeply unhappy, bitching about all kinds of security problems and repeatedly making me identify myself. I got it sort of going but it was tragically slow and comically cranky, refusing to connect to The Guardian as a security risk (perhaps it’s joined the EDL?) So I did what you have to and reinstalled Firefox.

After a couple of updates Firefox runs like the clappers, opens everything without demur and is not too proud to suck in all my Chrome settings and bookmarks: it works flawlessly with Gmail, Google calendar and contacts and so has effectively turned the Vaio into a slower Chromebook utility-wise (though of course it can’t run Android apps). That’s all I need it to be, and for £5.19. And rather to my surprise, I realised that Windows 7 was the last version I actually liked: that Windows Button and left-side menu…

[Dick Pountain is rather enjoying retro computing]

IT NUMEROLOGY

Dick Pountain /Idealog 331/ 06 Feb 2022 03:44


It’s become a centrepiece of IT-biz wisdom that one should always, if possible, wait for version 2.0 of any new technology. Some of us who write about IT of course cannot wait and must suffer at the bleeding-edge, though once you get beyond version 2 all bets are off (as an ex-Windows user I refuse to participate in 10 versus 11 chatter). However in recent weeks two other small integers have been brought to my attention, namely Web3 and 6G communications, and these seem to portend a forking of the IT evolutionary tree in two very different directions.

Web3 came properly to my attention thanks to three articles in the February issue of The Atlantic magazine, about the growing backlash against cryptocurrencies and the financialisation of digital content via NFTs. I’m slightly embarrassed that I haven’t covered crypto in this column before, but it’s no accident: ever since Bitcoin was first launched it has struck me as a dangerous, possibly fraudulent social experiment, based on ultimately unsupportable technology, but I’ve shied away from the torrent of abuse that this opinion is likely to provoke. I’m not an anarchist, but I do basically agree with the late economic anthropologist David Graeber about the origins and function of money. It’s fundamentally a way of fixing and storing the intangible emotion of trust, and the idea that blockchain technology somehow makes money ‘more democratic’ is nonsense. All it does is make it more unstable, the last thing we need in our post-viral anxiety state.

The ‘3’ in the name Web3 – coined by Gavin Wood, co-founder of the Bitcoin competitor Ethereum, in 2014 – implies that Web2, dominated by Google, Facebook and Amazon, will soon be over, replaced by people transacting directly with each other using blockchains. Let’s briefly recap the full genealogy. Web1, the original Berners-Lee/Andreessen invention, was a noncommercial, distributed publishing system that let scientists and nerdy hobbyists communicate with one another for free, and it spawned many services we now take for granted like webmail, blogs, memes and videos. By the mid-1990s businesses discovered its power and moved themselves online, while service providers like Google, Amazon and Facebook realised they could stay free to end-users by selling said users’ data to advertisers. That was Web2, which reaped fortunes the like of which the world had never seen.

Web3 and NFTs represent the next step: total financialisation of digital assets. Any old data, like monkeys-in-daft-hats, become speculative instruments. This potentially affects everything, because nowadays computers are in everything, so almost anything can become a digital asset, even your doorbell videos. Those Atlantic articles describe the furious backlash building in the USA, which calls Web3 and crypto scams, Ponzi schemes and parasitism, and compares them to the 2007 sub-prime mortgage disaster.

It seems unlikely the monster corporations will be replaced soon, especially Google, which funds much deep, world-leading research. Which leads me neatly on to the number ‘6’. The same week I received an email from the University of Oulu in Finland with links to seven videos that demonstrate their cutting-edge research in 6G communications. I clicked, I saw, I boggled (see for yourself at https://www.oulu.fi/6gflagship/6g_demo_series).

These 6G innovations depend on very short wave, terahertz, radio signals, and on ways the Oulu teams have found to focus and steer them. The 6G radio demo shows how such signals can communicate and remote-sense – detecting the shape and surface textures of nearby objects – simultaneously from the same device, such as a smartphone. The 6G optics demo shows communication via THz modulation of the lighting in a room, thwarting any radio-frequency snoopers outside the room. The 6G edge computing demo shows a scanner generating detailed 3D representations of objects, including you, almost instantly, using an array of Raspberry Pis as ‘edge processors’. Most fun of the videos is the 6G vertical demo about lighter-than-air drones that can patrol a building in sci-fi fashion to check for defects or take drinks orders. My conclusion after watching all of them was that Zuckerberg backed the wrong horse with Meta: virtual reality is actually rather naff Web1.

The world’s politicians are as always several decades behind the implications of all these innovations, but very soon some crunch-time decisions will need to be made about their deployment and regulation. Most state banks are looking at introducing digital currencies (not blockchained crypto) which could offer huge simplifications of their tax, spend and welfare systems. But they could also be abused, as they already are being in India and China, to penalise and control political dissent. We badly need to start serious public debates about where all this stuff is taking us, and whether the 6G/Web3 future will be a eu- or a dys-topia.

GRANDPA’S AXE

Dick Pountain /Idealog 330/ 07 Jan 2022 02:10

Grandpa’s Axe is a figment of American folk mythology: “had it nigh on 60 years, never give me no trouble exceptin’ fer two new heads and three new handles”. Well my hi-fi system is very much like that. I’m very far from being a hi-fi buff but I do listen to a lot of music and like reasonably good sound. I bought an ex-review Sansui system back in the days when Dennis published Hi-Fi Choice, but the only component remaining from that is a pair of wood-cased Castle speakers that I love. Nowadays all my audio feeds – Smart TV, Spotify and YouTube on Chromebook, Sony CD player and Dunlop vinyl turntable – play through them. However just before Christmas my latest amplifier, a Denon which also acted as the hub, expired with a horrid death rattle. Not being up to diagnosing and repairing it, I went online to look for a replacement.

It appears ‘separates’ hi-fi amps are a dying breed and accordingly are subject to ‘Veblen Pricing’ as rich folk’s toys, but I became intrigued by the new breed of tiny power amps meant for bookshelf systems, and found myself buying a Fosi bt20a. When I first unpacked it I thought it was a joke, not much bigger than a packet of cigarettes, but then I plugged it in and reeled in amazement as it drove my massive speakers just as loud, and with superior, more open sound quality than the Denon…

Now since this isn’t an audiophile magazine (we don’t even have audio devices in the A-List) how can I steer this miracle around toward my digital brief? Well, I assumed the Fosi must be digital inside, but I was wrong – it’s even more interesting than that. It turns out to be a ‘Class-D’ device, a technology which exists in a grey area between analog and digital which I’ve covered here before, for example in the silicon retina chips designed by Carver Mead or AI chips that use analog adders to perform convolutions. Hybrid analog-digital circuits can often be faster and less power-hungry as they don’t need to analog-to-digital convert their inputs and then digital-to-analog convert the outputs. The price is loss of the precision that digital brings, but for some non-numerical applications that may be worth paying.

Class-D amplifiers are a nice case in point. They work by chopping an analog input signal into a very high-frequency stream of square-wave pulses whose spacing represents the analog values at each tiny interval. This stream switches twin output transistors at a similarly high frequency, and filtering out frequencies beyond the audible range then directly produces an amplified analog output signal. The process can be 70-90% efficient: because the two output transistors are never both on at the same time, little current flows and little power is lost as heat. The Fosi uses a phone-style 24-volt power supply and barely gets warm even when playing loud. It’s still a wholly analog technology as no bits are involved, but like a digital technology it works on a stream of discrete pulses, and can therefore be supported by cheap, miniature digital chips.

The efficiency and power saving such hybrid processing confers is becoming ever more desirable to designers of the neural networks on which Deep Learning AI depends. Implementing a simulated neural network in digital technology is enormously power hungry. Training a network involves storing billions or trillions of weights in memory cells, and then performing multiply-accumulate operations on them and new inputs when the network is eventually deployed to analyse new data. Back in issue 301 I described how the US firm Mythic’s IPU chip contains an analog computing array of memory cells that are actually tunable resistors: computation happens in-place as each input voltage gets turned into an output current according to Ohm’s Law, with resistance representing the stored weight value. With data and computation stored in the same place less energy is wasted and less silicon real-estate is needed for A-to-D conversion. 
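The principle reduces almost to a one-liner. In this toy model (my illustration, not Mythic’s actual design) each weight is stored as a conductance, each input arrives as a voltage, Ohm’s law turns each pair into a current, and Kirchhoff’s current law sums the currents on a shared output wire: a multiply-accumulate with no digital arithmetic.

```python
# A toy model of in-memory analog compute: each weight is a conductance
# G = 1/R, an input is a voltage V, Ohm's law gives a per-cell current
# I = V * G, and summing currents on a shared wire yields the dot product.
def analog_mac(voltages, conductances):
    """Current summed on one output line: the dot product of inputs and weights."""
    return sum(v * g for v, g in zip(voltages, conductances))

weights = [0.5, 1.5, 2.0]           # stored as tunable-resistor conductances (siemens)
inputs = [1.0, 2.0, 0.5]            # applied input voltages (volts)
print(analog_mac(inputs, weights))  # → 4.5 (amps of summed current)
```

In the physical chip that sum happens instantaneously and in place; the Python loop is just a way of seeing which quantity plays which role.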

Carver Mead’s silicon retina chip similarly employed an array of photosensitive cells connected by a network of resistors, with Kirchhoff’s Laws achieving the necessary ‘computation’ as currents flow between them, producing behaviour similar to that of the human eye.

Fitted with its second new head and third new handle, my audio environment is now immensely satisfying, even if it is so hybrid that it would cause a real audiophile to grind their teeth. The little black box omnivorously devours analog feeds from TV, CD player and turntable (through an equally tiny analog mixer) just as effectively as Bluetooth from my Chromebook. And it passes my own very analog test regime: Wayne Shorter’s ‘Footprints Live’ album makes my remaining hair stand on end via either route.

[Dick Pountain thought DAC was a brand of trousers until he discovered Smirnoff]

SOCIAL UNEASE

Dick Pountain /Idealog 350/ 07 Sep 2023 10:58 Ten years ago this column might have listed a handful of online apps that assist my everyday...