Saturday, 16 January 2016

GET OVER IT

Dick Pountain/Idealog 254/11 September 2015 11:39

If you reach my advanced age you'll discover that there are some irritants it's best to learn to live with because they're too much trouble to fix. For me two such irritants are Facebook and Microsoft Windows. What high hopes we had for Facebook when it first launched in the UK: we hoped it would replace the increasingly cranky Cix as the place where we Real Worlders could meet and exchange copy, but it hasn't worked out that way. (To be sure we do maintain a group on FB, but it's mostly confined to simple announcements and no copy gets posted there).

Facebook turned out to be less like a senior common room and more like a bustling, screeching market-square that drowns out all serious intent. It has the almost magical property of instantly turning everyone who enters into a moraliser or preener rather than an information provider: "look how well I'm doing", "I defy you not to weep over this baby dolphin/kitten/meerkat", "how dare you <blah> this <blah>", "how many <blahs> have *you* <blahed>?". It's a conduit for outrage and opinion rather than fact, as you can see for yourself by contrasting the tone of FB comments with those on any proper tech forum: the Greek philosophers would have said it's all about doxa (belief) rather than episteme (knowledge).

Many's the time over the years that my finger has hovered over the "Delete account" button, but that impulse passed once I discovered how to switch people's feeds off without offending them by defriending (despite FB constantly changing the way you do it, as a deterrent). I now have friends running into three figures but see only two figures' worth of posts. And recently I realised that FB makes a great "doxometer": post some nascent column idea and see how much flak it attracts (the more the better). When I recently mentioned that my Windows 8.1 indexing service had run wild and filled up my entire 500GB hard disk, I received mostly Harry Enfield-style "that's not how you do it" point-scoring (having already fixed the problem using real advice gleaned from tech forums). Ditto when I posted, ironically, that what I'm hearing about the Windows 10 upgrade process is turning me into an IT "anti-vaxer". And so on to the second irritant I've learned to live with: Windows 8.1.

To look at my desktop now you'd never even guess I'm running it. The tiles are gone, along with all those hooky apps. My desktop is plastered with (highly-deprecated) icons, some pointing to folders full of vital utilities, while the tools I use most are all on the taskbar, Mac style. Neither you nor I would ever know this isn't Windows 7, and it works well enough to forget about (until a minor hiccup like that full disk). Automatic updates are turned off and I pick which ones to install manually from time to time, so I haven't yet had 10 stuffed onto me. Will I eventually upgrade to 10? I haven't decided. Anti-vaxer jokes aside, I worry that my Lenovo is old enough (2013) to be in the danger-zone for driver SNAFUs, and a recent article on The Register (http://www.theregister.co.uk/2015/07/31/rising_and_ongoing_cost_of_windows/) makes me wonder whether Windows 10 is intended to tie us into an Adobe-style monthly-subscription, software-as-a-service model whereby I lose control over future upgrades.

If that does prove to be the case I'll definitely defect, not to a Mac as so often recommended by kind friends on Facebook, but to some variety of Linux. You see, I've also come to understand that I actually *enjoy* wrestling with operating systems: it's a far more fun way to keep my mental muscles exercised than solving word puzzles on a Nintendo Game Boy, in a Pringle cardigan, on the sofa. I don't object to paying for software per se - I paid for Windows 8.1 in the original cost of my Lenovo - but what I do oppose is the ongoing campaign by big software vendors to extend their monopoly status by extracting a rental, rather than a sale, price from their customers. This tendency toward rent-seeking runs counter to an opposite tendency of networked digital technologies to make software ever cheaper, even free, and thereby reduce profits (which are needed to pay for R&D, not only to distribute to shareholders). We're getting into quite profound questions here, recently the subject of Paul Mason's intriguing book "Postcapitalism", which I'm currently reading. Mason believes, as do I, that the fact that digital products can be copied effectively for free tends to undermine the ability to set rational prices that lies at the heart of current market economics. But that, illuminated by the madness that is MadBid, is a subject for next month's column...


TEN COMMANDMENTS OF SAFETY

Dick Pountain/Idealog 253/06 August 2015 14:58

Perhaps you were as disturbed as I was by that report, back in May, that a US hacker travelling on a Boeing airliner claimed to have penetrated its flight control system via the entertainment system's Wi-Fi, and made the plane climb and turn from his laptop. Aviation experts have since rubbished his claim (but then Mandy Rice-Davies would definitely apply). It did however concentrate my mind, in a most unwelcome fashion, on the fact that all the planes we fly in nowadays employ fly-by-wire under software control, and that my confidence in software engineers falls some way short of my confidence in mechanical engineers.

This nagging anxiety was rubbed in further by the fatal crash of a military Airbus A400M in May, after its engine control software shut down three of its four engines just after take-off. It appears that accidental erasure of some config files during installation had deprived the software of certain torque calibration parameters that it needed to monitor engine power. These (literally) vital numbers were being loaded from an external file; in other words the safety of this aircraft was being governed by a programming practice on a par with installing Windows updates. Nice.

To me safety-critical software is about more than just fear for my own ass: it's been a concern of mine for many years. I started out as a Forth programmer, at a time when that language was widely used in embedded control systems, and attended conferences on the subject of safety in both software and hardware architecture. Then I graduated, via Pascal and Modula-2, to becoming an admirer of Niklaus Wirth's ideas on good programming practice, and finally on to object-oriented programming as the way to wrest control over the structure of really large programs. Object-orientation is now of course the rule, supported by every language from JavaScript to Scratch, but I sometimes wonder whether it still means anything very much, or whether it has become a mere style to which lip-service is paid. Loading critical data from unreliable external files violates the principles of encapsulation in more ways than I can count.

I did a bit of Googling and found lots of papers about safety-critical architectures and redundant hardware systems. Redundancy is a key safety concept: you build three separate computers, with CPUs from different manufacturers running different software written by different teams - which have been demonstrated to produce the same outputs from the same inputs - then go with the majority verdict, the idea being that the *same* software or hardware bug is very, very unlikely to arise in all three. Interestingly enough, the latest of these papers seemed to be dated around 2008.
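
In code terms the voting stage of such a scheme is simple enough. Here's a minimal sketch in C - my own invention for illustration, not lifted from any real avionics system:

    #include <stdint.h>

    /* 2-out-of-3 majority vote: ra, rb and rc are outputs computed by three
       independently developed channels from the same sensor inputs. Returns
       the majority value; *agreed is cleared if all three disagree. */
    int32_t vote(int32_t ra, int32_t rb, int32_t rc, int *agreed)
    {
        *agreed = 1;
        if (ra == rb || ra == rc) return ra;  /* at least two channels agree */
        if (rb == rc) return rb;
        *agreed = 0;   /* total disagreement: caller must fail safe */
        return ra;     /* arbitrary fallback, not to be trusted */
    }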

Surely it can't be that, as in so many other spheres (like, say, banking), the optimists have taken over the farm and started trusting too much? Then I stumbled across NASA's 10 rules for developing safety-critical code. Now NASA tends to work with computer hardware that's several decades behind state-of-the-art but - give or take a Hubble or two - it's had fairly few disasters that were down to software. Here are its rules, severely abbreviated by me (a small fragment illustrating a few of them follows the list):

1: All code to have simple control flow constructs: no GOTO, direct or indirect recursion.
2: All loops to have a fixed upper bound, which can be statically proved never to be exceeded.
3: No dynamic memory allocation after initialization.
4: No function longer than 60 lines of code.
5: The assertion density of the code to average a minimum of two assertions per function.
6: Data objects must be declared at the smallest possible level of scope.
7: Calling functions must check non-void function return values and the validity of all parameters.
8: Preprocessor to be restricted to headers and simple macros. Token pasting, variable argument lists (ellipses) and recursive macro calls all forbidden.
9: Pointers to be restricted to one level of dereferencing, which mustn't be hidden inside macros or typedefs. Function pointers forbidden.
10: All code must be compiled with all warnings enabled at their most pedantic setting. All code must compile at these settings without any warnings. All code to be checked daily with at least one — preferably several — state-of-the-art static source code analyzer, and pass with zero warnings.
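
Here is the promised fragment, of my own devising rather than NASA's, which tries to honour rules 2, 6 and 7: a loop bound fixed at compile time, data declared at the tightest scope, and parameters validated with a return value the caller is obliged to check:

    #include <stddef.h>

    #define MAX_SAMPLES 64   /* rule 2: loop bound fixed at compile time */

    /* Average n samples into *result; returns 0 on success, -1 on bad
       arguments (rule 7: callers must check this return value). */
    int average_samples(const int samples[], size_t n, int *result)
    {
        if (samples == NULL || result == NULL || n == 0 || n > MAX_SAMPLES)
            return -1;                     /* rule 7: validate all parameters */

        long sum = 0;                      /* rule 6: smallest possible scope */
        for (size_t i = 0; i < n; i++)     /* provably bounded by MAX_SAMPLES */
            sum += samples[i];

        *result = (int)(sum / (long)n);
        return 0;
    }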

These rules are, realistically enough, aimed at plain old C programmers rather than at trendy new languages, but they impose a degree of rigour comparable to most object-oriented languages. Their recommended heavy use of assertions is interesting. Assertions are supported directly in Eiffel, Ada and some other languages, and can be added to C via the header "assert.h". An assertion specifies the required value range of some variable at a given point in program execution and raises a runtime error when the condition fails: an example might be "assert(TorqueCalibrationParameter > 0)".
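
Here's how that check might look in context, as a sketch only - the names are my own inventions, certainly not Airbus's:

    #include <assert.h>

    #define NUM_ENGINES 4

    /* Illustrative: one torque calibration parameter per engine, which some
       earlier initialization step is supposed to have loaded. */
    static double torque_calibration[NUM_ENGINES];

    /* Refuse to run with a missing or nonsensical calibration parameter -
       the very failure mode that brought down the A400M. */
    void check_calibration(void)
    {
        for (int i = 0; i < NUM_ENGINES; i++)
            assert(torque_calibration[i] > 0.0);
    }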

Sunday, 13 December 2015

LOSING THE PLOT?

Dick Pountain/Idealog 252/27 June 2015 16:27

Eagle-eyed readers may have noticed that I haven't mentioned my Nexus 7 tablet in recent months. That's because, until a couple of days ago, it languished in a drawer, fatally wounded by the Android 5.0 Lollipop update. (Google should have broken with its confectionery-oriented naming convention and named it "GutShot".) Lollipop rendered it so slow as to be unusable - five minutes or more to display the home screen - and even after scores of reboots, cache clearances and a factory reset it remained not just slow: its battery appeared to have died too, taking a day to recharge and barely an hour to discharge again.

I did investigate downgrading back to 4.4 KitKat, but the procedures involved are absolutely grisly, requiring not just rooting, but downloading huge system image files via a PC with the ever-present chance of a failure that bricks the tablet completely: all totally unacceptable for a consumer-oriented device. (It did set me wondering how the Sale of Goods Act might apply to destructive OTA upgrades that aren't reversible by normal human beings...) Instead I went to my local PC World and picked up an Asus Memo Pad 7 for £99, which I repopulated with all my apps and data within a morning thanks to the brighter side of Google's cloud; it has worked a treat ever since, and has a front camera and card slot too. Then last week I discovered that Android 5.1.1 was now available for the Nexus and, with nothing to lose, installed it. A mere six months after its assassination my Nexus came back to life again, faster and slicker than the Asus, with its battery miraculously resurrected and lasting longer than it did originally.

There has to be a moral to this tale somewhere, but I'll be damned if I can identify it. Google's testing of 5.0 was clearly inadequate, and its lethargy in keeping us victims informed and releasing a fix was not far short of criminal. But stuff like this happens on the IT battlefield all the time. A bigger issue is that it destroys confidence in the whole over-the-air update model, which I'd come to see as the way forward. If Google (or Apple, or Microsoft) wishes to mess directly with my machine, then at the very least it needs to provide a simple, fool-proof mechanism to unwind any damage done. But that leads on to another, deeper issue: it feels to me as though all these new-generation, cloud-oriented firms are approaching some sort of crisis of manageability. The latest phones and tablets are marvels of hardware engineering, with their cameras and motion sensors and GPS and NFC and the rest, but all these services have to be driven from, and integrated into, operating system kernels whose designs date back decades, using programming languages that fall some way short of state-of-the-art. The result is a spectacular cock-up like Lollipop, or those minor memory leaks that make your iPad gradually clag up until you have to reboot it.

It is of course inconceivable to start from scratch at this point in history, but I was reminded last week of what might have been when I exchanged emails, after twenty years, with Cuno Pfister, a Swiss software engineer I knew back in Byte days who used to work on Oberon/F with Niklaus Wirth in Zurich. Oberon was Wirth's successor to Modula-2, the culmination of his programming vision, and Oberon/F was a cross-platform, object-oriented framework with the language compiler at its heart, complete with garbage collection to combat memory leakage, preconditions to assist debugging, and support for a Model-View-Controller architecture. Its basic philosophy was that an operating system should become a software layer that "imports hardware and exports applications". New hardware drivers and applications were written as insulated modules, usually by extending some existing module, and they clicked into place like Lego bricks. Strong modularity and strong typing enabled 90% of errors to be caught at compile time, while garbage collection and preconditions simplified debugging the rest. It was precisely the sort of system we need to program today's tablets, but of course it could make no headway at all against the sheer inertia of Unix and C++.

What I miss most about that concept is having the programming language compiler built right into the OS. I still occasionally get the urge to write little programs, but all the tools I have are either massive overkill like Visual Studio, or command-line austerity like Ruby, and the APIs you have to learn are hideous too. I did recently discover a quite usable Android JavaScript tool called DroidScript, and the first thing I wrote in it, as is my historical habit, was a button that when pressed says "bollox"...  

Monday, 16 November 2015

BITS STILL AIN'T ATOMS

Dick Pountain/Idealog 251/07 June 2015 13:48

I'd started to write that I'm as fond of gadgets as the next man, but in truth I'm only as fond as the one after the one after him (which is still fairly fond). For example I get enormous pleasure from my recently-acquired Zoom G1on guitar effects pedal, frightening the neighbours with my PhaseFunk twangs. However I've resisted the hottest of today's gadgets, the 3D printer, with relative ease. Partly it's because I have no pressing need for one: being neither a vendor of cornflakes nor a devotee of fantasy games or toy soldiers I just don't need that many small plastic objects. I can see their utility for making spare parts for veteran mechanical devices, but I don't do that either. What deters me more though is the quasi-religious atmosphere that has enveloped 3D printing, as typified by those reverential terms "making" and "maker". People desperately want to bridge the gap between digital representation and real world, between CGI fantasy and life, and they've decided 3D printing is a step on the way, but if so it's a tiny step toward a very short bridge that ends in mid-air.

One problem is precisely that 3D printing tries to turn bits into atoms, but pictures don't contain the internal complexity of reality. There are serious applications of 3D printing, for example in the aerospace industry, where components can be printed in sintered metal more quickly, more cheaply and with greater geometric complexity than by traditional forging or casting techniques. Even so, two things remain true: such parts are typically homogeneous (all the same metal) and made in relatively small quantities, since 3D printing is slow - if you need 100,000 of something then 3D print one and make a mold from it for conventional casting. Printing things with internal structure of different materials is becoming possible, but remains topologically constrained to monolithic structures.

That's the second problem, that 3D printing encourages thinking about objects as monolithic rather than modular. Modularity is a profound property of the world, in which almost every real object is composed from smaller independent units. In my Penguin Dictionary of Computing I said: "modules must be independent so that they can be constructed separately, and more simply than the whole. For instance it is much easier to make a brick than a house, and many different kinds of house can be made from standard bricks, but this would cease to be true if the bricks depended upon one another like the pieces of a jigsaw puzzle." The basic module in 3D printing is a one-bit blob firmly attached to the growing object.

I recently watched a YouTube video about a project to 3D print mud houses for developing countries, and it was undeniably fascinating to watch the print head deposit mud (slowly) in complex curves like a wasp building its nest. But it struck me that, given the computing power attached to that printer, it would be faster to design a complex-curved brick mold, print some and then fill them with mud and assemble the houses manually.

The ultimate example of modularity, as I never tire of saying, is the living cell, which has a property that's completely missing from all man-made systems: every single cell contains not only blueprints and stored procedures for building the whole organism, but also the complete mechanism for reproducing itself. This mind-boggling degree of modularity is what permitted evolution to operate, by accidentally modifying the blueprints, and which has led to the enormous diversity of living beings. No artificial "maker" system can possibly approach this status so long as fabrication remains homogeneous and monolithic, and once you do introduce heterogeneous materials and internal structure you'll start to confront insuperable bandwidth barriers as an exponentially-exploding amount of information must be introduced from outside the system rather than being stored locally. A machine that can make a copy of itself seems to really impress the maker community, but you just end up with a copy of that machine. A machine that copies itself, then makes either an aeroplane, or a bulldozer, or a coffee machine out of those copies is some way further down the road.

I was led to these thoughts recently while watching Alex Garland's excellent movie Ex Machina. In its marvellous denouement the beautiful robot girl Ava kills her deeply unpleasant maker and escapes into the outside world to start a new, independent life, but first she has to replace her arm, damaged in the final struggle, with a spare one. Being self-repairing at that level of granularity is feeble by biological standards, and as she stood beaming at a busy city intersection it struck me that such spare parts would be in short supply at the local hospital...

STRICT DISCIPLINARIAN

Dick Pountain/Idealog 250/05 May 2015 11:23

After photography my main antidote to computer-trauma is playing the guitar. Recently I saw Stefan Grossman play live for the first time at London's King's Place, though I've been learning ragtime picking from his books for the last 30 years. He played his acoustic Martin HJ-38 through a simple PA mike, and played it beautifully. Another idol of mine is Bill Frisell, who could hardly be more different in that he employs the whole gamut of electronic effects, on material from free jazz, through bluegrass to surf-rock. Dazzled by his sound I just purchased a Zoom G1on effects pedal from Amazon, and am currently immersed in learning how to deploy its 100 amazing effects.

The theme I'm driving at here is the relationship between skill, discipline and computer-assistance. There will always of course be neo-Luddites who see the computer as the devil's work that destroys all skills, up against pseudo-modernists who believe that applying a computer to any banal material will make it into art. Computers are labour-savers: they can be programmed to relieve humans of certain repetitive tasks and thereby reduce their workload. But what happens when that repetitive task is practising to acquire a skill like painting or playing a musical instrument?

The synth is a good example. When I was a kid, learning to play the piano took years, via a sequence of staged certificates, but now you can buy a keyboard that lets you play complex chords and sequences after merely perusing the manual. Similarly, if you can't sing in tune a not-that-inexpensive Auto-Tune box will fudge that for you. Such innovations have transformed popular music, and broadened access to performing it, over recent decades. Does that make it all rubbish? Not really, it's only around 80% rubbish, like every other artform. The 20% that isn't rubbish is made by people who still insist on discovering all the possibilities and extending their depth, whether that's in jazz, hiphop, r&b, dance or whatever.

Similar conflicts are visible with regard to computer programming itself. I've always maintained that truly *great* programming is an art, structurally not that unlike musical composition, but the vast majority of the world's software can't be produced by great programmers. One of my programming heroes, Prof Tony Hoare, has spent much of his career advocating that programming should become a chartered profession, like accountancy, in the interests of public safety since so much software is now mission-critical. What we got instead is the "coding" movement which encourages absolutely everybody to start writing apps using web-based frameworks: my favourite Guardian headline last month was "Supermodels join drive for women to embrace coding". Of course it's a fine idea to improve everyone's understanding of computers and help them make their own software, but such a populist approach doesn't teach the really difficult disciplines involved in creating safe software: it's more like assembling Ikea furniture, and if that table-leg has an internal flaw your table's going to fall over.

Most important of all, though, there's a political-economic aspect to all this. Throughout most of history, up until the last century, spending years acquiring a skill like blacksmithing, barbering, medicine, singing or portrait painting might lead to some sort of living income, since people without that skill would pay you to perform it for them. Computerised deskilling now threatens that income stream in many different fields. Just to judge from my own friends, the remuneration of graphic designers, illustrators, photographers and animators has taken a terrible battering in recent years, due to digital devices that opened up their fields and flooded them with mostly mediocre free content. The argument between some musicians and Spotify revolves around a related issue: not free content, but the way massively simplified distribution reduces the rates paid.

We end up crashing into a profound contradiction in the utilitarian philosophy that underlies all our rich Western consumer societies, which profess to seek the greatest good for the greatest number: does giving more and more people ever cheaper, even free, artefacts trump the requirement to pay those who produce such artefacts a decent living? I think any sensible solution probably revolves around that word "decent": what exactly constitutes a decent living, and who or what decides it? Those rock stars who rail against Spotify aren't sore because their children are starving, but because of some diminution in what most would regard as plutocratic mega-incomes. Some people will suggest that it's market forces that sort out such problems (and of course that's exactly what Spotify is doing). I've no idea what Stefan Grossman or Bill Frisell earn per annum, but I don't begrudge them a single dollar of it and I doubt that I'm posing much of a threat to either of them (yet).

Wednesday, 16 September 2015

MENTAL GYMNASTICS

Dick Pountain/Idealog 249/09 April 2015 21:04

Recently the medical profession has discovered that stimulating our brains with difficult puzzles and problems - mental exercise in other words - has a beneficial effect on health. Such exercises can't, as some have suggested, actually cure dementia, but there's evidence they may delay its onset. Just a few years ago one of the big electronics vendors advertised its hand-held games console by showing ecstatic grey-beards using it to play Connect Four on the sofa with their grandsprogs. As a senior citizen myself I must feign interest in such matters, but in truth I've never really worried that I'm not getting enough mental exercise, because the sheer cack-handedness that prevails in the IT business supplies all the exercise I can use, for free, every single month.

Take for example the impact of new security measures on the attempt to keep a working website. Plagued by hacks, leaks, LulzSec, Heartbleed, NSA surveillance, every online vendor is tightening security, but they're not all that good at notifying you how. I've had a personal website since 1998, and more recently I've also been running three blogs while maintaining an archive of the works of a dear friend who died a couple of years ago. I've never believed in spending too much money on these ventures, so I hosted my very first attempt at a website on Geocities, and built it using a free copy of NetObjects Fusion from a PC Pro cover disk. Mine was thus one of the 38,000,000 sites orphaned when Yahoo closed Geocities in 2009, so I moved to Google Sites and built a new one from scratch using Google's online tools. Around this time I also shelled out money to buy my own domain name dickpountain.co.uk from the US-based registrar Freeparking.

My low-budget set-up worked perfectly satisfactorily, without hiccup, for the last six years, that is until January of this year when I suddenly found that www.dickpountain.co.uk no longer accessed my site. To be more exact, it said it had accessed it but no pages actually appeared. I checked that the site was working via its direct Google Sites address of https://sites.google.com/site/dickpountainspages/ and it was, so perhaps redirection from my domain had stopped working properly? To check that, I went to log into my account at Freeparking's UK site, only to find that entering my credentials merely evoked the message "A secure connection cannot be established because this site uses an unsupported protocol. Error code: ERR_SSL_VERSION_OR_CIPHER_MISMATCH".

Locked out of my account, I couldn't reach Freeparking support, so I mailed RWC columnist Paul Ockenden, who immediately asked whether I was using Chrome. Yes I was. Did I know that nowadays it disables SSL3 by default? No I didn't, thank you Paul. With SSL3 enabled I managed to get into my account, only to find nothing had changed: it was still set to redirect dickpountain.co.uk to that Google address. About then I received another email from Paul: a source view on http://www.dickpountain.co.uk/ showed the frameset was still there but not being displayed, so Google was suddenly refusing to display pages inside an external frameset and the problem lay not with Freeparking.

An evening plodding through the forums revealed that Google too has upped security, and now you have to verify ownership of your site (even one that's been running for six years already). Their help page explaining verification offers four different methods: add a meta tag to your home page to prove you have access to the source files; add the Google Analytics code you use to track your site; upload an HTML file to your server; or verify via your domain name provider by adding a new DNS record. None of the first three worked because my site was built in Google's own tools, which strip out any added meta tags and won't allow uploading raw HTML. I needed to make a trek into the belly of the beast, into Mordor, into... DNS. Now DNS scares me the way the phone company scared Lenny Bruce ("mess with it and you'll wind up using two dixie cups and a string"). Log into the Freeparking site, go to the ominously-named "Original DNS Manager Interface (advanced users)" and edit a CNAME record to point, as instructed by Google Help, to ghs.google.com. Nothing happens. Try again, five times, before it finally sticks. Half an hour later www.dickpountain.co.uk works again!
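
For the record, the resulting entry amounts to something like this in standard zone-file notation (Freeparking's "advanced" interface presents the same fields through a web form):

    www.dickpountain.co.uk.    3600    IN    CNAME    ghs.google.com.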

You might expect me to be annoyed by such a nerve-wracking and unnecessary experience, but you must be kidding: I was jubilant. I still have it! The buggers didn't grind me down! It's like climbing Everest while checkmating Kasparov! Macho computing, bring it on, mental whoops and high-fives. It did wear off after a couple of days, but I still smirk a little every time I log onto my site now...

Wednesday, 12 August 2015

SYNCING FEELING

Dick Pountain/Idealog 248 /05 March 2015 15:29

Astute readers may have noticed that I'm deeply interested in (a nicer way of saying obsessed by) note-taking. This is no coincidence, because all my main occupations - editing the Real World section, writing this column, writing and reviewing books - involve reading the work of others and gathering together important points. Anything that reduces the amount of re-typing required makes the difference between weeding a field of beans with a tea-spoon and doing it with a tractor. Just a few years ago my desk groaned under thick hard-back books that bristled like porcupines with small yellow Post-It notes marking the pages I needed quotes from, or had pencilled margin notes on.

Making notes on a tablet that could sync with my laptop removed the need for those yellow flags, but still left me the job of re-typing the quotes into my own text. (Over the years I'd tried several of those pen-like or roller-like handheld scanners, but none was effective enough to be worth the hassle.) No, the logical final step is for the source material I'm reading to be online too, but it's taken until now to arrive there. For the very first time I'm reviewing a book in its Kindle rather than paper edition, which means I can search its full text for relevant phrases and cut-and-paste all the resulting notes and quotes. In theory, that is, because it turns out not to be quite so simple.

Amazon's Kindle reader software certainly enables you to place bookmarks, highlight passages and make notes, but none of these functions is without its quirks, and the way they work varies between versions. I like to use an actual hardware Kindle when outdoors because it's light, readable in sunlight and has great battery life. Indoors I prefer to read and note-take on my Android tablet, but I write the actual review on my Windows laptop, and all these platforms run different versions of the reader.

First quirk is that the granularity of Kindle bookmarks is too broad, working only to whole page boundaries. When I view Notes & Marks the short extract presented is only the top few lines of that *page*, though my interest might lie further down. Highlights are more useful because then the extract is from the start of the highlighted area, not the whole page. I can attach a note to any single word on a page, but in Notes & Marks only its text appears, so I end up with a cryptic list like "yes", "no", "good", "really?" with no idea what each refers to until I click it and go to that page, which becomes dizzying after a while. The best compromise is to highlight a sentence or paragraph and then attach a note to its first word.

Next quirk: notes, highlights and bookmarks should sync automatically between Kindle, tablet and desktop readers, but notes made on my tablet weren't showing up on the laptop. This matters because I can only cut and paste highlighted quotes from the laptop version, as the Kindle and tablet versions have no copy function. Solving this required a stiff yomp through the forums, where sure enough I found an answer: you have to sync manually by hitting that little curly-arrows icon. Still didn't work. More forums, and the real answer: you have to hit *not* the sync icon inside the book in question, but the one on the home screen with all books closed. Doh! But it does work.

The last quirk is that you can't run multiple instances of the Kindle reader on the same device. It so happens I have another book on my Kindle that's relevant to this review and I'd like to quote from it: I have to go out into the library, open the other book, find the quote, and cut-and-paste (on the laptop version only). It would be nice to keep two books open in two instances of the Kindle reader on the same machine. I really shouldn't grouse too much though, because merely being able to search, make notes and cut-and-paste them has hugely reduced the amount of tedious re-typing involved in the reviewing process, and I also need to remember that Amazon is obliged by copyright and fair usage to restrict some of these functions (a copyright notice gets placed on every quote I paste, which I delete).

Nevertheless I do believe that Amazon is missing a trick here, and that just making a few fairly minor tweaks would establish a really effective collaborative network for students and researchers to share notes and quotes, which wouldn't need to carry advertising since the product has already been paid for. That would of course grant Amazon the sort of dominance that the US courts have already refused to Google, but let's not go there...
 
