Tuesday, 7 August 2012

THE SLOPES OF MOUNT NIKON

Dick Pountain/PC Pro/Idealog 209 - 12/12/2011

It shouldn't come as any huge surprise to regular readers if I admit to being agnostic with respect to the newly-founded cult of Jobsianism. Agnostic rather than atheist, because I did greatly admire some of Steve Jobs's qualities, most particularly his taste for great design and his uncanny skill at divining what people would want if they saw it, rather than what they want right now. On the other hand I found Jobs's penchant for proprietary standards, monopoly pricing, patent trolling and "walled-garden" paternalism deeply repugnant - and to judge from Walter Isaacson's new authorised biography I wouldn't have much cared to drink beer or drop acid with him. In my own secular theology Jobs will now be occupying a plateau somewhere on the lower slopes of Mount Nikon (which used to be called Mount Olympus before the Gods dumped the franchise in protest at that accounting scandal) alongside other purveyors of beautiful implements like Leo Fender and Enzo Ferrari.

So which figures from the history of computing would I place higher up the slopes of Mt Nikon? Certainly Dennis Ritchie (father of the C language) and John McCarthy (father of Lisp), both of whom died within a week or so of Jobs and whose work helped lead programming out of its early primitivism - there they could resume old arguments with John Backus (father of Fortran). But on a far higher ledge, pretty close to the summit, would be the extraordinarily talented physicist Richard Feynman, whom not very many people would associate with computing at all. I've just finished reading his 1985 book, which I had somehow overlooked, called "QED: The Strange Theory of Light and Matter", a collection of four public lectures he gave in New Zealand and California explaining quantum electrodynamics for a popular audience. The book amply demonstrates Feynman's brilliance as a teacher who applied sly humour and inspired metaphors to explain the most difficult of subject, er, matter. He cleverly imagines a tiny stopwatch whose second hand represents the phase of a travelling photon, and through this simple device explains all the phenomena of optics, from reflection and refraction to the two-slit quantum experiment, more clearly than I've ever seen before. But what does this have to do with computers?

Well, having finished his book in a warm glow of new-found understanding, I was prompted to take down a book I've mentioned here before, by Carver Mead the father of VLSI (very large scale integrated circuit) technology and an ex-student of Feynman's at Caltech. Mead's "Collective Electrodynamics" defends the wave view of sub-atomic physics preferred by Einstein but rejected by Niels Bohr (and the majority of contemporary physicists), using ideas about photon absorption taken directly from Feynman. But, once again, what does this have to do with computers? Quite a lot actually. In his introduction to Collective Electrodynamics Mead offers an anecdote from the early 1960s which includes these remarks:

"My work on [electron] tunnelling was being recognized, and Gordon Moore (then at Fairchild), asked me whether tunnelling would be a major limitation on how small we could make transistors in an integrated circuit. That question took me on a detour that was to last nearly 30 years, but it also lead me into another collaboration with Feynman, this time on the subject of computation." Mead presented their preliminary results in a 1968 lecture and "As I prepared for this event, I began to have serious doubts about my sanity. My calculations were telling me that, contrary to all the current lore in the field, we could scale down the technology such that *everything got better*" In fact by 1971 Mead and Feynman were predicting Moore's Law, from considerations of basic quantum physics.

Now utopian predictions about the potential power of quantum computers are the flavour of this decade, but it's less widely appreciated that our humble PCs already depend upon quantum physics: QED, and its sister discipline QCD (quantum chromodynamics), underlie all of physics, all of chemistry, and actually all of everything. The band gap of silicon that makes it a semiconductor and enables chips to work is already a quantum phenomenon. The first three of Feynman's lectures in "QED" are mostly about photons, but his last chapter touches upon "the rest of physics", including Pauli's Exclusion Principle. Electrons are such touchy creatures that at most two of opposite spins can ever occupy the same state, a seemingly abstract principle which determines the ways that atoms can combine, that's to say, all of chemistry, cosmology, even biology. It's why stone is hard and water is wet. Stars, planets, slugs, snails, puppy dogs' tails, all here thanks to the Exclusion Principle, which is therefore as good a candidate as any for the bounteous creative principle in my little secular theology. Its dark sworn enemy is of course the 2nd Law of Thermodynamics: in the end entropy or chaos must always prevail.

It seems I've reinvented a polytheistic, materialistic version of Zoroastrianism, a Persian religion from around 600 BC. At the centre of my theology stands cloud-capped Mount Nikon, its slopes teeming with great minds who advanced human understanding like Aristotle, Spinoza and Nietzsche, with ingenious scientists like Einstein and Feynman, and lower down with talented crazies who gave us beautiful toys, like Steve Jobs.

Tuesday, 3 July 2012

VOICE OF MULTITUDES

Dick Pountain/PC Pro/Idealog 208: 15/11/2011

It can't have escaped regular readers of this column that I'm deeply sceptical about several much-hyped areas of progress in IT. To pick just a couple of random examples, I've never really been very impressed by voice input technologies, and I'm extremely doubtful about the claims of so-called "strong" Artificial Intelligence, which insists that if we keep on making computers run faster and store more, then one day they'll become as smart as we are. As if that doesn't make me sound grouchy enough, I've been a solid Windows and PC user for 25 years and have never owned an Apple product. So surely I'm not remotely interested in Apple's new Siri voice system for the iPhone 4S? Wrong. On the contrary, I think Siri has an extraordinary potential that goes way beyond the purpose Apple purchased it for, which was to impress people's friends in wine bars the world over and thus sell more iPhones. It's not easy to justify such faith at the moment, because it depends upon a single factor - the size of the iPhone's user base - but I'll have a go.

I've been messing around off and on with speech recognition systems since well before the first version of Dragon Dictate, and for many years I tried to keep up with the research papers. I could ramble on about "Hidden Markov Models" and "power cepstrums" ad nauseam, and was quite excited, for a while, by the stuff that the ill-fated Lernout & Hauspie was working on in the late 1990s. But I never developed any real enthusiasm for the end results: I'm somewhat taciturn by nature, so having to talk to a totally dumb computer screen was something akin to torture for me ("up, up, up, left, no left you *!£*ing moron...").

This highlights a crucial problem for all such systems, namely the *content* of speech. It's hard enough to get a computer to recognise exactly which words you're saying, but even once it has, they won't mean anything to it. Voice recognition is of very little use to ordinary citizens unless it's coupled to natural language understanding, and that's an even harder problem. I've seen plenty of pure voice recognition systems that are extremely effective when given a highly restricted vocabulary of commands - such systems are almost universally employed by the military in warplane and tank cockpits nowadays, and even in some factory machinery. But asking a computer to interpret ordinary human conversations with an unrestricted vocabulary remains a very hard problem indeed.

I've also messed around with natural language systems myself for many years, working first in Turbo Pascal and later in Ruby. I built a framework that embodies Chomskian grammar rules, into which I can plug different vocabularies so that it spews out sentences that are grammatical but totally nonsensical, like god-awful poetry:

    Your son digs and smoothly extracts a gleaming head
          like a squid.
    The boy stinks like a dumb shuttle.
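
To give a flavour of how such a framework hangs together, here's a toy sketch in Ruby (an invented miniature grammar and vocabulary, not my original code): the phrase-structure rules stay fixed, the word lists are pluggable, and the start symbol is expanded recursively at random.

    # Toy phrase-structure generator: fixed grammar rules, pluggable vocabulary,
    # output that is grammatical but nonsensical.
    GRAMMAR = {
      S:  [[:NP, :VP]],
      NP: [[:Det, :N], [:Det, :Adj, :N]],
      VP: [[:V, :NP], [:V, :PP]],
      PP: [[:Prep, :NP]]
    }

    VOCAB = {
      Det:  %w[the your a],
      Adj:  %w[gleaming dumb],
      N:    %w[boy squid shuttle head],
      V:    %w[digs extracts stinks],
      Prep: %w[like]
    }

    def expand(symbol)
      if GRAMMAR.key?(symbol)
        GRAMMAR[symbol].sample.flat_map { |s| expand(s) }  # pick a rule at random
      else
        [VOCAB[symbol].sample]                             # pick a word at random
      end
    end

    puts expand(:S).join(" ").capitalize + "."
    # e.g. "The boy stinks like a dumb shuttle."

Swap in a different vocabulary and the same rules spew out a different brand of doggerel.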

So to recap, in addition to first recognising which words you just said, and then parsing the grammar of your sentence, the computer comes up against a third brick wall, meaning, and meaning is the hardest problem of them all.

However there has been a significant breakthrough on the meaning front during the last year. I'm talking of course about IBM's huge PR coup in having its Watson supercomputer system win the US TV quiz show "Jeopardy" against human competitors, which I discussed here back in Idealog 200. Watson demonstrated how the availability of cheap multi-core CPUs, when combined with software like Hadoop and UIMA capable of interrogating huge distributed databases in real time, can change the rules of the game when it comes to meaning analysis. In the case of the Jeopardy project, that database consisted of all the back issues of the show plus a vast collection of general knowledge stored in the form of web pages. I've said that I'm sceptical of claims for strong AI, that we can program computers to think the way we think - we don't even understand that ourselves, and computers lack our bodies and emotions which are vitally involved in the process - but I'm very impressed by a different approach to AI, namely "case-based reasoning" or CBR.

This basically says: don't try to think like a human; instead look at what a lot of actual humans have said and done, in the form of case studies of solved problems, and then try to extract patterns and rules that will let you solve new instances of the problem. Now to apply a CBR-style approach to understanding everyday human speech would involve collecting a vast database of such speech acts, together with some measure of what they were intended to achieve. But surely collecting such a database would be terribly expensive and time-consuming? What you'd need is some sort of pocketable data terminal that zillions of people carry around with them during their daily rounds, and into which they would frequently speak in order to obtain some specific information. Since millions upon millions of these would be needed, somehow you'd have to persuade the studied population to pay for this terminal themselves, but how on earth could *that* happen? Duh.
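
By way of illustration only, here's a minimal Ruby sketch of the CBR recipe with an invented three-case "database" (nothing remotely like the scale a real system would need): store past utterances alongside the action they turned out to mean, then match a new utterance against the nearest stored case and reuse its solution.

    # Tiny case-based reasoning sketch: reuse the solution attached to the
    # most similar solved case instead of reasoning from first principles.
    CASES = [
      { words: %w[weather today umbrella],  action: :show_forecast },
      { words: %w[book table two tonight],  action: :find_restaurant },
      { words: %w[wake me seven tomorrow],  action: :set_alarm }
    ]

    # crude similarity: how many words a query shares with a stored case
    def similarity(query_words, past_case)
      (query_words & past_case[:words]).size
    end

    def interpret(utterance)
      query = utterance.downcase.scan(/[a-z]+/)
      best  = CASES.max_by { |c| similarity(query, c) }
      best[:action]  # reuse the nearest case's solution
    end

    puts interpret("Do I need an umbrella today?")  # => show_forecast

The interesting work, of course, lies in a similarity measure far better than counting shared words - which is exactly where a mountain of real speech data earns its keep.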

Collecting and analysing huge amounts of speech data is a function of the Siri system, rendered possible by cloud computing and the enormous commercial success of the iPhone, and such analysis is clearly in Apple's own interest because it incrementally improves the accuracy of Siri's recognition and thus gives it a hard-to-match advantage over any rival system. The big question is, could Apple be persuaded or paid to share this goldmine of data with other researchers in order to build a corpus for a more generally available natural language processing service? Perhaps once its current bout of manic patent-trolling subsides a little we might dare to ask...

[Dick Pountain doesn't feel quite so stupid talking to a smartphone as he does to a desktop PC]

OOPS THEY'VE DONE IT AGAIN

Dick Pountain/PC Pro/Idealog 207 16/10/2011

Should there be anyone out there who's been reading me since my very first PC Pro column, I'd like to apologise in advance for revisiting a topic I've covered here no less than four times before (in 1994, 1995, 1997 and 2000). That topic is how Microsoft's OS designers just don't get what an object-oriented user interface (OOUI) ought to look like, and the reason I'm covering it again here is the announcement of the Metro interface for Windows 8, which you'll find described in some detail in Simon Jones' Advanced Office column on pg xxx of this issue. It would appear that, after 17 years, they still don't quite get it, though they're getting pretty close.

A proper OOUI privileges data over applications, so that your computer ceases to be a rat's nest of programs written by people like Microsoft, Adobe and so on and becomes a store of content produced by you: documents, spreadsheets, pictures, videos, tunes, favourites, playlists, whatever. Whenever you locate and select one of these objects, it already knows the sorts of things you might want to do with it, like view it, edit it, play it, and so it invisibly launches the necessary application for you to do that. Metro brings that ability to Windows 8 in the shape of "Tiles" which you select from a tablet's touch screen with your finger, and which cause an app to be launched. The emphasis is still on the app itself (as it has to be, since Microsoft intends to sell these to you from its app store), but it is possible to create "secondary" Tiles that are pinned to the desktop and launch some particular data file, site, feed or stream.
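
The shift is easier to see in code than in prose. Here's a minimal Ruby sketch of the idea (invented names throughout, nothing to do with Metro's actual APIs): the thing you select is a piece of content, and it carries its own verbs around with it.

    # A content object knows which verbs it supports and which application
    # satisfies each one; the user never launches an app directly.
    class ContentObject
      attr_reader :path

      def initialize(path, verbs)
        @path  = path
        @verbs = verbs                # e.g. { view: "viewer", edit: "editor" }
      end

      def invoke(verb)
        app = @verbs.fetch(verb) { raise "#{path} doesn't support '#{verb}'" }
        puts "launching #{app} on #{path}"  # a real OOUI would spawn the app here
      end
    end

    photo = ContentObject.new("holiday.jpg", { view: "viewer", edit: "photo_editor" })
    photo.invoke(:view)  # you think about the photo, not the program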

It's always been possible to do something like this with Windows, in a half-arsed kind of way, and I've been doing so for 15 years now. It's very, very crude because it's wholly dependent upon fragile and ambiguous filename associations - assigning a particular application to open a particular type of file, identified by its name extension. Ever since Windows 95 my desktop has contained little but icons that open particular folders, and clicking on any file within one of these just opens it in Word, Textpad, Excel or whatever. I need no icons for Office applications, Adobe Reader or whatever, because I never launch them directly.

This was actually a horrid step backwards, because under Windows 3.1 I'd grown used to an add-on OOUI called WinTools that was years ahead of the game. Under WinTools desktop icons represented user data objects, which understood a load of built-in behaviours in addition to the app that opened and edited them. You could schedule them, add scripts to them, and have them talk to each other using DDE messages. It featured a huge scrolling virtual desktop, which on looking back bore an uncanny resemblance to the home screens on my Android phone. Regrettably Tool Technology Publishing, the small outfit that developed WinTools, was unable to afford to port it to Windows 95 and it disappeared, but it kept me using Windows 3.1 for two years after everyone else had moved on to 95.

That resemblance to Android is more than just coincidence because hand-held, touch-screen devices have blazed the trail toward today's object-oriented user interfaces. For most people this trend began with Apple's iPhone and iPod Touch, but to give credit where it's due PalmOS pioneered some of the more important aspects of OOUI. For example on the Palm Pilot you never needed to know about or see actual files: whenever you closed an app it automatically saved its content and resumed where you left off next time, a feature now taken for granted as absolutely natural by users of iPads and other tablets.

Actually, though, we've barely started to tap the real potential of OOUIs, and that's why Metro/Windows 8 is quite exciting, given Microsoft's expertise in development tools. Processing your data via intelligent objects implies that they should know how to talk to each other, and how to convert their own formats from one app to another without manual intervention. As Simon Jones reports, the hooks to support this are already in place in Metro through its system of "contracts": objects of different kinds that implement Search, Share, Settings and Picker interfaces can contract to find or exchange data with one another seamlessly, which opens up a friendlier route to create automated workflows.

In his Advanced Windows column last month Jon Honeyball sang the praises of Apple's OSX Automator, which enables data files to detect events like being dropped into a particular folder, and perform some action of your choice when they do so. This ability is built right into the file system itself, a step beyond Windows where that would require VBA scripts embedded within Office documents (for 10 years I've had to use a utility called Macro Express to implement such inter-file automation). Now tablet-style OSes like Metro ought to make possible graphical automation interfaces: simply draw on-screen "wires" from one tile into an app, then into another, and so on to construct a whole workflow to process, for example, all the photographs loaded from a camera. Whoever cracks that in a sufficiently elegant way will make a lot of money.
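
Under the bonnet that wiring needn't be anything more exotic than a pipeline of steps. Here's a rough Ruby sketch (hypothetical step names, not a real Metro or Automator call in sight) of a camera-roll workflow built by chaining tiles together:

    # Each "tile" is a step that takes data and passes it on; drawing a wire
    # between tiles amounts to putting the steps in a list.
    resize    = ->(photos) { photos.map { |p| "#{p} (resized)" } }
    watermark = ->(photos) { photos.map { |p| "#{p} +watermark" } }
    publish   = ->(photos) { photos.each { |p| puts "uploading #{p}" } }

    workflow    = [resize, watermark, publish]  # the "wires", in order
    camera_roll = ["IMG_001.jpg", "IMG_002.jpg"]

    workflow.reduce(camera_roll) { |data, step| step.call(data) }

The hard part, of course, is the elegant on-screen drawing of those wires, not the plumbing underneath.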

FRAGILE WORLD

Dick Pountain/PC Pro/Idealog 206/ 19/09/2011

I'm writing this column in the middle of a huge thunderstorm that possibly marks the end of our smashing Indian Summer in the Morra Valley (I know, sorry). Big storms in these mountains sometimes strike a substation and knock out our mains supply for half a day, but thankfully not this time - without electric power we have no water, which comes from a well via a fully-immersed demand pump. Lightning surges don't fry all the electrical goods in the house thanks to an efficient Siemens super-fast trip-switch, but years ago, before I had broadband, I lost a Thinkpad by leaving its modem plugged in. Lightning hit phone line, big mess, black scorchmarks all the way from the junction box...

Nowadays I have an OK mobile broadband deal from TIM (Telecom Italia Mobile), €24 per month for unlimited internet (plus a bunch of voice and texts I never use), which I don't have to pay when we're not here. It's fast enough to watch BBC live video streams and listen to Spotify, but sometimes it goes "No Service" for a few hours after a bad thunderstorm, as it has tonight. That gives me a sinking feeling in my stomach. I used to get that feeling at the start of every month, until I realised the €24 must be paid on the nail to keep service (and there's no error message that tells you so, just "No Service"). Now I know, and I've set up with my Italian bank to top up via their website - but if I leave it too late I have to try and do that via dial-up. Sinking feeling again.

Of course I use websites to deal with my UK bank, transfer funds to my Italian bank, pay my income tax and my VAT, book airline tickets and on-line check in. A very significant chunk of my life now depends upon having not merely an internet connection, but a broadband internet connection. And in London just as much as in the Umbro-Cortonese. I suspect I'm not alone in being in this condition of dependency. The massive popularity of tablets has seen lots of people using them in place of PCs, but of course an iPad is not much more than a fancy table mat without a 3G or Wi-Fi connection. But then, the internet isn't going to go away is it? Well, er, hopefully not.

After the torrid year of riots, market crashes, uprisings, earthquakes and tsunamis, and near-debt-defaults we've just had, who can say with the same certainty they had 10 years ago that every service we have now will always be available? The only time I ever detected fear in ex-premier Tony Blair's eyes was on the evening news during the 2000 petrol tanker drivers' strike, when it became clear we were just a day or so from finding empty shelves at the supermarket. In Alistair Darling's recent memoirs he comes clean that in 2008 - when he and Gordon were wrestling with the financial crisis precipitated by the collapse of Lehman Brothers - it was at one point possible that the world's ATMs could all go offline the next morning. Try to imagine it. It's not that all your money has gone away (yet), just that you can't get at it. How long would the queues be outside high-street branches, and how long would you be stuck in one? My bank repeatedly offers me a far better interest rate if I switch to an account that's only accessible via the internet, but much as it pains me I always refuse.

Now let's get really scary and start to talk about the Stuxnet Worm and nuclear power stations, Chinese state-sponsored hackers, WikiLeaks and Anonymous and phishing and phone hacking. Is it possible that we haven't thought through the wisdom of permitting our whole lives to become dependent upon networks that no-one can police, and none but a handful of engineers understand or repair? When a landslide blocked the pass between our house and Castiglion Fiorentino a few years back, some men with a bulldozer from the Commune came to clear it, but at a pinch everyone in the upper valley could have pitched in (they all have tractors). Not so with fixing the internet. I might know quite a lot about TCP/IP, but I know bugger-all about cable splicing or the signalling layers of ATM and Frame Relay, or DSLAMs.

What contributes most to the fragility of our brave new connected world though is lack of buffering. Just-in-time manufacturing and distribution mean that no large stocks are kept of most commodities, everyone assuming that there will always be a constant stream from the continuous production line, always a delivery tomorrow. It's efficient, but if it stops you end up with nothing very fast. Our water is like that: shut off mains power and two seconds later, dry. I could fix that by building a water-tank and have the pump keep it full, via a ballcock valve like a big lavatory cistern. Then I could buy a lot of solar panels and a windmill to keep the pump running (plus my laptop). I could even buy a little diesel generator and run it on sunflower oil. I'm not saying I will, but I won't rule it out quite yet...

GRAND THEFT TECHNO

Dick Pountain/PC Pro/Idealog 205/  14/08/2011

Watching CCTV footage of the London riots shot from a high perspective, it was hard not to be reminded of video games like Grand Theft Auto. I don't want to open up that rusting can of worms about whether or not violent games cause imitation - the most I'll say is that these games perhaps provide an aesthetic for violence that happens anyway. The way participants wear their hoods, the way they leap to kick in windows, even the way they run now appears a little choreographed because we've seen the games. But this rather superficial observation underestimates the influence of digital technologies in the riots. The role of Blackberry messaging and Twitter in mustering rioters at their selected targets has been chewed over by the mainstream press ad nauseam, and David Cameron is now rumbling about suspending such services during troubles. This fits in well with my prediction, back in Idealog 197, that governments are now so nervous about the subversive potential of social media that the temptation to control them is becoming irresistible.

The influence of technology goes deeper still. The two categories of goods most looted during the riots were, unsurprisingly, clothes and electronic devices, and the reason isn't too hard to deduce - brands have become a crucial expression of identity to several generations of kids. Danny Kruger, a youth worker and former adviser to David Cameron, put it like this in the Financial Times: "We have a generation of young people reared on cheap luxuries, especially clothes and technology, but further than ever from the sort of wealth that makes them adults. A career, a home of your own - the things that can be ruined by riots - are out of sight. Reared on a diet of Haribo, who is surprised when they ransack the sweetshop?"

The latest phase of the hi-tech revolution makes this gap feel wider still. Neither PCs nor laptops were ever very widely desired: only nerds could set them up and they were barely usable for the exciting stuff like Facebook or 3D games. Steve Jobs and his trusty designer Jonathan Ive, together with Sony and Nintendo, changed that for ever. Electronic toy fetishism really took off with the iPod (which just about every kid in the UK now possesses) but it reached a new peak over the last year with the iPad, ownership of which has quickly become the badge of middle-class status. These riots weren't about relative poverty, nor unemployment, nor police brutality, nor were they just about grabbing some electronic toys for free. They were a rage (tinged with disgust) against exclusion from full membership of a world where helping yourself to public goods - as MPs and bankers are seen to do - is rife, and where you are judged by the number and quality of your toys. They demonstrated a complete collapse of respect for others' property.

I've been arguing for years that the digital economy is a threat to the very concept of property. Property is not a relationship between persons and things but rather a relationship between persons *about* things. This thing is mine, you can't take it, but I might give it, sell it or rent it to you. This relationship only persists so long as most people respect it and those who don't are punished by law. The property relationship originally derives from two sources: from the labour you put into getting or making a thing, and from that thing's *exclusivity* (either I have it or you have it, but not both of us). Things like air and seawater that lack such exclusivity have never so far been made into property, and digital goods, for an entirely different reason, fall into this category. Digital goods lack exclusivity because the cost of reproducing them tends toward zero, so both you and I can indeed possess the same game or MP3 tune, and I can give you a copy without losing my own. The artist who originally created that game or tune must call upon the labour aspect of property to protest that you are depriving them of revenue, but to end users copying feels like a victimless crime, and what's more one for which punishment has proved very difficult indeed.

I find it quite easy to distinguish between digital and real (that is, exclusive) goods, since most digital goods are merely representations of real things. A computer game represents an adventure which in real life might involve you being shot dead. But I wonder whether recent generations of kids brought up with ubiquitous access to the digital world aren't losing this value distinction. I don't believe that violent games automatically inspire violence, but perhaps the whole experience of ripping, torrents and warez, of permanent instant communication with virtual friends, is as much responsible for destroying respect for property as weak parenting is. Those utopians who believe that the net could be the basis of a "gift economy" need to explain precisely how, if all software is going to be free, its authors are going to put real food on real tables in real houses that are really mortgaged. And politicians of all parties are likely to give the police ever more powers to demonstrate that life is not a computer game.

[Dick Pountain is writing a game about a little moustachioed Italian who steals zucchini from his neighbour's garden, called "Grand Theft Orto"]

UNTANGLED WEB?

Dick Pountain/PC Pro/Idealog 204/14/2011

In my early teens I avidly consumed science-fiction short stories (particularly anthologies edited by Groff Conklin), and one that stuck in my mind was "A Subway Named Moebius", written in 1950 by US astronomer A.J. Deutsch. It concerned the New York subway system, which in some imagined future had been extended and extended until its connectivity exceeded some critical threshold, so that when a new line was opened a train full of passengers disappeared into the fourth dimension where it could be heard circulating but never arrived. The title is an allusion to the Moebius Strip, an object beloved of topologists which is twisted in such a way that it only has a single side.

I've been reminded of this story often in the last few years, as I joined more and more social networks and attempted to optimise the links between them all. My first, and still favourite, is Flickr, to which I've been posting photos for five years now. When I created my own website I put a simple link to my Flickr pix on it, but that didn't feel enough. I soon discovered that Google Sites supports photo galleries, and so placed a feed from my Flickr photostream on a page of my site. Click on one of these pictures and you're whisked across to Flickr.

Then I joined Facebook, and obviously I had to put links to Flickr and to my own site in my profile there. I started my own blog and of course wanted to attract visitors to it, so I started putting its address, along with that of my website, in the signature at the bottom of all my emails. Again that didn't feel like enough, so I scoured the lists of widgets available on Blogger and discovered one that would enable me to put a miniature feed from my blog onto the home page of my website. Visitors could now click on a post and be whisked over to the blog, while blog readers could click a link to go to my website.   

Next along came LibraryThing, a bibliographic site that permits you to list your book collection, share and compare it with other users. You might think this would take months of data entry, but the site is cleverly designed and connected to the librarians' ISBN database, so merely entering "CONRAD DARKNESS" will find all the various editions of Heart of Darkness, and a single click on the right one enters all its details. I put 800+ books up in a couple of free afternoons. It's an interesting site for bookworms, particularly to find out who else owns some little-read tome. I suppose it was inevitable that LibraryThing would do a deal with Facebook, so I could import part of my library list onto my Facebook page too.

I've written a lot here recently about my addiction to Spotify, where I appear to have accumulated 76 public playlists containing over 1000 tracks: several friends are also users and we swap playlists occasionally. But then, you guessed it, Spotify did a deal with Facebook, which made it so easy (just press a button) that I couldn't resist. Now down the right-hand edge of my Spotify window appears a list of all my Facebook friends who are also on Spotify - including esteemed editor Barry Collins - and I can just click one to hear their playlists.

There are now so many different routes to get from each area of online presence to the others that I've completely lost count, and the chains of links often leave me wondering quite where I've ended up. I haven't even mentioned LinkedIn, because it has so far refrained from integrating with Facebook (though of course my profile there has links to my Flickr, blog and websites). And this is just the connectivity between my own various sites: there's a whole extra level of complexity concerning content, because just about every web page I visit offers buttons to share it with friends or to Facebook or wherever.

It's all starting to feel like "A Social Network Named Moebius" and I half expect that one day I'll click a link and be flipped into the fourth dimension, where everything becomes dimly visible as through frosted glass and no-one can hear me shouting. That's why my interest was piqued by Kevin Partner's Online Business column in this issue, where he mentions a service called about.me. This site simply offers you a single splash page (free of charge at the moment) onto which you can place buttons and links to all your various forms of web content, so a visitor to this single page can click onto any of them. Now I only need add "about.me/dick.pountain" to each email instead of a long list of blogs and sites. Easy-to-use tools let you design a reasonably attractive page and offer help submitting it to the search engines - ideally it should become the first hit for your name in Google. Built-in analytical tools keep track of visits, though whether it's increased my traffic I couldn't say - I use the page myself, in preference to a dozen Firefox shortcuts.

[Dick Pountain regrets the name "about.me" has a slightly embarrassing Californian ring to it - but that's enough about him]

PADDED SELL

Dick Pountain/PC Pro/12/06/2011/Idealog 203 

I'm a child of the personal computer revolution, one who got started in this business back in 1980 without any formal qualifications in computing as such. In fact I'd "used" London University's lone Atlas computer back in the mid 1960s, if by "used" you understand handing a pile of raw scintillation counter tapes to a man in a brown lab coat and receiving the processed results as a wodge of fanfold paper a week later. Everyone was starting out from a position of equal ignorance about these new toys, so it was all a bit like a Wild West land rush.

When Dennis Publishing (or H.Bunch Associates as it was then called) first acquired Personal Computer World magazine, I staked out my claim by writing a column on programmable calculators, which in those days were as personal as you could get, because like today's smartphones they fitted into your shirt-pocket. They were somewhat less powerful though: the Casio FX502 had a stonking 256 *bytes* of memory but I still managed to publish a noughts-and-crosses program for it that played a perfect game.

The Apple II and numerous hobbyist machines from Atari, Dragon, Exidy, Sinclair and others came and went, but eventually the CP/M operating system, soon followed by the IBM PC, enabled personal computers to penetrate the business market. There ensued a couple of decades of grim warfare during which the fleet-footed PC guerrilla army gradually drove back the medieval knights-in-armour of the mainframe and minicomputer market, to create today's world of networked business PC computing. And throughout this struggle the basic ideology of the personal computing revolution could be simply expressed as "at least one CPU per user". The days of sharing one hugely-expensive CPU were over and nowadays many of us are running two or more cores each, even on some of the latest phones.

Focussing on the processor was perhaps inevitable because the CPU is a PC's "brain", and we're all besotted by the brainpower at our personal disposal. Nevertheless storage is equally important, perhaps even more so, for the conduct of personal information processing. Throughout the 30 years I've been in this business I've always kept my own data, stored locally on a disk drive that I own. It's a mixed blessing to say the least, and I've lost count of how many of these columns I've devoted to backup strategies, how many hours I've spent messing with backup configurations, how many CDs and DVDs of valuable data are still scattered among my bookshelves and friends' homes. As a result I've never lost any serious amount of data, but the effort has coloured my computing experience a grisly shade of paranoid puce. In fact the whole fraught business of running Windows - image backups, restore disks, reinstalling applications - could be painted in a similar dismal hue. 

In a recent column I confessed that nowadays I entrust my contacts and diary to Google's cloud, and that I'm impressed by the ease of installation and maintenance of Android apps. Messrs Honeyball and Cassidy regularly cover developments in virtualisation, cloud computing and centralised deployment and management that all conspire to reduce the neurotic burden of personal computing. But even with such technological progress it remains both possible and desirable to maintain your own local copy of your own data, and I still practise this by ticking the offline option wherever it's available. It may feel as though Google will be here forever, but you know that *nothing* is forever.

Sharing data between local and cloud storage forces you sharply up against licensing and IP (intellectual property) issues. Do you actually own applications, music and other data you download, even when you've paid for them? Most software EULAs say "no, you don't, you're just renting". The logic of 21st-century capitalism decrees IP to be the most valuable kind of asset (hence all that patent trolling) and the way to maximise profits is to rent your IP rather than sell it - charge by "pay-per-view" for both content and executables. But, despite Microsoft's byzantine licensing experiments, that isn't enforceable so long as people have real local storage because it's hard to grab stuff back from people's hard drives.

Enter Steve Jobs stage left, bearing iPad and wearing forked tail and horns. Following the recent launch of iCloud, iPad owners no longer need to own either a Mac or a Windows PC to sync their music and apps with iTunes. Microsoft is already under great pressure from Apple's phenomenal tablet success, and might just decide to go the same way by allowing Windows phones and tablets to sync directly to the cloud. In that case sales of consumer PCs and laptops are destined to fall, and with them volume hard disk manufacture. The big three disk makers have sidestepped every prediction of their demise for 20 years, but this time it might really be the beginning of the end. Maybe it will take five or ten years, but a world equipped only with flash-memory tablets syncing straight to cloud servers is a world that's ripe for a pay-per-view counter-revolution. Don't say you haven't been warned.

[Dick Pountain can still remember when all his data would fit onto three 5.25" floppy disks]
