Dick Pountain/PC Pro/12/06/2011/Idealog 203
I'm a child of the personal computer revolution, one who got started in this business back in 1980 without any formal qualifications in computing as such. In fact I'd "used" London University's lone Atlas computer back in the mid 1960s, if by "used" you understand handing a pile of raw scintillation counter tapes to a man in a brown lab coat and receiving the processed results as a wodge of fanfold paper a week later. Everyone was starting out from a position of equal ignorance about these new toys, so it was all a bit like a Wild West land rush.
When Dennis Publishing (or H.Bunch Associates as it was then called) first acquired Personal Computer World magazine, I staked out my claim by writing a column on programmable calculators, which in those days were as personal as you could get, because like today's smartphones they fitted into your shirt-pocket. They were somewhat less powerful though: the Casio FX502 had a stonking 256 *bytes* of memory but I still managed to publish a noughts-and-crosses program for it that played a perfect game.
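The FX502 program itself is long gone, and its 256 bytes of keystroke code are nothing like the sketch below; purely as an illustration of what "played a perfect game" means, here is a minimal minimax player in Python (all the names here are mine, not from the column):

```python
# Illustrative only: a perfect noughts-and-crosses player via minimax.
# The board is a list of 9 cells, ' ' for empty, 'X' or 'O' otherwise.

def winner(b):
    """Return 'X' or 'O' if that side has three in a row, else None."""
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for i, j, k in lines:
        if b[i] != ' ' and b[i] == b[j] == b[k]:
            return b[i]
    return None

def minimax(b, player):
    """Return (score, move) from `player`'s view: +1 win, 0 draw, -1 loss."""
    w = winner(b)
    if w:
        # The previous player just completed a line, so it is a loss
        # for the side now to move.
        return (1 if w == player else -1), None
    moves = [i for i, c in enumerate(b) if c == ' ']
    if not moves:
        return 0, None  # board full, no winner: a draw
    other = 'O' if player == 'X' else 'X'
    best = (-2, None)
    for m in moves:
        b[m] = player
        score, _ = minimax(b, other)  # opponent's best reply
        b[m] = ' '
        if -score > best[0]:
            best = (-score, m)
    return best

board = list(' ' * 9)
score, move = minimax(board, 'X')
print(score, move)  # with perfect play from an empty board the game is drawn
```

Run from an empty board the search confirms the familiar result: perfect play on both sides yields a draw (score 0).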
The Apple II and numerous hobbyist machines from Atari, Dragon, Exidy, Sinclair and others came and went, but eventually the CP/M operating system, soon followed by the IBM PC, enabled personal computers to penetrate the business market. There ensued a couple of decades of grim warfare during which the fleet-footed PC guerrilla army gradually drove back the medieval knights-in-armour of the mainframe and minicomputer market, to create today's world of networked business PC computing. And throughout this struggle the basic ideology of the personal computing revolution could be simply expressed as "at least one CPU per user". The days of sharing one hugely expensive CPU were over, and nowadays many of us run two or more cores each, even on some of the latest phones.
Focussing on the processor was perhaps inevitable because the CPU is a PC's "brain", and we're all besotted by the brainpower at our personal disposal. Nevertheless storage is equally important, perhaps even more so, for the conduct of personal information processing. Throughout the 30 years I've been in this business I've always kept my own data, stored locally on a disk drive that I own. It's a mixed blessing to say the least, and I've lost count of how many of these columns I've devoted to backup strategies, how many hours I've spent messing with backup configurations, how many CDs and DVDs of valuable data are still scattered among my bookshelves and friends' homes. As a result I've never lost any serious amount of data, but the effort has coloured my computing experience a grisly shade of paranoid puce. In fact the whole fraught business of running Windows - image backups, restore disks, reinstalling applications - could be painted in a similar dismal hue.
In a recent column I confessed that nowadays I entrust my contacts and diary to Google's cloud, and that I'm impressed by the ease of installation and maintenance of Android apps. Messrs Honeyball and Cassidy regularly cover developments in virtualisation, cloud computing and centralised deployment and management that all conspire to reduce the neurotic burden of personal computing. But even with such technological progress it remains both possible and desirable to maintain your own local copy of your own data, and I still practise this by ticking the offline option wherever it's available. It may feel as though Google will be here forever, but you know that *nothing* is forever.
Sharing data between local and cloud storage forces you sharply up against licensing and IP (intellectual property) issues. Do you actually own applications, music and other data you download, even when you've paid for them? Most software EULAs say "no, you don't, you're just renting". The logic of 21st-century capitalism decrees IP to be the most valuable kind of asset (hence all that patent trolling) and the way to maximise profits is to rent your IP rather than sell it - charge by "pay-per-view" for both content and executables. But, despite Microsoft's byzantine licensing experiments, that isn't enforceable so long as people have real local storage because it's hard to grab stuff back from people's hard drives.
Enter Steve Jobs stage left, bearing iPad and wearing forked tail and horns. Following the recent launch of iCloud, iPad owners no longer need to own either a Mac or a Windows PC to sync their music and apps with iTunes. Microsoft is already under great pressure from Apple's phenomenal tablet success, and might just decide to go the same way by allowing Windows phones and tablets to sync directly to the cloud. In that case sales of consumer PCs and laptops are destined to fall, and with them volume hard disk manufacture. The big three disk makers have sidestepped every prediction of their demise for 20 years, but this time it might really be the beginning of the end. Maybe it will take five or ten years, but a world equipped only with flash-memory tablets syncing straight to cloud servers is a world that's ripe for a pay-per-view counter-revolution. Don't say you haven't been warned.
[Dick Pountain can still remember when all his data would fit onto three 5.25" floppy disks]
My columns for PC Pro magazine, posted here six months in arrears for copyright reasons
Tuesday, 3 July 2012