Dick Pountain/PC Pro/Idealog 219 07/10/2012
A wave of nostalgia for old-school personal computing is sweeping the Web: wherever you look, people are restoring, emulating or reminiscing about Sinclair Spectrums, Jupiter Aces, Dragons, Orics and many more. I'm not too proud to jump on a passing bandwagon, so here's my own reminiscence. I bought my first "computer" - a Casio fx-201P, one of the very first programmable calculators - around 1977. It had a 10-digit green fluorescent display and 127 steps of program memory (no editor: make a mistake, enter them all again). I programmed it to check printers' invoices for magazines with varying numbers of pages, sections and amounts of colour, which impressed the printers, who had no such assistance themselves. When Dennis Publishing acquired Personal Computer World in 1979, it earned me a column called "Calculator Corner", of which this column could be seen as a direct descendant.
That Casio was a fairly chunky beast at 7 x 4 x 1.5 inches and it weighed 13oz, which hardly mattered since it remained on my desk. By the oddest of coincidences my latest "computer", a Google Nexus 7, is around the same size at 7.8 x 4.7 x 0.4 inches and weighs 12oz, though it contains around 128 million times more memory and runs over 100,000 times faster. Oh, and instead of 10 green glowing digits its display shows streaming movies and TV. Had cars advanced at a similar rate then just one could carry the whole population of England to the moon in under two minutes. The German philosopher Hegel famously claimed that sufficient quantitative change leads eventually to qualitative change, and that's what I'm feeling right now about my Nexus: tablet computers are poised to change the game.
During these years between the Casio and the Nexus I've spent a lot of energy pursuing a particular idea of computing, deeply influenced by Alan Kay's notion of the "Dynabook", his universal portable information store. I've tried and abandoned several drawers-ful of pocketable computing devices, looking for one that would sync transparently to a desktop computer. I've rejected a desktop altogether in favour of a powerful laptop. I'd reached the point, as described in last month's column, where I can use my Android smartphone to share files with my laptop via Dropbox, but it was all still a bit fiddly with data entry via the phone too slow and a screen rather too small for viewing complex websites. Of course I expected a tablet to improve things in both those respects, but I didn't anticipate by quite how much.
Being an Android device, the Nexus immediately grabbed all my contacts, calendar and mail from Google's cloud with no effort, and I soon had all my preferred apps (in their latest Jelly Bean versions) installed. On installing File Manager HD I noticed a new menu option called LAN Connection, and despite my acute networkophobia I tapped the Scan icon to see what would happen. After an agonising delay it came back with a connection to "USER-PC" - my laptop! It took me a further afternoon of wading through Microsoft's grim network model - what is a Homegroup and how is it different from a Workgroup? - but eventually I got everything I wanted shared. No need to duplicate any music, videos or documents onto the tablet: I just access them over Wi-Fi.
You have to understand that I work at home, where I'm either sitting at my desk upstairs in front of my laptop or on the sofa downstairs reading books and making notes - now I may have to adopt some vigorous exercise regime to replace all the stair-climbing I no longer need to do. I rarely work away from home so the Nexus's lack of 3G isn't critical, and in any case I wouldn't want to pay for another SIM - but then I discovered that the "Tethering & Portable Hotspot" setting on my phone actually works. BT Fon already gives me free Wi-Fi throughout much of London, and where it doesn't I can Google and Wikipede via my phone's hotspot.
So far I've never been even slightly tempted by any of the Home Server or Media PC offerings, but now I'm beginning to see the possibility of a different sort of animal: a tiny Linux box containing a 1TB disk and a Wi-Fi router, with no display or keyboard. All it does is locally store data from PCs, tablets and phones over Wi-Fi, while continually backing itself up to Dropbox or some other cloud service. No shared media streaming - do all that at the client end (I have Spotify even on my phone). And I've started to feel wallet palpitations of almost Honeyballian intensity whenever I see ads for the Asus Transformer...
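If I ever do build such a box, the "continually backing itself up" part needs nothing more exotic than the sketch below: mirror whatever the devices have dumped locally into a folder that the Dropbox client is already syncing. The paths and the five-minute interval are pure assumption on my part, and a real box would more likely lean on rsync or inotify, but it shows how little machinery the idea actually needs.

```python
# Sketch only: mirror the box's local data store into a Dropbox-synced folder.
# SOURCE, MIRROR and the polling interval are illustrative assumptions.
import shutil
import time
from pathlib import Path

SOURCE = Path("/srv/data")                       # where PCs, tablets and phones save files
MIRROR = Path.home() / "Dropbox" / "box-backup"  # folder the Dropbox daemon already syncs

def mirror_once() -> None:
    for src in SOURCE.rglob("*"):
        if not src.is_file():
            continue
        dst = MIRROR / src.relative_to(SOURCE)
        # Copy only new or newer files so the cloud sync stays cheap.
        if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)

if __name__ == "__main__":
    while True:
        mirror_once()
        time.sleep(300)
```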
NOTA BENE
Dick Pountain/PC Pro/Idealog 218 06/09/2012
It sometimes feels as though I've been taking notes all my life. Certainly I was already doing it in school, and in university lectures, when in combination with my photographic memory it was a great advantage in exams: I could just conjure up the page of my notebook where an answer lay. (That memory is now fading, but luckily for me computers are improving at more or less the same rate.) Right from the start, personal computing for me meant trying to find some practical way to take notes on the damned things. Of course for a writer finding a decent writing tool was the first priority, but that proved nowhere near so hard to fulfil. For each successive OS since CP/M 2.2 I quickly discovered a word processor or editor that would serve me well for years - WordStar, PC-Write, TextPad, Microsoft Word - but for each OS I also wasted hours trying and rejecting inadequate candidates for the role of note-taker. The top drawer of my grey filing cabinet testifies to my failure, because it's half full of spiral-bound reporter's pads containing 20 years' worth of pencil scribblings.
It wasn't until the first Palm Pilot came out in 1996 that things looked up a bit. A crucial attribute of any note-taking system is portability, because ideas pop into my head at all times and places, even in bed at night, and having to plod to a desktop computer to record them is a total no-no. Palm got me to a point where I could be sitting anywhere, perhaps reading a book, with a Pilot at my elbow to scribble notes using Graffiti handwriting, and have them transfer to my desktop PC whenever I synced. Soon I discovered Natara's Bonsai, a neat outliner that ran on both PC and Palm, and no fewer than 125 of these Idealog columns were planned in that program. The fact that Bonsai lasted me ten years proves it was workable, but it still wasn't ideal: it couldn't handle pictures or diagrams, and folding editors actually aren't, contrary to what you might expect, that much help on a tiny handheld screen. And Palm's syncing worked, but only so long as you remembered to do it...
After Palm went under I moved over to an Android phone, which opened up whole new cloudy vistas. Bonsai never made the leap and stuck with Windows Mobile, but there are dozens of Android outliner apps and I've tried most of them. Many of the free ones work well but have neither a Windows sync client nor cloud storage. Then there are big beasts like Zotero, Evernote and SimpleNote that offer both cloud service and PC sync. I decided to try the free version of Evernote and was very excited for a while. It's a whole ecosystem, with add-ons for drawing sketches and clipping web pages, and it has an attractive user interface. Notes handwritten on my phone (using the marvellous Graffiti Pro app) just appear on my laptop without effort. Until one day the Evernote Windows client just vanished from my PC, without a trace. I hasten to add that no notes were lost - they're all still there in my account on Evernote's website - but it disconcerted me when the same happened again weeks after I reinstalled it. The cloud is mighty powerful, and this ability to remove things from my PC without asking has quite blunted my enthusiasm for the product.
It was around then that PC Pro adopted Dropbox to deliver Real World Computing copy, and the penny dropped: I could roll my own cloudy note-taking solution using the excellent Dropbox client for Android. Just create a directory tree called Notes in the Dropbox folder and bung all the text, pictures, spreadsheets - whatever - relating to a project into the same subdirectory. Stick to a few file formats like text, JPG, docx and xlsx (I have Documents To Go on my phone). And TextPad lets me drag web URLs straight from Firefox into a note and access them by right-clicking. Sorted.
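In other words the whole "system" is just a folder convention, which a few illustrative lines of Python can capture - the paths and project names below are my own examples, not a prescription.

```python
# Sketch of the Notes-in-Dropbox convention: one subdirectory per project,
# any file type dropped straight into it. Paths and names are illustrative.
from pathlib import Path
import shutil

NOTES = Path.home() / "Dropbox" / "Notes"

def file_note(note_path: str, project: str) -> Path:
    """Move a note (text, JPG, docx, xlsx, whatever) into its project folder."""
    dest_dir = NOTES / project
    dest_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.move(note_path, str(dest_dir / Path(note_path).name)))

# e.g. file_note("scribble.jpg", "idealog-218")
```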
And what, I hear you mutter, about Microsoft's OneNote? Well, whenever Simon Jones has demonstrated it to me on his rarer-than-hens'-teeth Samsung Slate PC I've been bowled over by its unique, industry-beating capabilities. But there's the rub: like almost everyone else I never bought a Windows Tablet or Slate PC, and Microsoft never provided me a copy with any version of Office I've had. In fact, so effectively have they kept this killer app away from the public that they ought to be put in charge of Hantavirus quarantine. Now Redmond is betting the farm on Windows 8, and my advice would be: make your Surfaces (or whatever they're called this week) into dynamite OneNote engines, and let them communicate easily with your competitors' devices - the iPad currently has nothing to touch it for note-taking.
SMART SABOTAGE
Dick Pountain/PC Pro/Idealog 217 10/08/2012
Turning information into money has become a most pressing problem. Much of our information torrent still arrives free of charge, but its owners are desperate to find ways to get us to pay for it. This desperation manifests itself in many ways, some of which don't appear connected with information at all. For example, those vicious legal spats between Apple and Samsung over patents are ultimately about the ownership of information: Apple's claimed ownership of the idea of touch-based tablet computers, expressed in the form of patents on various specific features.
Or again, Facebook's ever-more-convoluted user interface is all aimed at extracting more information about your likes and habits, so as to sell them to advertisers. The arena of telecommunications is a vast battlefield on which two information wars are raging simultaneously: the last war, not quite over yet, between the old land-line-based giant telcos and the new mobile phone operators for your voice-call traffic; and a new war, hotting up, between those mobile operators and IT companies like Apple and Google who seek to grab all their data traffic via paid-for smartphone and tablet apps. According to research outfit Ovum, SMS text traffic is worth around 150 *billion* dollars each year to the mobile operators, but over a third of iPhone users are already switching to IP-based messaging services like Pinger. Like the old telcos, the mobile ops risk being reduced to merely owning the pipes through which other people's profitable content flows.
Apple is currently having a rather good war, having ruthlessly preserved a proprietary grip over its own hardware ecosystem and exploited this to get users paying for apps and content through online stores. Its carpet bombing of Adobe's Flash - by excluding it from the iPad - is a tactical victory, damming off one whole stream of free content from the internet. However, one battle isn't the whole war. In last month's column I described my experience of using an iPad on 3G in Italy, but what I didn't mention was that the deal we get for it is grossly inferior to that for my Windows laptop (€24 per month for 10GB, as opposed to €19 per month unlimited). I grumbled to the chap in the TIM shop, but that's their only deal for iPads: they presumably analysed the traffic profile of iPad-owning users and decided that's the only way to profit from them.
Bandwidth is a valuable commodity, and extracting maximum profit from such commodities is a science that involves some strange paradoxes. You might think the most profit could be obtained by producing the largest amount of a highly desirable commodity, but that's far from the truth. Pricing is everything, and often maximum profit is achieved by restricting supply to raise the price. The archetypal case is of course De Beers' rigid control over the world supply of diamonds, but the oil industry is a pretty good example too. For most of the 20th century there was always a large surplus of oil reserves, but oil companies wouldn't pump so much as to lower the price too far: Daniel Yergin's massive tome "The Prize: The Epic Quest for Oil, Money & Power" describes in entertaining detail the hoops they jumped through to prevent new fields being exploited by rivals. Contrary to much free-market dogma, companies don't enjoy price competition, and very large companies will go to great (sometimes too great) expense to avoid or subvert it.
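A toy sum makes the restriction point concrete. Assume a linear demand curve where each extra unit offered knocks the price down a little, and a flat unit cost - every figure here is invented purely for illustration.

```python
# Illustration only: profit under an assumed linear demand curve p = 100 - q
# and a constant unit cost of 20. All numbers are invented for the example.
def profit(quantity: float, unit_cost: float = 20.0) -> float:
    price = 100.0 - quantity        # assumed demand: price falls as supply rises
    return (price - unit_cost) * quantity

if __name__ == "__main__":
    for q in (20, 40, 60, 80):
        print(f"sell {q:2d} units -> profit {profit(q):6.1f}")
    # Profit peaks at 40 units, well short of the 80 that would drive the
    # price down to cost: the most profitable strategy is to withhold output.
```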
The great theorist of such pricing policies was the eccentric Norwegian/American economist Thorstein Veblen, the man who gave us the term "conspicuous consumption". He delighted in provoking with mocking and ironic terminology, and the term he chose for this case was "sabotage". A partisan blowing up a railway line, a striking worker dropping a spanner in his machine, an ISP throttling your internet feed, they're all doing the same thing: deliberately reducing production to achieve certain ends. Veblen defined sabotage as "a conscientious withdrawal of efficiency" and did not regard it as a pejorative term: on the contrary "the common welfare in any community which is organized on the price system cannot be maintained without a salutary use of sabotage... such restriction of output as will maintain prices at a reasonably profitable level and so guard against business depression". Our current economic crisis is largely one of overproduction, for example of cheap mortgages. The alternative to sabotage then is to do away with prices and make everything free, but that invites massive overconsumption (the so-called "tragedy of the commons"). Eventually things would have to be rationed by other means, often not nice ones. That it's better to pay people enough to buy stuff ought to be obvious, but seems to escape politicians and employers alike.
SMELLING THE COFFEE
Dick Pountain/PC Pro/Idealog 216 09/07/2012
I'm writing this column at my desk in Italy, on a balmy evening, watching fireflies drift in and out among the vines. (I thought I'd see, as an experiment, whether I could type that sentence without smirking, but the reflection in my window pane says I've failed...) Seriously though, putting all climatological disparities aside, our existence here is a remarkable testament to advances in comms technology over the last few years. Communication with the rest of the world happens through a Telecom Italia Mobile mast on the mountain opposite, via which my laptop and my partner Marion's iPad are connected on pay-as-you-go mobile data plans. These are now fast enough to watch live streamed television, listen to music and radio, without recourse to the huge and expensive satellite dishes that were required just a couple of years ago. And my ALICE package gives me unlimited data for €19 per month.
Being here has given me the opportunity to get to grips with the iPad, and hence caused me to oscillate wildly between being very impressed indeed and hair-tearing frustration. The latter state is almost always induced either by lack of documentation, or by Apple's smug assumption that everyone buys into its total ecosystem, which I most certainly do not. A most egregious example of the latter concerned Marion's contacts information, which it fell to me to transfer from her netbook back in London into the iPad's Contacts. For purely historical reasons these have been kept for many years in Palm Desktop rather than Outlook (and she's quite happy with its facilities). When the iPad first arrived I realised there wasn't going to be any direct way to export addresses to it, because its Contacts book appears to lack any menu for importing stuff. So I opened a Gmail account for her and successfully exported them all into that via a CSV file, believing the job done. Fat chance. The Apple Contacts book of course lacks any mechanism for importing from the enemy Gmail either. After hours of footling around I gave up and suggested she log onto Gmail via Safari to see her contacts.
Months passed and a friend loaned us a book called "iPad 2: The Missing Manual", which solved many puzzles, like how to recover when you've accidentally locked the screen orientation into portrait. One chapter began with the soothing words "Putting a copy of your contacts file onto your iPad is easy" and suggested using either iCloud or iTunes. Like an idiot I decided iTunes would be easier, since it was already installed there on the iPad's home screen. More head-scratching followed because iTunes would do nothing but offer to sell me David Guetta albums.
I'm embarrassed to tell you how many hours it took me to realise that iTunes has to be installed on another computer (assumed to be a Mac) rather than on the iPad for this task. I flirted briefly with the idea of polluting my Vaio with the Apple software, but fortunately took the precaution of Googling "uninstall iTunes from Windows 7" before committing: dozens of horror stories about how much junk it leaves behind cured me completely of that impulse. I Googled some more and then suddenly the scales were removed from my eyes by a sane and crystal-clear blog called "Apple iPad Tablet Help". The answer: use iCloud, stupid!
It took about ten minutes once that penny dropped. Pull up Gmail on my Vaio; log out as me and log back in as Marion; export her Gmail contacts to a vCard file on the Vaio; go to www.icloud.com and log in with Marion's Apple ID; drag the downloaded .VCF file onto the Contacts icon in the iCloud window. During this whole procedure the iPad remained lying on a table in the other room and no cables were involved. I fetched the iPad and opened Contacts, where to my disappointment were just those three entries I'd added manually months before. But before I could even muster a curse, up popped another, then another, and in they all streamed over the airwaves, all 1200+ of them, in less than a minute. The moral of the story for me is that The Cloud just works: feeble documentation, different OSes, squabbles between Apple and Google, smug assumptions that iPad owners have a Mac too, all just melted away in the universality of HTTP and the internet.
This was the same month we finally decided to adopt Dropbox in place of Dennis Publishing's own server to transport Real World copy, and so far it's proving more convenient and reliable for all concerned. The Cloud just works. (Of course, being a cynic/paranoiac, I download it all to archive on my local machine too.) For extra cloudiness I've also recently built an archive of all my previous Idealog columns from 1994 to 2012 in Blogger, where you can read them on-screen in a nice convenient format. As I was uploading one from August 1996 its headline, Wake Me When It All Works, caught my eye. In it I complained: "Somewhere along the line everyone seems to have forgotten once again that simplifying means actually removing stuff, not just hiding it on the nineteenth tab of some dialog". I believe I can smell the coffee...
[Dick Pountain's back issues of this Idealog column are now readable on http://www.dickpountain-idealog.blogspot.it/]
LOOKIN' GOOD!
Dick Pountain/PC Pro/Idealog 215 07/06/2012
I just upgraded my mobile phone from a ZTE San Francisco to a ZTE San Francisco 2 and it's a significant improvement. It has a faster processor, a far better camera with flash, and it runs a later version of Android. But most important of all, it doesn't have those two chrome strips down either side. "Er, is he going soft in the head?" you may be thinking. Well no, I don't actually give a damn about those chrome strips, but most of the online reviews of this phone I've read mentioned them in their first paragraph. It appears (geddit) that people are becoming as obsessed with the appearance of their gadgets as they already are with their haircuts, clothes, cars and sofas. And it's not only physical gadgets like phones but also software interfaces. I'd love to write a smug, judgmental column arguing that everyone else is obsessed with appearances whereas I just don't care about such trivia, only the deepest essences of things. I'd love to, but in all honesty I can't, because I'm no better than anyone else in this respect. I don't give a damn about those particular chrome strips, but I'm fanatical about software user interfaces and have dumped many perfectly functional utilities because I couldn't stand the cut (or colour scheme) of their jib.
This phenomenon is not of course confined to the IT business. Reflect for a moment on the explosion of interest in all areas of design - from consumer goods to architecture and engineering - over the last three decades or so. Apple design guru Jonathan Ive was just knighted, while architects nowadays have the celebrity status of movie stars. This is a real social phenomenon, and it's of far more than just sociological interest because its economic consequences are profound. How many new car models have failed because a consensus emerged that they looked awful? (And I don't JUST mean the Sinclair C5 - that's a lazy choice: there's also the BL Mini Metro, especially in that unique poo-brown paintjob.) The plain fact is that everyone's a critic and aesthete nowadays, with major consequences for industries (both consumer and technical) that can hardly be overestimated. If you produce something that potential users find ugly you're in big trouble, and in areas like computer or phone operating systems, where development budgets run into the billions, that can matter a great deal. Which explains the almost comically paranoid behaviour of certain big IT companies, because some of the design decisions involved are now too big for mere mortals to make without going a little bit mad.
Two of these terrible quandaries are examined by RWC columnists in this very issue. Jon Honeyball writes about Microsoft's dithering over the look-and-feel of Windows 8, which is approaching Hamlet-like proportions. Redmond chickened out of incorporating the final look into the Release Candidate build, and Jon suspects this is because they're panicking, still trying out different tie-and-handkerchief combinations on secret focus groups. Locked in a death struggle with Apple's iPad, the stakes are too high to get it wrong, but the decision is too big for anyone's sanity. We do know that they've dumped the "Aero Glass" theme for window borders they so proudly introduced with Windows Vista, describing it as now "dated and cheesy" and certainly not "en vogue". (Interpreted, that means: we're terrified that YOU think it's cheesy, and we want to get our capitulation in before your attack.) Actually I like the Aero look, as indeed I like cheese, but there's a certain grim irony in this situation, because it was Microsoft who started the whole trend 20 years ago, fussing over the look-and-feel of early Windows versions and being first to hire big-bucks graphic designers and usability teams.
Meanwhile in his column Simon Jones describes a user revolt among programmers over the colour scheme in the Visual Studio 11 beta. Its designers decided to remove most colour from its user interface, substituting small indecipherable monochrome icons and menu options in ALL CAPITALS. I'm hardly surprised developers are on the warpath. Programming is the worst area (except perhaps for writing) in which to radically fiddle with user interfaces: those hypnotically repetitive loops of edit, compile, run, edit, compile, run are only made tolerable because you've totally internalised the position of every single button and option, so your fingers run on autopilot without conscious intervention. Upset that rhythm and productivity may be ruined for months until you've internalised the new set. The designers may well have been right that too much colour was distracting, but that doesn't matter once people have adapted to the distraction.
For similar reasons I personally loathe Facebook's imposition of the new Timeline, which depresses me because my profile is now 34 feet long and extends below the floorboards. I've always hated Facebook's interface anyway, but had just about achieved immunity. And the iPad's lack of a hardware back-button still makes me swear ten times a day, another design decision taken for the sake of elegance over utility. (I'll probably get challenged to a duel for saying that.) Judging by appearance is here to stay and manufacturers know it, leaving them with only two choices: either get really good at giving us what we didn't know we wanted, like Ive, or else let us customise to our eyes' content.
UNDER THE OLD WHIFFLETREE
Dick Pountain/PC Pro/Idealog 214 11/05/2012
It all started when I was asked to write a preface for a new book on the history of Dennis Publishing, which required reminiscing about our start in the early 1970s. That triggered memories of the way we put magazines together back then: type the copy on an IBM Selectric "golfball" composer, cut it up into strips with scalpels and stick it down on the page with hot wax. The smell of that hot wax and the machine-gun rattle of the IBM came flooding back.
That prompted me to look up the IBM Selectric on this newfangled Web thingy, where I soon stumbled across a neat little video clip by engineer Bill Hammack (http://www.up-video.com/v/57042,ibm-selectric-typewriter-.html) which shows how that unforgettable sound arose, but more importantly explains that the IBM golfball mechanism contained a fiendishly cunning example of a mechanical digital-to-analog converter. The problem that needed solving was to rotate an almost spherical print-head around two different axes, to position the correct character over the paper - unlike older typewriters, this print-head moved while the paper stood still (as in all modern computer printers, which it foreshadowed). Rotation control involved adding together two digital "signals" - one selecting among the ball's four tilt positions and the other among its 22 rotational positions - which originated as key depressions on the keyboard and were transmitted via cables like those used to change gears on a bicycle. The mechanism that performed this addition went by the glorious name of a "whiffletree" (or whippletree). Now I was hooked.
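In modern terms the selection job is easy to state: pick one of the golfball's 88 characters by choosing a tilt and a rotation. The little sketch below uses a simple row-by-row numbering - my own illustration, not IBM's actual latch coding - but it shows the two "digits" the whiffletree had to add mechanically.

```python
# Illustrative only: decompose a character selection into the golfball's
# tilt and rotate components. The simple divmod numbering is an assumption
# for the sake of the example, not the Selectric's real code assignment.
def select_position(char_index: int) -> tuple[int, int]:
    """Map a character index (0-87) to (tilt, rotate) positions."""
    if not 0 <= char_index < 88:
        raise ValueError("the golfball holds 88 characters")
    tilt, rotate = divmod(char_index, 22)   # 4 tilt bands x 22 columns
    return tilt, rotate

if __name__ == "__main__":
    for c in (0, 21, 22, 87):
        print(c, "->", select_position(c))
```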
Googling for whiffletree produced a total surprise. This mechanism has been known since at least the Middle Ages, perhaps even in the ancient world, as a method for harnessing horses to a plough! It solves the problem of various horses pulling with different strengths by adding together and averaging their pulls on the plough. It's a "tree" in exactly the same way a directory tree is: each *pair* of horses is harnessed to a horizontal wooden bar, then all these bars get connected to a larger bar and so on (a big team might require three levels). The pivot links between bars can be put into one of several holes to "program" the whiffletree's addition sum: if the lead horse is pulling twice as hard as the others, put its pivot at the two-thirds mark. Without a diagram it's hard to convey just how damned elegant this mechanism is.
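For readers who prefer sums to diagrams, the arithmetic a single whiffletree bar performs fits in a few lines - a minimal sketch, treating the bar as rigid and expressing the pivot position as a fraction of its length:

```python
# Minimal sketch of one whiffletree bar. The point where the bar attaches to
# the load moves by a weighted average of the two end displacements (the
# weights set by which hole the pivot sits in), which is how the Selectric
# used it as a mechanical adder. The same geometry balances unequal pulls:
# torques about the pivot cancel when force_a * arm_a == force_b * arm_b.

def pivot_displacement(x_a: float, x_b: float, pivot: float) -> float:
    """Displacement of the load attachment point.

    pivot is its fractional position along the bar (0.0 = end A, 1.0 = end B).
    """
    return (1.0 - pivot) * x_a + pivot * x_b

def balanced_pivot(force_a: float, force_b: float) -> float:
    """Fractional pivot position at which two unequal pulls balance."""
    # Torque balance: force_a * pivot == force_b * (1.0 - pivot)
    return force_b / (force_a + force_b)

if __name__ == "__main__":
    # Lead horse at end A pulling twice as hard: the pivot sits a third of the
    # way from A - i.e. at the two-thirds mark measured from the other end.
    print(balanced_pivot(2.0, 1.0))     # 0.333...
```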
As an aside, at this point I ought to tell you that my first ever brush with computing happened in the sixth form at school in 1961, as part of a team that built an electronic analog computer from RAF-surplus radar components to enter a county prize competition. It could solve sixth-order differential equations in real time (for instance to emulate the swing of a pendulum travelling partially through oil) and we programmed it by plugging cables into a patch panel, like an old-fashioned synthesiser or telephone switchboard.
In thrall to the whiffletree, I wondered where else such ingenious devices have been used, and that led me straight to naval gunnery controllers. Throughout WWII and right up into the 1970s, American warships were fitted with electro-mechanical fire-control systems that worked on a principle not unlike the IBM golfball's. An enemy plane is approaching and your radar is telling you from which direction; keep the anti-aircraft gun pointed in such directions that its stream of shells intercepts the moving plane's path. This problem was solved continuously, in real time, by gears, levers, cables and a few valves.
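Stripped of the gears and valves, the continuous problem those boxes solved can be written down in a few lines. The sketch below assumes a constant-velocity target and a constant shell speed, and ignores gravity and drag (which the real fire-control computers most certainly did not):

```python
# Toy lead-prediction: when and where must a shell fired now, at constant
# speed, meet a target moving at constant velocity? Gravity and drag ignored.
import math

def intercept_time(px, py, vx, vy, shell_speed):
    """Earliest t > 0 with |target position at t| == shell_speed * t, or None."""
    # (px + vx*t)^2 + (py + vy*t)^2 = (shell_speed*t)^2 is a quadratic in t.
    a = vx * vx + vy * vy - shell_speed * shell_speed
    b = 2.0 * (px * vx + py * vy)
    c = px * px + py * py
    if abs(a) < 1e-12:                          # shell exactly as fast as target
        return -c / b if b < 0 else None
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                             # target too fast to intercept
    r = math.sqrt(disc)
    times = [t for t in ((-b - r) / (2.0 * a), (-b + r) / (2.0 * a)) if t > 0]
    return min(times) if times else None

def aim_point(px, py, vx, vy, shell_speed):
    """Point the gun at where the target will be at intercept time."""
    t = intercept_time(px, py, vx, vy, shell_speed)
    return None if t is None else (px + vx * t, py + vy * t)

if __name__ == "__main__":
    print(aim_point(px=1000.0, py=0.0, vx=-50.0, vy=100.0, shell_speed=300.0))
```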
Ever since Alan Turing's seminal 1936 paper we've known that digital computers can imitate anything such mechanical or electrical analog devices can do, but sometimes there's little advantage in doing so. We used to be surrounded by simple analog computers, especially in our cars, and still are to a lesser extent. One that's long gone is the carburettor, which slid needles of varying taper through nozzles to compute a ferociously complex function relating petrol/air ratio to engine load. One that remains is the camshaft, whose varying cam profiles compute a similar function to control valve timing. A less obvious one is the humble windscreen wiper, whose blade is actually attached via a whiffletree to spread the torque from the motor evenly along its length.
Just as my analog nostalgia was starting to wane, I turned on BBC Four last night and watched a documentary about the Antikythera mechanism, an enigmatic bronze device of ancient Greek origin that was found on the sea-bed by sponge divers in 1900. Over fifty years of scientific investigation have revealed that this was a mechanical analog computer, almost certainly designed by Archimedes himself, whose rear face accurately calculated the dates of future solar and lunar eclipses, and whose front face was an animated display of the then-known planets' orbits around the sun. It worked using around 70 hand-cut bronze gears with up to 253 teeth each. We're constantly tempted toward hubris concerning our extraordinary recent advances in digital technology, but once you've allowed for some four hundred years of cumulative advances in chemistry and solid-state physics, it ought to be quite clear that those ancient Greeks possessed every bit as much sheer human ingenuity as we do. And look what happened to them...
PUBLISH AND BE DROWNED
Dick Pountain/PC Pro/Idealog 213: 12/04/2012
A couple of years ago I became quite keen on the idea of publishing my own book on the Web. I got as far as opening PayPal and Google Checkout accounts and setting up a dummy download page on my website to see whether their payment widgets worked. In the end I didn't proceed because I came to realise that though publishing myself minimised costs (no trees need die, no publisher's share taken), the chance of my rather arcane volume becoming visible amid the Babel of the internet also hovered around zero, even if I devoted much of my time to tweaking and twiddling and AdSensing. What's more the internet is so price-resistant that charging even something reasonable like £2 was likely to deter all-comers. But perhaps the real cause of my retreat was that not having a tangible book just felt plain wrong. It's possible I'll try again via the Kindle Store, but I feel no great urgency.
I'm not alone in this lack of enthusiasm: the fact that mainstream book publishers still vastly overcharge for their e-books suggests their commitment is equally tepid (I recently bought Pat Churchland's "Braintrust" in print for £1 less than the Kindle edition). I'm well versed in Information Theory and fully understand that virtual and paper editions have identical information content but, as George Soros reminded us again recently, economics isn't a science and economic actors are not wholly rational. The paper version of a book just is worth more to me than the e-version, both as a reader and as an author. I really don't want to pay more than £1 for an e-book, but I also don't want to write a book that sells for only £1, and that's all there is to it.
As Tom Arah ruefully explains in his Web Applications column this month, the ideal of a Web where everyone becomes their own author is moving further away rather than nearer, as Adobe dumps mobile Flash after Microsoft fails to support it in Windows 8 Metro. It's precisely the sort of contradictory thinking that afflicts me that has helped firms like Apple monopolise Web content by corralling everything through their walled-garden gates. The Web certainly does enable people to post their own works, in much the same way as the Sahara Desert enables people to erect their own statues: what's the use of dragging them across the dunes if no one can find them?
There's always a chance your work will go viral, of course, but only if it's the right sort of work, preferably with a cat in it (in this sense nothing much has changed since Alan Coren's merciless 1976 parody of the paperback market, "Golfing for Cats" - with a swastika on the cover). The truth is that the internet turns out to be a phenomenally efficient way to organise meaningless data, but if you're bothered about meaning or critical judgement it's not nearly so hot (whatever happened to the Semantic Web?). This has nothing to do with taste or intelligence but is a purely structural property of the way links work. All the political blogs I follow display long lists of links to other recommended blogs, but the overlap between these is almost zero and the result is total overload. I regularly contribute to the Guardian's "Comment Is Free" forums but hate that they offer no route for horizontal communication between different articles on related topics. Electronic media invariably create trunk/branch/twig hierarchies where everyone ends up stuck on their own twig.
If the Web has a structural tendency to individualise and atomise, this can be counteracted by institutions like forums and groups that pull humans back together again to share critical judgments. Writing a novel or a poem may best be done alone, but publishing a magazine requires the coordinated efforts and opinions of a whole group of people. A musician *can* now create professional results alone in the back-bedroom, but they might have more fun and play better on a stage, or in a studio, with other people. The success of a site like StumbleUpon shows that people are desperate for anything that can filter and concentrate the stuff they like out of the great flux of nonsense that the Web is becoming. The great virtue of sites like Flickr and SoundCloud is that they offer a platform on which to display your efforts before a selected audience of people with similar interests, who are willing and able to judge your work. Merely connecting people together is not enough.
The billion dollars Facebook just paid for Instagram perhaps doesn't look so outrageous once you understand that it wasn't really technology but savvy and enthusiastic users - the sort Facebook wishes it was creating but isn't because it's too big, too baggy and too unorganised - that it was purchasing. It will be interesting to see how their enthusiasm survives the takeover. The Web is a potent force for democratising and levelling, but it's far from clear yet how far that's compatible with discovering and nurturing unevenly-distributed talent. If publishers have a future at all, it lies in learning to apply such skills as they have in that department to the raging torrent of content.