Dick Pountain/PC Pro/Idealog 227 05/06/2013
I've finally bitten the bullet and published my own Kindle book. I'd been thinking about it for several years, cutting out PC Pro features about converting books into Amazon's MOBI format, but somehow never getting around to actually doing it. Editing Kevin Partner's Real World column "Online Business" over the last few months - where he describes setting up his own experimental Kindle publishing business - finally convinced me it was something I must try, and that it needn't cost a lot of money.
I already had my content in the shape of a short book I've written called "Sampling Reality", which attempts to stitch together recent results in information theory, affective neuroscience (that is, the physiological basis of emotions) and cognitive psychology. As you can probably imagine, it's not a title likely to trouble the best-sellers list overmuch, and so far I'd only made it available via Scribd and my own website in PDF form. That meant that I already had it in a more-or-less publishable format: paginated, with chapter headings and subheads, a table of contents and a properly formatted bibliography. I'd done all that easily enough in Microsoft Word, using just two fonts - Times Roman for body and Arial for headings, since you ask (I'm conservative that way, no hipster Futura or Helvetica). It looked quite nice and was easy to navigate, with working links from the contents page to each chapter.
Book covers are one of Kevin's strongest recommendations: with so much stuff on Kindle you have just a fraction of a second to catch a browsing eye, so make it noticeable. I knocked up something I'm quite happy with, using a montage of my own Flickr photos, and stretched it to fit in Word without needing to resort to any more sophisticated design software. Now I had a PDF with a full-colour cover, and that's what I expected to turn into a Kindle MOBI file.
I already knew, from Kevin and many other sources, that there's only one game in town for doing this conversion, a free program called Calibre written by Kovid Goyal. Calibre is hard-core multi-platform open source which you can compile yourself from GitHub if you're that way inclined - I'm not, and just downloaded the Windows version. It's far more than a file format converter: it's a complete content management system for your e-book collection. Its multi-platform roots show in a colourful GUI that conforms to neither Windows nor Mac guidelines, so the way it works will have you scratching your head at first. I'd already been through that hoop back in 2009, though, when I discovered Calibre for converting public-domain PDF books to read on my Sony PRS-505 Reader.
Converting my PDF produced a total dog's breakfast: pagination well and truly screwed, with chapter headings halfway down pages; subheads indistinguishable from main text; the contents page spread out at one chapter per page, with links that didn't work. Most intriguing of all, every single apostrophe in the book had been replaced by a little empty box. Apart from that it was fine. I hadn't understood before that MOBI only supports one font family per document, although it does permit bold, italic and various sizes. Bye-bye to my sans headings. I generated new PDFs with altered settings to no effect, then decided to dump PDF.
Calibre can't convert DOCX files directly so I tried outputting HTML: that paginated better, but the contents still didn't work and my apos were still atrophied. Tried ODT, not so good. Finally I tried good old RTF and, phew, it all looked good, with subheads even in bold and a working contents list, but still *those bloody apostrophes*. I hit the forums and found one tip, from Kovid himself, which said this happens when a Kindle doesn't have a font to display a particular character. I wasn't using a real Kindle but the Kindle client running on my PC. Kovid's tip suggests setting "transliterate unicode to ascii" in one of Calibre's many config files. That didn't work, but it provoked me into getting medieval on the document's ass. I search-and-replaced every single goddamned apostrophe from Unicode character U+0027 to U+02B9 (a slightly smarter apostrophe), which made the Unicode-to-ASCII conversion work, and I finally had a publishable file that passed Kindle's vetting stage without criticism and was up on Amazon within a day. Check out the result at http://www.amazon.co.uk/-/e/B001HPO2E2. One annoying thing about the way Kindle works is that if you want to edit an already-published book, you have to delete and resubmit, with the accompanying 12-hour delay: there's no interactive editing. So when I noticed that the word "Contents" now occupies page 2 all on its own, I couldn't face fixing it. I will one day, soon, and that's a promise.
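For anyone else who hits the empty-box problem, the brute-force character swap is easy enough to script rather than doing it by hand in Word. Here's a minimal Python sketch, assuming the manuscript has been exported as a UTF-8 HTML file (the filename book.html is purely my placeholder, not anything Calibre requires); it replaces every plain apostrophe (U+0027) with the modifier-letter prime (U+02B9) so that the subsequent Unicode-to-ASCII transliteration behaves:

    # apos_fix.py - swap U+0027 apostrophes for U+02B9 before feeding the file to Calibre
    # Filenames are placeholders; adjust to your own export. Note this blindly replaces
    # every U+0027, including any inside HTML markup - fine for a simple Word export,
    # but sanity-check the output.
    from pathlib import Path

    src = Path("book.html")
    dst = Path("book-fixed.html")

    text = src.read_text(encoding="utf-8")
    count = text.count("\u0027")
    dst.write_text(text.replace("\u0027", "\u02b9"), encoding="utf-8")
    print(f"Replaced {count} apostrophes")

The fixed file can then go through Calibre's GUI, or its ebook-convert command-line tool, exactly as before.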
Saturday, 2 November 2013
BEST OF BRITISH
Dick Pountain/PC Pro/Idealog 226 05/05/2013
A wave of nostalgia for the British home computer industry is upon us. It's mostly driven by games players who were schoolkids during that brief "golden half-decade" between 1980 and 1985, which is hardly surprising given that so few of those UK-designed microcomputers were much use for anything else. Tony Smith has been running a highly entertaining series of memoirs at The Register website about the Sinclair Spectrum, Lynx, Oric, Dragon, Jupiter Ace and more weird and wonderful devices. And emulators, written by selfless enthusiasts, are available to run all your old Spectrum games on a modern PC.
I'm an in-betweener, a crucial decade older than this Spectrum Kid demographic, so my own retro-spectacles are tinted rather less than rosy (indeed, closer to pale blue). To be sure I must thank the home computer boom for my present career, having entered the magazine business on the crest of it, but the machine I actually took home was a Sharp MZ80B running CP/M 3.4, on which I wrote my first book and learned Pascal, Forth and Lisp. To me computers were already serious tools rather than toys. In *my* schooldays I'd helped build an analog computer out of ex-RAF radar parts, and as a biochemistry student in the '60s I'd used London University's solitary Atlas mainframe to process my scintillation counter readings.
A month or so ago I had tea with Andy Hopper, Cambridge Professor of Computer Technology, President of the IET (Institution of Engineering and Technology) and for many years head of the Cambridge Computer Laboratory. We talked about Britain's role in the history of computing, and after our chat Professor Hopper sent me a copy of a lavishly illustrated new book celebrating the first 75 years of the Cambridge Computer Laboratory. I found myself quite transfixed by it, because although I sort-of-knew many of the facts it contains, I'd never assembled the complete story of UK computing in my mind before, and the evocative B&W photos of the principal actors helped too.
The story of course starts in the 1840s with Charles Babbage's ill-fated attempts to build his mechanical Difference Engine - which now has a happy ending thanks to the superb working version on display at the Science Museum. It carries on with Alan Turing, Bletchley Park, Enigma and Colossus, a story now sufficiently familiar to make it onto TV and into Hollywood movies. But there's another, less well-known story running in parallel with these landmarks. What's the working material that all modern computers manipulate? Electrons, as discovered by J.J. Thomson at the Cavendish Laboratory in 1897. In 1926, also at the Cavendish, brilliant radio engineer Charles Eryl Wynn-Williams invented a "scale-of-two" counter for an early radiation detector, which was the prototype of all digital devices.
The concept of the digital computer itself comes from Turing's 1936 paper on Computable Numbers. After WWII, with the triumphs of Bletchley still top-secret, British computer scientists found themselves in a neck-and-neck race with their US equivalents. Though Eckert and Mauchly's 1946 ENIAC is usually credited as the first working electronic digital computer, the first practical stored-program machine was EDSAC, built at the Cambridge computer lab by Maurice Wilkes's team in 1949. Made available to other university departments, its calculations contributed to several Nobel prizes, including Richard Stone's for economics, John Kendrew's for the structure of myoglobin, and Martin Ryle's for radio astronomy. EDSAC's design also pioneered a bunch of crucial innovations still in use, including the subroutine, microcoded instructions and bit-sliced processor architectures.
But we lost the race because British scientists just don't have the Yanks' business acumen, right? Er, no. Wilkes was approached very early, in 1949, by the catering company J Lyons (of the Lyons Corner House cafe chain) to license EDSAC as the model for LEO, the world's first commercially useful business computer. Wilkes was quite adept at technology transfer and used Lyons' money to build EDSAC 2, while Lyons sold three successive LEO models successfully until 1963, when the business was taken over by English Electric (and later merged into ICL).
This story continues through the pioneering 1970s Cambridge Ring network project, which eventually lost out to Ethernet; on to Acorn Computers and the BBC Micro; and culminates with the formation of ARM Ltd as a joint venture with Apple (for the Newton PDA), which eventually saw ARM-designed CPUs driving the iPhone, iPad and most of the world's mobiles. We're often lectured nowadays that Britain's poor overall industrial performance is due to too much public and not enough private enterprise, but what this story shows is that our computing successes share one thing with Silicon Valley's: they were all started by scientists and engineers who understand the product (think Gordon Moore) rather than money men who don't.
GAME OF PHONES
Dick Pountain/PC Pro/Idealog 225 09/04/2013
When we look back at the Industrial Revolution of the 18th century we tend to focus on the steam engine and the railway, and when 22nd-century historians look back at the Digital Revolution that began in the late 20th it won't be the personal computer they single out but the microprocessor, the internet and the mobile telephone. Microprocessors supply processing power to increasingly intelligent devices, and the PC will be seen as a quite brief but crucial phase in the evolution of the microprocessor until it got small enough to fit inside a smartphone or tablet. The rise of the PC was spectacular enough, 1.5 billion over 30 years, but the mobile phone reached 6 billion in around 20.
The reason four times as many people use mobiles as PCs isn't hard to fathom. Most people have little need for calculation per se in their daily lives, but communicating with other people and locating and consuming information are absolutely central. And while both PC and smartphone can do that, a smartphone can do it from your pocket and in the middle of a field. There'll always be professionals who need spreadsheets and word processors, but almost everyone has a use for email, SMS, social networks and Google Maps. What's more, mobile phone masts can be erected even in parts of the world that will never get a wired internet and phone network. From Mongolia to the Maasai Mara, farmers and herders deal direct by mobile and cut out parasitic middlemen, cab drivers find their destination without years of study, and engineers no longer need carry bulky manuals.
Control of the mobile internet is set to become the hottest of all political issues, in a way that control over the PC never quite was. To be sure there was a period at the very end of the Cold War when the US government tried to deny the Soviet Union access to the latest microprocessors via CoCom, but that apart it's been market forces all the way. And since CoCom ceased around 1994 the world has become a very different place. A handful of giant internet corporations - Facebook, Google, Amazon, Apple, Yahoo, Twitter and the rest - now have annual revenues comparable to those of sovereign states, plus direct access to the hearts and minds of vast swathes of the population that governments can only dream of.
It's no coincidence that every week now brings a new rumour that some corporation, like Facebook or Amazon, is developing its own mobile phone. Everyone seems to be thinking about owning the phone or tablet and "forking" Android to run it in their own special way. Most users are not techies and don't want to be techies, so if you can sell them a branded phone with your logo and your UI veneer on it, that's all they'll ever see. (Rooting and tweaking are strictly for a tiny, nerdy minority.) There's great power to be had there, and great revenues too, because unlike the silly old Web, mobile networks remembered to build in a payment mechanism! Actually an outfit the size of Facebook is so ubiquitous it doesn't need to own the phone hardware: getting its app onto everyone's phone (of whatever brand) would be enough if it offered Skype-style voice-over-IP calls and messaging, which would start to eat the lunch of the mobile operators themselves as well as of competing social networks.
Katherine Losse was a pioneer Facebook employee who used to ghost-write posts for Mark Zuckerberg himself, and in her recent book "The Boy Kings" she offers a disturbing picture of his thinking. The main points of his credo include youthfulness, openness, sharing power and "companies over countries". Asked what he meant by the latter he told her "it means that the best thing to do now, if you want to change the world, is to start a company. It's the best model for getting things done and bringing your vision to the world." So the model for a new world is the Californian youth-oriented corporation, untrammelled by pesky laws and regulations, by messy old-world stuff like pensions and having to win elections. The Nation State is just plain out of date: it still practises stupid stuff like secrecy and taxation, and it doesn't get the New Digital Narcissism where everyone can be an (unpaid) star of their own channel. All rather reminiscent of the 1960s counterculture mixed with a dash of Orwell's Oceania, Eurasia and Eastasia. But actually it starts to look rather like a new variation on feudalism, where you'll only get fed if you become a retainer of one of these mega-corporations, as the boring old centralised state and its services wither away.
THE COMPANY STORE
Dick Pountain/PC Pro/Idealog 224 06/03/2013
I recently reviewed an interesting book, "Carbon Democracy" by Timothy Mitchell (Verso 2011), which analyses the effect of different energy sources on politics. Very brutally condensed, Mitchell argues that our political institutions are profoundly shaped by the types of *energy flow* we employ. A coal-based economy spawned the industrial revolution and the rise of mass democracies, while the displacement of coal by oil is tending to erode those democracies. Early humans consumed energy that came almost directly from the sun: photosynthesis provided plants for food and wood for dwellings and fire, and both hunter-gatherers and early cultivators consumed plants and animals close to where they lived, with no need for extensive transport networks. Coal changed all that by providing both the means to create, and the need for, a network of factories connected by railways, and the new social disciplines this enforced are those we still more or less live by.
Unlike coal, oil almost mines itself. It spouts to the surface under its own pressure, and although advanced technology is required to discover deposits and drill wells, the highly-skilled workers are few compared to coal miners, and remain above ground where they're easier to supervise and enjoy less autonomy. As a liquid, oil can be sent over vast distances via pipeline and tanker using little human labour, and its global distribution ensures that supplies can be diverted by a single phone call to neutralise a strike at one location. Hence the switch from coal to oil reduces the ability of labour to disrupt energy flows and hands that power instead to large oil companies, granting them the ability to threaten governments and dictate foreign policy (the post-WWII Marshall Plan was in part designed to switch Europe from coal to oil and introduce US-style industrial relations). I'm impressed by Mitchell's approach, which makes sense of a lot of stuff happening today, but I'm sure he wouldn't disagree if I say that it's just one layer of an explanation, and that adding a similar approach to *information flows* (means of communication) would be a valuable complement.
There have been shelf-loads of starry-eyed books about what the internet is going to mean for the future of human societies. Many imagine small rural communities of Hobbit houses, buried deep in the woods, living on home-baked spelt bread and organic beetroot soup while swapping kitten pictures with kindred spirits the world over on Facebook. There are a few grumpy dissenters from this fluffy view, notably Jonathan Meades, who in "Isle of Rust" describes something structurally similar, but the real village on Lewis and Harris he visits is littered with rusting car chassis and its inhabitants dwell on the net as a way of completely ignoring their immediate environment. He imagines humans in 2113 revering the detritus as sacred objects from a distant pre-apocalyptic era when we still had oil and electricity.
And so to Microsoft's (and Adobe's, and Apple's) software licensing policies (which you might think rather a long leap). The more positive future models assume that, as a response to climate change, these Net-Hobbit communities in the woods will be fuelled by distributed renewable sources of energy and ruled by equally distributed libertarian social structures, a sort of cyber-anarchism. But what if they're not? Mitchell's methodology suggests something more like a net-mediated feudalism, ruled over by a handful of giant corporations. Why? Because Apple, Microsoft, Google and Amazon are all rushing toward a vision of *renting* rather than *selling* their services. You won't be able to own their software outright but will have to pay for it all over again every month, as you do for electricity and gas, and perhaps music and other entertainment. A step backward to an economy where people earn money simply by owning stuff, rather than by investing and employing other people. Fail to pay and you'll get kicked out of the global village.
I recently had an email chat with our Online Business columnist Kevin Partner about the way Adobe will soon be wanting £50 a month for the graphics tools he relies on (he plans to buy some alternative before it's too late). This is not a new economic model, but rather one with a long and disreputable history. It's how Mississippi sharecroppers and Kentucky coal miners used to live, owing more money to the company store for groceries than they ever earned, which ensured their continuing servitude. To salute this brave new vision I've taken the liberty of writing an updated lyric for Merle Travis's famous 1946 song "Sixteen Tons":
"You upload sixteen gigs and what do you get,
Another day older and deeper in debt,
St Peter don't you call me 'cos I can't go,
I owe my soul to the virtual store..."
Friday, 30 August 2013
TWO ALANS
Dick Pountain/PC Pro/Idealog 223 06/02/2013
Atari has just filed for bankruptcy, which led me to a string of rather odd associations. I never was an Atari owner, but the firm's demise jogged me into remembering that Alan Kay - that least-heralded inspirer of the personal computer revolution - worked for the firm immediately after leaving his epoch-making stint at Xerox PARC. While at PARC Kay's team pioneered just about everything we now take for granted, including object-oriented programming, windowed graphical user interfaces and local area networks, although Xerox notoriously failed to capitalise on this work (as did Atari). It wasn't until Steve Jobs's equally notorious PARC visit that Apple picked up the baton, prompting Kay to move there straight from Atari and remain as a Research Fellow for the next 13 years.
My other Alan-named hero, "father of computing" Alan Turing, may soon have a Hollywood film made of his life (as Steve Jobs has). Indeed, after being ignored in embarrassed silence for over 50 years, Turing now has three films: two for British TV, "Breaking the Code" with Derek Jacobi playing him and the rather better "Codebreaker" with Ed Stoppard, while the forthcoming Hollywood biopic "The Imitation Game" may star either Leonardo DiCaprio or Benedict Cumberbatch. Alan Kay, on the other hand, is still alive, nowadays running his own non-profit institute that studies new ideas in educational computing, and what he'll eventually be remembered for (perhaps even with a biopic) is his dream of the Dynabook.
Conceived around 1968, this was to be a revolutionary keyboardless, wireless tablet computer permanently connected to an online library containing all the world's accumulated knowledge, and he wanted every child in the world to have access to one. Kay of course could never actually build a Dynabook, because at the time PARC was developing graphical UIs there weren't even any microprocessors, let alone an internet. (His Xerox Star "personal" workstation was a fridge-sized minicomputer built with discrete logic chips.) But the fact is that everything required, hardware-wise, to implement his idea now exists.
In fact my little Nexus 7 already comes pretty close to Kay's ideal, delivering information through Google search and Wikipedia, music and movies via Spotify, YouTube and other places. What it lacks is universality, and that's no longer a technical but a commercial problem, over ownership of both conduit and content and how much we're prepared to pay for them. A Dynabook was supposed to operate wirelessly from anywhere at all, which rules out my Nexus because it's Wi-Fi only - but even were I to buy a 3G tablet and SIM the cost of the data-plan might inhibit me from using it as freely as I do at home. Running a tablet off my home broadband is one thing, but I can't justify an extra account just for outside use. It's the same story with content: I already pay £10 per month to Spotify, justified because music is my main amusement, but similar subscriptions to Netflix or LoveFilm don't tempt me because they don't deal in films I want. If YouTube started charging (as it recently threatened) £5 a month for access I'd probably give that up too. For Kay's dream to finally come true we desperately need to rationalise both conduit and content licensing to make ubiquitous data access affordable.
Even here, all the necessary hardware and software tricks are already well known. Mobile phone companies have invested mega-billions to cover much of the globe with masts, supporting networks which, unlike the internet, already have the ability to charge for traffic built right in. A global system that mimicked the hierarchical storage systems already employed in enterprise-level computing could be assembled, using the internet as the trunk of a global tree, switching down onto mobile networks for its branches and finally onto village/street-level Wi-Fi for its twigs. The problems aren't technically insuperable, but colossal efforts of business diplomacy would be required to negotiate an equitable distribution of revenues to the owners of these various "wires". In fact the problem facing the ITU would resemble the problem the UN faces in trying to broker world peace, but we're talking dreams here and my point is that the obstacle is no longer hardware.
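Just to make the hierarchical-storage analogy concrete, here's a toy Python sketch - entirely my own illustration, with invented names and tariffs, not anything proposed by the ITU or anyone else. Each stretch of "wire" on the path from server to phone has an owner and a notional tariff, and the revenue for a download is apportioned hop by hop; the sums are the easy part, agreeing the tariffs is the hard part:

    # conduit_tiers.py - toy model of the trunk/branch/twig conduit hierarchy
    # Every name and tariff here is invented purely for illustration.
    from dataclasses import dataclass

    @dataclass
    class Hop:
        tier: str            # "trunk" (internet backbone), "branch" (mobile), "twig" (street Wi-Fi)
        owner: str           # whoever owns this stretch of "wire"
        pence_per_mb: float  # notional tariff for carrying a megabyte over this hop

    def bill(path, megabytes):
        """Apportion the revenue for one download across every hop on the path."""
        # (assumes each owner appears only once on the path)
        return {hop.owner: round(megabytes * hop.pence_per_mb, 2) for hop in path}

    # One possible path from a content server down to a phone in a village
    path = [
        Hop("trunk", "backbone carrier", 0.01),
        Hop("branch", "mobile operator", 0.10),
        Hop("twig", "village Wi-Fi co-op", 0.02),
    ]
    print(bill(path, megabytes=50))
    # -> {'backbone carrier': 0.5, 'mobile operator': 5.0, 'village Wi-Fi co-op': 1.0}

Splitting the bill is trivial once tariffs exist; negotiating them among thousands of owners is the genuinely hard, ITU-scale part.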
The same would be true for collecting royalties on content, though Spotify, Netflix and the rest show that this can be done (with difficulty). In fact there's a strong case for heavily subsidising the conduits and charging mostly for the content, along the same historical lines followed by roads and railways. I believe that both of my Alan heroes would agree that a species which can't provide affordable information to educate its offspring, while loudly fantasising about colonising Mars and mining diamonds from asteroids, has pretty weird priorities.
Thursday, 4 July 2013
IDEALOG NOW AVAILABLE IN KINDLE FORMAT!
For truly dedicated fans of this column who find this scrolling blog format inconvenient, I've now made the first 200 columns available on Amazon as Kindle books. Two volumes of 100 columns each are available here for a very reasonable £5/$7 each:
UK:
http://www.amazon.co.uk/The-Compleat-Idealog-ebook/dp/B00DNI90KC/
http://www.amazon.co.uk/The-Compleat-Idealog-ebook/dp/B00DP1P75Y/
USA:
http://www.amazon.com/The-Compleat-Idealog-ebook/dp/B00DNI90KC/
http://www.amazon.com/The-Compleat-Idealog-ebook/dp/B00DP1P75Y/
(I decided on two volumes because a single one made an unwieldy 1000+ pages, and set a price to match)
RIP PC?
Dick Pountain/PC Pro/Idealog 222 11/01/2013
I'm currently conducting an experiment whose outcome will profoundly affect the way I work in future: I'm writing this column for the very first time on my Nexus 7 tablet rather than on my laptop, and I've actually typed these first few sentences on its tiny on-screen keyboard using Jelly Bean 4.2's 'gesture typing' feature, with which I've become rather proficient over the last few weeks. (I'll tell you further down whether I stuck with this resolution or chickened out and fetched my Bluetooth keyboard). The location of my experiment is our beloved chairman's glorious house on Mustique: sure it's a tough assignment but someone has to do it, and I'm here toiling away in the Caribbean sun so that you don't have to. I deliberately left London without my laptop to see whether I could cope, and so far haven't missed it at all. The Nexus has provided all my Spotify tunes, my YouTube movies, email correspondence, and now text creation (in Word format) in faultless fashion.
Prophesying 'The Death of the PC' is liable to embroil me in a raging troll-fest nowadays, but I can't help it if the phrase just won't leave my head. Over the last couple of weeks I've been reading several insightful analyses of the future prospects for both Intel and Microsoft that leave me in no doubt that both firms are going to have to get used to reduced rations rather soon.
Intel has unquestionably missed the boat in the low-power processor sector: its strategic error in believing the x86 architecture to be invulnerable looks increasingly like a catastrophe that has granted ARM the same sort of six-year lead in the mobile arena that Intel itself enjoyed all those years ago when IBM adopted the 8088 for its first PC. Intel is finally taking low power seriously with new Atom chipsets, but the sheer volume of Google's ARM-based Android may have shut that door. Ironically enough, Intel actually owned a viable ARM-architecture range in the shape of the XScale devices it inherited from DEC, but it never took them seriously - thanks to big-corporation inertia and hubris - and its recently-departed CEO Paul Otellini sold them off to Marvell back in 2006 as one of his first acts.
Microsoft too has floundered in trying to come to terms with the mobile world. It's not that it hasn't tried hard enough: on top of various versions of Windows Mobile/Phone over the years it has tried Ultra-Mobile PCs and even half-decent touch-screen Windows 'slates' like those by Samsung, but none of them ever really took off (and the omens are not good that the Surface will do any better). The reason is fundamentally the same as for Intel: massive success imposes an absolute demand for compatibility, which stifles certain vital synergies.
Apple on the other hand has always been ruthlessly pragmatic about changing CPU vendors, first deserting Motorola for PowerPC, then moving on to Intel and lately ARM whenever the time was right. And it had the courage to innovate boldly in its user-interface design with iOS. Google meanwhile has combined an open-source software model with agnosticism about hardware, and none of the whingeing about Android fragmentation can diminish its big numbers.
The mobile market has become a dinosaur trap financially too, because profit margins on both hardware and software sales are far, far thinner than the Wintel twins are used to, and need. The cost of building fabs for ever-smaller feature sizes becomes prohibitive just as margins are shrinking, and Moore's Law is being revealed as an increasingly tired marketing strategy rather than a science, now that most users demand more battery life rather than speed.
The ultimate demise of the PC won't be in favour of Apple or any other hardware standard but rather in favour of cloud vendors like Amazon, Google, eBay and the like, whose products and services can be reached from *anyone's* mobile device. It would be wise for me to cover my arse by pointing out there will always be a few PC niches left, but I'm not sure I actually believe it. The vast grazing herds will be of thin mobile clients, and generations will arise that never knew a mouse or keyboard - even for business, even for accounts receivable.
And no, in the end I didn't need to deploy my Bluetooth keyboard at all for this column. As any writer will tell you, thinking up the next word takes far longer than typing it, so raw typing speed is not the critical factor. (I'll confess that decades of scribbling Graffiti have honed my sliding skills way beyond the average, though.)