Dick Pountain/Idealog 240 /09 July 2014 11:45
My interest in computing has always been closely connected to my love of maths. I excelled in maths at school and could have studied it instead of chemistry (where would I be now?). My first experience of computing was in a 1960 school project to build an analog machine that could solve sixth-order differential equations. I used to look for patterns in the distribution of primes rather than collect football cards - you're probably getting the picture. I still occasionally get the urge to mess with maths, as for example when I recently discovered Mathlab's marvellous Graphing Calculator for Android, and I'll sometimes scribble some Ruby code to solve a problem that's popped into my head.
Of course I've been enormously pleased recently to witness the British establishment finally recognising the genius of Alan Turing, after a disgracefully long delay. It was Turing, in his 1936 paper on computable numbers, who more than anyone forged the link between mathematics and computing, though it's for his crucial wartime cryptography that he's remembered by a wider public. While Turing was working on computable numbers at King's College, Cambridge, a college friend of his, David Champernowne, another mathematical prodigy, was working on something rather different that's recently come to fascinate me. Champernowne soon quit maths for economics; studied under John Maynard Keynes; helped organise aircraft production during WWII; in 1948 helped Turing write one of the first chess-playing programs; and then wrote the definitive book on income distribution and inequality (which happens to be another interest of mine and is how I found him). But what Champernowne did back in 1933 at college was to build a new number.
That number, called the Champernowne Constant, has some pretty remarkable properties, which I'll try to explain here fairly gently. The number is very easy to construct: you could write a few million decimal places of it this weekend if you're at a loose end. In base 10 it's just zero, a decimal point, followed by the decimal representations of each successive integer concatenated, hence:
0.12345678910111213141516171819202122232425262728293031....
It's an irrational real number whose representation goes on for ever, and it's also transcendental (like pi), which means it's not the root of any polynomial equation with integer coefficients. What most interested Champernowne is that it's "normal", which means that each digit 0-9, and each pair, triple and so on of such digits, appears in it equally often. That ensures that any number you can think of, of whatever length, will appear somewhere in its expansion (an infinite number of times actually). It's the number of everything, and it turns out to be far smaller (if somewhat longer) than Douglas Adams' famous 42.
Your phone number and bankcard PIN, and mine, are in there somewhere, so it's sort of like the NSA's database in that respect. Fortunately, though, unlike in the NSA's database, they're very, very hard to locate. The Unicode-encoded text of every book, play and poem ever written, in every language (plus an infinite number of versions with an infinite number of spelling mistakes) is in there somewhere too, as are the MPEG4 encodings of every film and TV programme ever made (don't bother looking). The names and addresses of everyone on earth, again in Unicode, are in there, along with those same names with the wrong addresses. Perhaps most disturbingly of all, every possible truncated approximation to Champernowne's constant itself should be in there, an infinite number of times, though I'll confess I haven't checked.
Aficionados of Latin-American fiction will immediately see that Champernowne's constant is the numeric equivalent of Jorge Luis Borges' famous short story "The Library of Babel", in which an infinite number of librarians traipse up and down an infinite spiral staircase connecting shelves of random texts, searching for a single sentence that makes sense. However Champernowne's is a rather more humane construct, since not only does it consume far less energy and shoe-leather, but it also avoids the frequent suicides -- by leaping down the stairwell -- that Borges imagined.
A quite different legend concerns an Indian temple at Kashi Vishwanath, where Brahmin priests were supposed to continually swap 64 golden disks of graded sizes between three pillars (following the rules of that puzzle better known to computer scientists as the "Tower of Hanoi"). When they complete the last move of this puzzle, it's said the world will end. It can be shown that for priests of average agility this will take around 585 billion years (the puzzle requires 2^64 - 1 moves, at say one per second), but we could remove even that small risk by persuading them to substitute instead a short Ruby program that builds Champernowne's constant (we'll need the BigDecimal module!) to be left running on a succession of PCs. Then we could be absolutely certain that while nothing gets missed out, the end will never arrive...
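For anyone tempted to try it, a minimal Ruby sketch of such a generator might look something like this (the fixed digit budget and the helper name are mine, purely for illustration; the priests' version would of course loop forever):

  require 'bigdecimal'

  # Concatenate successive integers to get the first `digits` decimal
  # places of Champernowne's constant, then wrap them in a BigDecimal.
  def champernowne(digits)
    s = ''
    n = 0
    s << (n += 1).to_s while s.length < digits
    BigDecimal('0.' + s[0, digits])
  end

  puts champernowne(50).to_s('F')   # prints 0.1234567891011121314...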
Sunday, 11 January 2015
I, ROBOT?
Dick Pountain/ Idealog 239/ 06 June 2014 09:56
Like many males of my generation I grew up fairly well-disposed toward the robot. Robby the Robot, film star, was all the rage when I was 11, and Asimov's Laws of Robotics engaged my attention as a teenaged sci-fi reader. By the time I became involved in publishing underground comics in the early 1970s the cuteness was wearing off robots, but even so the threat was moderated by humour. The late Vaughn Bodé - nowadays beloved by all the world's graffiti artists - drew a strip called "Junkwaffel" that depicted a world cleansed of humans but gripped in permanent war between foul-mouthed, wise-cracking robot soldiers. In some ways these were the (far rougher) prototypes of R2D2 and C3PO.
Out in the real world robots started to appear on factory production lines, but they were doing those horrible jobs that humans shouldn't do, like spraying cellulose paint, and humans were still being employed to do the other stuff. When I got involved in computer programming myself I was drawn toward robotics thanks to an interest in Forth, a language originally invented to control observatory telescopes and ideally suited to robot programming. The problems of robots back then were all about *training* them to perform desired motions (as opposed to spelling those motions out in X, Y, Z coordinates) and building in enough intelligence to give them more and more autonomy. I still vividly remember my delight when a roboticist friend at Bristol Uni showed me robot ducklings they'd built that followed each other just like the real thing, using vision alone.
Given this background, it will come rather hard to have to change my disposition toward the robot, but events in today's world are conspiring to force me to do just that. While reading a recent issue of New Scientist (26 April 2014), I was struck by two wholly unrelated articles that provide a powerful incentive for such a change of attitude. The first of these involved the Russian Strategic Missile Force, which has for the first time deliberately violated Asimov's First Law by building a fully autonomous lethal robot that requires no permission from a human to kill.
The robot in question is a bit aesthetically disappointing in that it's not even vaguely humanoid-looking: it looks like, indeed *is*, a small armoured car on caterpillar tracks that wields a 12.7mm heavy machine gun under radar, camera and laser control. It's being deployed to guard missile sites, and will open fire if it sees someone it doesn't like the look of. I do hope it isn't using a Windows 8 app for a brain. Whatever your views on the morality of the US drone fleet, it's important to realise that this is something quite different. Drones are remotely controlled by humans, and can only fire their weapons on command from a human, who must make all the necessary tactical and moral decisions. The Russian robot employs an algorithm to make those decisions. Imagine being held up at gunpoint by Siri and you'll get the difference.
However it was the other article that profoundly upset my digestive system, an interview with Andrew McAfee, research scientist at MIT's Center for Digital Business. Asked by interviewer Niall Firth "Are robots really taking our jobs?", McAfee replied with three possible scenarios: first, that robots will take jobs in the short term, but a new equilibrium will be reached as it was after the first Industrial Revolution; second, they'll replace more and more professions and massive retraining will be essential to keep up; third, the sci-fi-horror scenario where robots can perform almost all jobs and "you just don't need a lot of labour". He thinks we'll see scenario three in his lifetime (which I hope and trust will be longer than mine).
It was when he was then asked about any possible upside that my mind boggled and my gorge rose: the "bounty" he saw arising was a greater variety of stuff of higher quality at lower prices, and most importantly "you don't need money to buy access to Instagram, Facebook or Wikipedia". That's just as well really, since no-one except the 0.1% who own the robots will have any money. On that far-off day I foresee, when a guillotine (of 3D-printed stainless steel) has been erected outside Camden Town tube station, McAfee may still be remembered as a 21st-century Marie Antoinette for that line.
The bottom line is that robots are still really those engaging toys-for-boys that I fell for back in the 1950s, but economics and politics require the presence of grown-ups. Regrettably the supply of grown-ups has been dwindling alarmingly since John Maynard Keynes saved us from such imbecilities the last time around. If you're going to make stuff, you have to pay people enough to buy that stuff, simples.
THE JOY OF CODING?
Dick Pountain/ Idealog 238/ 08 May 2014 19:30
I've admitted many times in this column that I actually enjoy programming, and mostly do it for fun. In fact I far prefer programming to playing games. Given my other interests, people are often surprised that I don't enjoy chess, but the truth is that the sort of problems it creates don't interest me: I simply can't be bothered to hurt my brain thinking seven moves ahead when all that's at stake is beating the other guy. I did enjoy playing with Meccano as a kid, and did make that travelling gantry crane. I can even imagine the sort of satisfaction that might arise from building Chartres Cathedral out of Lego, though having children myself rendered me phobic about the sound of spilling Lego bricks (and the pain of stepping on one in bare feet). But programming is the ultimate construction game, where your opponent is neither person nor computer but the complexity of reality itself.
Object-oriented programming is especially rewarding that way. You can simulate anything you can imagine, describe its properties and its behaviours, then - by typing a single line of code - create a thousand (or a million) copies of it and set them all working. Then call that whole system an object and create a hundred copies of that. It's all stuff you can't do in the heavy, inertial, expensive world of matter: making plastic bits and pieces by 3D printing may be practical, even useful, but it lacks this Creator of Worlds buzz.
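To make that concrete, here's a throwaway Ruby sketch of the kind of thing I mean (the Widget and Swarm classes are invented for the example and nothing more):

  # Describe the properties and behaviour once...
  class Widget
    def initialize(id)
      @id = id
      @ticks = 0
    end

    def work
      @ticks += 1   # stand-in for whatever behaviour you've dreamt up
    end
  end

  # ...then conjure up a thousand copies with one line and set them all working.
  swarm = Array.new(1000) { |i| Widget.new(i) }
  swarm.each(&:work)

  # Call that whole system an object in its own right and copy it a hundred times.
  class Swarm
    def initialize(size)
      @members = Array.new(size) { |i| Widget.new(i) }
    end
  end
  factories = Array.new(100) { Swarm.new(1000) }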
Since I'm so besotted by programming as recreation, I must surely be very excited by our government's "Year of Code" initiative, which aims to teach all our children how to write programs - or about "coding" as the current irritating locution would have it? Actually, no, I'm not. I'm perfectly well aware that my taste for programming as recreation is pretty unusual, very far from universal, perhaps even eccentric, a bit like Base Jumping or worm farming. The idea that every child in the country is going to develop such a taste is ludicrous, and that rules out coding for pleasure as a rationale. It will most likely prove as unpleasant as maths to a lot of kids, and put them off for life.
But what about "coding" as a job skill, as vital life equipment for gaining employment in our new digital era? Well there's certainly a need for a lot of programmers, and the job does pay well above average. However you can say much the same about plumbers, electricians and motor mechanics, and no-one is suggesting that all children should be taught those skills. The aim is to train school teachers to teach coding, but it makes no more sense for every child to learn programming than it does to wire up a ring-main or install a cistern. Someone who decides to pursue programming as a profession needs solid tuition in maths and perhaps physics, plus the most basic principles of programming like iteration and conditionality, which ought to be part of the maths curriculum anyway. Actual programming in real languages is for tertiary education, not for the age of five as the Year of Code intends.
The whole affair reeks of the kind of gimmicky policy a bunch of arts and humanities graduates, clueless about technology, might think up after getting an iPad for Christmas and being bowled over by the wonderful new world of digital communications. Their kids probably already know more about "coding" than they do via self-tuition. However there are those who detect a more sinister odour in the air. For example Andrew Orlowski, curmudgeon-in-chief at The Register, has pointed out a network of training companies and consultants who stand to make big bucks out of the Year of Code, in much the same way firms did during the Y2K panic: they include venture capital company Index Ventures, which has Year of Code's chairman Rohan Silva as its "Entrepreneur in Residence", and US training company Codecademy. Organisations that are already reaching children who are actually interested in programming, like the Raspberry Pi Foundation, appear to be sidelined and cold-shouldered by the hype merchants: the foundation's development director Clive Beale has claimed that "The word 'coding' has been hijacked and abused by politicians and media who don't understand stuff".
Personally I'd love to believe all children could be taught to gain as much pleasure from programming as I do, but it's unlikely. Like singing, dancing, drawing or playing football, some can do it and like it, others can't and won't, and the notion that everyone has to be able to do it for the sake of the national economy has unpleasantly Maoist undertones, with backyard code foundries instead of steelworks.
Tuesday, 23 September 2014
SOMETHING WENT WRONG
Dick Pountain/ Idealog 237/ 07 April 2014 11:48
I'm writing this column on a new computer on 8th April, which may or may not come to be known as Save Microsoft's Ass Day. For the benefit of any members of exotic tribes, like OS X or Linux users, it's the day on which the major update to Windows 8.1 is being released, the fate of which might determine the fate of Microsoft. I don't have the update myself, and will be waiting to see whether it bricks everyone's PCs before I indulge (in itself a testament to the esteem in which MS is currently held).
But, I hear you thinking, didn't he say just a few columns ago that he may never buy another Windows PC? He did indeed, but then he succumbed to an unforgivable fit of nostalgia and misplaced loyalty and did precisely that. No sooner had I written that previous column than the hard disk on my trusty 7-year-old Vaio began to show symptoms of approaching retirement, and it became wise to put my money where my mouth had been. I shopped around, was sorely tempted to "go commando" with an Asus Transformer, and devilishly tempted by the sheer beauty of Google's hi-def Chromebook, but in the end I stuck with Windows for a combination of pragmatic and sentimental reasons. I'm already beginning to regret it.
The pragmatic reason was that I don't completely trust the cloud. I'm happy enough to exploit it and have done business from foreign shores using only my Nexus 7 and an internet connection, but I want a local copy of all my significant data too. Android can sort of hack that, but it's not what it was designed for (the fact that a file manager is a third-party app gives you the clue). The sentimental reason was the 20 years I've spent fiddling with, tweaking, boosting, wrestling, writing software for, swearing at and rescuing Windows. Hate it, certainly, but it was also great mental exercise on a par with playing chess or planning a guerrilla war. So I caved, bought a Lenovo Yoga 2 running Windows 8 and plunged ahead into the quagmire. I resolved to upgrade immediately to Windows 8.1 before getting too settled (ha!) so went to the Store, only to find no upgrade there. A traipse through the forums revealed that it won't be visible on some PCs until you manually apply two other upgrades called K123blah and K123blech. So far so hostile, in fact downright unprofessional, by both MS and Lenovo.
With 8.1 in place I started to investigate the Metrosexual interface and found I didn't mind it so much as many other commentators, since I'm now totally attuned to Android and touch. Tiles make quite a good substitute for the Start Menu I never used, having always preferred desktop icons. Things I do most - mail, calendar, writing, reading, Googling and note-taking - all fit onto the first Start screen, always available via the Windows key as I work in desktop view. But irritation set in once I discovered there aren't any official "Modern Interface" versions of most of the programs I use (like Gmail, Blogger, Flickr, YouTube, iPlayer). You can fiddle this by viewing them in a browser window and making tiles from their URLs, if you don't mind using Internet Explorer, which I do mind. Using Firefox, as I used to, you can't (and in any case it runs too slowly to be usable). Using Chrome, as I now do, it's hidden under a menu that another traipse through the forums revealed. Then one-by-one, those tiled Win8 apps I could find started to break down. It happened to the Facebook app, to a calendar app I paid for (admittedly pence) which no longer logs into Google, and to an icon editor that no longer saves its files. What's really nice though is that to avoid giving anxiety and offence (a cardinal sin in the modern world) software vendors are adopting a new, softer and content-free error message: "Something Went Wrong". No shit Sherlock.
As I edit Jon Honeyball's column each month I quail before his volcanic wrath against the Windows 8 ecosystem, but I now realise Jon has actually been pretty moderate: the quality of apps in the Windows Store is so abysmal it actually makes me nervous, like wandering down the wrong alley in an unfamiliar city and seeing people lurk in dark doorways. Win8 apps now cause me the same unease I feel whenever forced to use a Macintosh, a loss of control and bewilderment at its sheer opacity. Fortunately the Google Apps button in Chrome lets me turn my cute little Yoga into something resembling the Chromebook I almost bought! The Windows 8.1 update will need to be *really, really* something...
Sunday, 10 August 2014
THOSE PESKY ATOMS
Dick Pountain/PC Pro/Idealog 236 06/03/2014
Foot-to-ground contact is pretty important to us motorcyclists so we get picky about our boots. I favour an Australian stockman's style that can pass for a fashionable Chelsea Boot in polite society. Having worn out my second pair of R.M. Williams after 15 years, yesterday I went shopping for new ones. I checked Russell & Bromley and John Lewis on the web, then set off to town to try some on. Why didn't I buy online? Because I need to try boots on, and you can neither upload your feet nor download boots over the web, which still only handles bits, not pesky atoms. I'd never consider buying boots from Amazon, though I did go there after my purchase to snivel quietly about the £8 I could have saved...
Russell B's lovely boots were too narrow for my broad feet and John Lewis didn't have the ones advertised on their website, so I ended up buying Blundstones (which are fab and half the price of R.M. Williams) from the Natural Shoe Store. Later that day I realised there's a moral to this gripping tale, as I was reading John Lewis's announcement of its massive new IT project: an Oracle-based ERP (Enterprise Resource Planning) system that will completely integrate the firm's online and physical stores, including all the Waitrose grocers. No cost was quoted for this four-year project - scheduled to run in 2017 - but it will certainly be both expensive and risky.
Manufacturers have only a slightly better record than public-sector institutions when it comes to screwing up big IT: in recent years major corporations from Avon to Hershey and Levi's Jeans have lost fortunes botching or cancelling ERP projects. If anyone can pull it off it might be John Lewis, whose overall competence is renowned. It pioneered Click & Collect, where you choose a product on the website and then collect it from your nearest store, though all its competitors now do the same. But C&C is only one permutation people use to bridge the bits/atoms gap. Some folk research products in the bricks-and-mortar store, then go home and order from the website. Some fear online shopping and prefer to order by phone from a human. As for browsing the site, they might use a mobile, tablet or a PC. Hence the new buzzword is "omni-channel", and it matters enormously because all of these modes of e-commerce will fail - like my boot purchase - if stock in stores isn't accurately reflected on the website. That demands a whole new level of integration of stock-control and delivery systems, which for a grocery operation like Waitrose that delivers perishable foodstuffs will be ferociously hard. The new project is ambitious indeed.
This is clearly the new frontline of online retailing. There are more and more items like TV sets, clothes, shoes, high-end acoustic musical instruments, possibly furniture and fabrics, that people won't be satisfied to buy from a purely online outlet like Amazon but need to see and touch before choosing. Admittedly a lot of people go to bricks-and-mortar stores to browse, then go home and buy from Amazon, but the stores are getting wise to this. I imagine that John Lewis's new system, assuming it works, is intended to make it so easy to buy-as-you-handle that you won't want Amazon. Meanwhile Amazon and Google are both leaking weirdly futuristic plans for delivering atoms rather than bits independently of the Post Office or courier service. Amazon's vision involves flocks of quadcopter drones, delivering your purchases down the chimney like the storks of legend. Google, with its feet more firmly on the ground, buys up robotics firms: I particularly like their galloping headless-heifer robot, which would make quite a stir as it rumbled round the corner into our street towing a sofa (especially if chased by a squawking flock of quadcopters...)
Omens are gathering that the power of Silicon Valley giants has peaked, just as the oil, coal and railway barons' power did in the 1900s: even the US Right is getting chippy about the amount of tax they avoid (which means taking more tax from civilians); among Democrats there are populist stirrings about their almost-jobless business model and exploitation of interns; and the San Francisco bus protests are seriously tarnishing their public image. And all that Kurzweilian Transhumanist/Matrix/Singularity nonsense looks more and more like a religious cult, a post-modern reinvention of a Protestant Millennium. We might spend a lot of time watching movies and listening to music in bit-land but we're never going to live there full-time because we're made of atoms, we eat atoms, breathe atoms and wear atoms. And bricks-and-mortar shops have a head start when it comes to distributing atoms in bulk: just watch them start the fight back.
WHAT'S A MOOC?
Dick Pountain/PC Pro/Idealog 235 05/02/2014
Fans of Scorsese's movie "Mean Streets" must certainly remember the pool-hall scene where Jimmy is called a "mook" and responds by asking what that means (we never quite find out). I was irresistibly reminded of this scene as I read an excellent recent column by John Lanchester in the London Review of Books (http://www.lrb.co.uk/v35/n22/john-lanchester/short-cuts) in which he discusses the MOOC (Massive Online Open Course), a type of distance learning increasingly being offered by US universities. In Lanchester's case what attracted him to a MOOC was a Harvard course on Food Science given by Ferran Adrià, famous chef of the now-defunct Spanish super-restaurant El Bulli. Not many years ago gaining admission to Harvard lectures would have cost even more than dinner at El Bulli, but he was able to sign up for SPU27 and take the course on his iPad for free.
SPU27 forms part of a joint project for online learning between Stanford, Harvard and MIT called EdX. (Stanford pioneered the MOOC several years ago via iTunes, though of course our own Open University was a far earlier pioneer, using the ancient medium of analog terrestrial television). The idea of such courses is that they can "flip the classroom", so that instead of attending lectures students view them online and do their coursework at home, visiting the campus only very occasionally to be tested and discuss difficulties. Advantages for the university are substantial: it can save on the cost of maintaining physical lecture halls and presumably stretch lecturers' salaries over far more students than can be fitted into a theatre. Lanchester foresees MOOCs becoming ever more important as university admission fees escalate while prospective students' earning-power falls, but he also foresees them putting some universities out of business altogether. For a MOOC, as for any other online content provider, attracting custom will depend upon effective viral marketing and hiring star performers like Ferran Adrià or Bruce Sterling.
As for the quality of MOOC tuition, Lanchester found SPU27 harder and more rigorous than he'd expected, though he does acknowledge a loss of personal interaction among students and lecturers. But the fact that MOOCs are tolerable at all is testament, as if any more were needed, to a computer/telecoms revolution that's now entered the post-PC phase. Many MOOC students will probably prefer to watch their lectures on a large-screen smart TV at home, on a tablet in the park or on the bus to their day-job. Last year in a column about Alan Kay and his Dynabook (Idealog 223) I felt obliged to point out that increasing monopolisation of copyrighted media content by big corporations was becoming an obstacle to its fullest implementation. Well, MOOCs offer one more source of free high-quality educational content, presumably subsidised by those high fees paid by physically-attending students. Free tuition could even revive that old idea of education for its own sake, rather than just for a job.
Market competition is working pretty well to reduce the cost and increase capabilities of the hardware you need. My first-generation LCD TV died the other day and I found the cheapest replacement was a 29" LED model from LG. Its picture quality is a revelation, in both sharpness and colour fidelity, but I was a bit sceptical about its smartness. I needn't have worried because it immediately found my home Wi-Fi and I was watching YouTube and reading Gmail within minutes. It finds my laptop too and plays content from its hard disk. Like Mr Honeyball I find the on-screen keyboard deeply depressing, but I've found some solace through an LG TV Remote Android app that lets me enter text into most forms and search-boxes via gesture typing, Bluetooth keyboard or even speech. It doesn't work with Google Docs though, as TV and tablet keyboards get hopelessly tangled. I've added my own 500 gig external hard drive to the LG's USB port for rewind and programme recording, and paired TV and Nexus to stream YouTube content directly without need for a Chromecast. And it came with built-in Netflix, Lovefilm and iPlayer, but irritatingly not 4oD.
Despite its weaknesses I can still easily imagine watching lectures on smart TV and answering multiple-choice test questions via the tablet. Data formats are no longer really a problem as I can shovel PDFs, JPEGs, MP3 and MP4s with ease between Windows, Android and TV. Google's new free Quickoffice handles the Microsoft Office formats pretty well (and I keep DocsToGo as backup for anything they can't). It feels as though a chilling wind of change is blowing right through the Stanford campus all the way to Redmond, and I seriously wonder whether I'll ever buy another Windows PC. I hope that doesn't make me a mook (whatever that means).
Wednesday, 16 July 2014
ALGORITHMIC BEAUTY
Dick Pountain/ Idealog 234/ 7th Jan 2014
The day I was due to write this column I had the good fortune to be visiting the Caribbean island of Bequia, and very nice it was too, with sun, sea, sailing boats, flowers and tropical fruit. However so advanced is my pathological nerd-dom that the subject it inspires me to write about is fractal geometry, rather than gastronomy, fishing or the miraculous qualities of modern sailing vessels.
Actually to a proper nerd this connection is pretty straightforward. We sailed to Bequia across a sea covered in fractal waves and spattered with fractal foam and spray, under a sky full of fractal clouds. And the land is covered by a profusion of the most fractal plants imaginable, from palms to frangipanis to ferns and back again. In fact it was while inspecting some palm fronds on the beach that I was suddenly reminded of a book that impressed me very much when it came out a quarter of a century ago, called "The Algorithmic Beauty of Plants" by Przemyslaw Prusinkiewicz and Aristid Lindenmayer (Springer 1990). The authors, with expertise in mathematics, biology and computer graphics, set out to model the forms found in real plants using a system of fractal geometry called L-systems, which mimics the development of a plant. It operates with a smallish number of parameters that can be varied to produce a vast range of startlingly realistic plant forms - stems, leaves, flowers and all.
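To give a flavour of how little machinery this needs, here's a toy Ruby version of the string-rewriting core of an L-system (the axiom and rules are the standard textbook "fractal plant", not lifted from the book itself, and drawing the result needs a separate turtle-graphics step):

  # Each generation, every symbol is replaced by its production rule;
  # symbols with no rule (+, -, [, ]) are copied through unchanged.
  RULES = {
    'X' => 'F+[[X]-X]-F[-FX]+X',   # branching skeleton
    'F' => 'FF'                    # stem elongation
  }

  def grow(axiom, generations)
    (1..generations).inject(axiom) do |plant, _|
      plant.chars.map { |symbol| RULES.fetch(symbol, symbol) }.join
    end
  end

  puts grow('X', 3)   # a string in which F means 'draw forward', + and - turn,
                      # and [ ] push and pop a branch

Vary the rules and the turning angle and you get quite different plant forms.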
Their key insight was that the form a plant exhibits is not a static fact but something dynamic, generated during the process of growth, and in this respect they brought the brilliant work of D'Arcy Thompson into the computer age. That insight can be summed up by saying that each form implicitly contains its own history. Of course Prusinkiewicz and Lindenmayer were only simulating such a history inside a computer, but the results are so realistic one can't help wondering whether they provide a clue to the way Nature itself does it.
Clearly Nature doesn't type in the algorithms that Prusinkiewicz and Lindenmayer describe in C++ code, nor even in pseudo-Pascal. There is no need to postulate a nerdish Intelligent Designer with beard, keyboard and Tux-the-penguin teeshirt. All that Nature has available to work with is chemistry, and dynamic chemistry at that. Such chemistry is now pretty well understood thanks to the likes of Ilya Prigogine, who explained how factors like gradients of concentration or temperature can cause a chemical system to oscillate regularly in time, like a clock. As a plant stem is sprouting, biochemical processes inside each of its cells cause the levels of certain growth hormones to vary cyclically over *time*, with the result that leaves pop up in a *spatial* sequence along its length. Put another way, Nature is its own hybrid digital/analog computation system, in which the rates of such chemical cycles, following various power laws, cause behaviour that somewhat resembles Prusinkiewicz and Lindenmayer's algorithms.
And the way plants and animals vary those growth parameters is only very loosely determined by the quaternary digital code of their DNA. A class of genes called "homeobox", present in all multicellular lifeforms, determines only the broadest divisions within the creature's form, like its number of limbs or body segments - all the finer details get determined by these semi-autonomous chemical cycles and various epigenetic factors.
One of the stronger arguments the Creationist and Intelligent Design brigades can muster is that the fierce complication of the way nature looks and operates is too great to all be encoded statically in the finite amount of DNA. But in fact it doesn't have to be all so encoded. Indeed the whole metaphor of a designer working from a total blueprint misses the way that Nature actually works. Nature and evolution are dynamic, non-deterministic systems in which stuff continually happens and affects other stuff, and this couldn't possibly be captured in any static plan. The Deist notion of a God who just pressed the Start button and then withdrew forever is far closer to the truth than any active designer.
Nature's "blueprint" is more like a thick wad of blueprints for tiny clockwork protein machines that, when set to work, rush around interacting with one another and with their external environment, and the end result is all this marvellous beauty and diversity that we see. If you can get your head around the idea of (never the details of) such fantastically complex chains of causality, they are actually far more marvellous than any hypothesised Intelligent Designer. In fact having to invent such a creator, while useful and necessary during the infancy of our species, has nowadays become merely a lazy copout that insults our human ability to understand the world we live in.