Dick Pountain/Idealog 242 /14 September 2014 13:35
One of the inescapable facts of life is that memory worsens as you get older. 40 years ago I could wake in the night with an idea and still remember it next morning. 30 years ago I put a notepad and pencil by my bedside to record ideas. 20 years ago I started trying to use mobile computers to take notes. I last wrote here exactly two years ago about how tablet computing was helping my perpetual quest (I confessed that text files stored on Dropbox worked better for me than Evernote and its many rivals). So why revisit the topic just now? Well, because I just dictated this paragraph into my Nexus 7 using Google Keep and that still feels like fun.
I've played with dictation software for years, right from the earliest days of Dragon Dictate, but always found it more trouble than it was worth and practically unusable. So why is Google Keep any different? Mainly because I was already using it as my principal note-taker, as it syncs between my Nexus and my Yoga laptop with great success. I actually prefer Keep's simplistic "pinboard" visual metaphor to complex products like OneNote and Evernote that offer hierarchical folder structures, and its crude colour-coding scheme is remarkably useful. So when Google announced one day that I could talk into Keep, I tried it and it just worked, transcribing my mumblings with remarkable accuracy and speed. Voice only works on Android, not on Windows, and it doesn't support any fancy editing commands (but who needs them for note-taking?). Does that mean my 30-year quest is over and I've settled on one product? Er, actually no - I now have *three* rival note storage systems working at the same time, can't make a final choice between them and find myself struggling to remember whether I saved that tamale recipe into Keep, OneNote or Pocket. Doh...
The thing is, there are notes and notes. When I get an idea for, say, a future Idealog column, that's only a few dozen words that fit neatly onto a Google Keep "card". I colour these green to spot them more easily, though like everything Google, Keep is search-based so just typing "Ide" brings up all Idealog notes. On the other hand long chunks, or whole articles, clipped from magazines and newspapers stretch Keep's card metaphor beyond its usefulness (and its integration of pictures is rudimentary). For such items OneNote's hierarchical navigation becomes useful to separate different categories of content. Then there are whole web pages I want to save for instant reference, like recipes, maps or confirmation pages from travel sites. In theory I *could* send these straight to OneNote, or even paste them into Keep, but Pocket is way more convenient than either and works equally well from inside Chrome on Nexus, Yoga and phone (Chrome's Send To OneNote app doesn't work properly on my Yoga).
The fundamental problem is perhaps insoluble. Capturing fleeting ideas requires the fastest possible access: no more than one click is tolerable. But to find that idea again later you need structure, and structure means navigation, and navigation means clicks... My current combination is far, far better than any I've tried before - popping up a new Keep note or saving a Pocket page at a click is pretty good - but once I accumulate sufficient data the question of where I stored a particular item *will* become critical. This wouldn't be such a big deal if either Android or Windows 8 searches could see inside these applications, but they can't. Neither tablet search nor the otherwise impressive Win8 search will find stuff inside either Keep or OneNote, which isn't that surprising given that both apps store their data in the cloud rather than locally, and in different clouds owned by different firms who hate each other's guts.
On top of that there's the problem of different abilities within the same app on the different platforms. I've said Keep voice only works on Android, not on Windows (voice input also works in Google Search on Android, and tries to on Yoga but says it can't get permission to use the mike). OneNote on Android can record sound files but can't transcribe them, and though it syncs them with its Windows 8 app, the latter can't record and plays files clunkily in Windows Media Player. In short, it's a mess. Perhaps the paid-for, full version of OneNote is slicker, though I'm not greatly tempted to find out. Perhaps Google will soon enhance the abilities of Keep *without* rendering it slow and bloated. Perhaps there is a Big Rock Candy Mountain and somewhere among its lemonade fountains there's a software start-up that gets it about note taking...
Tuesday, 17 February 2015
OOP TO SCRATCH
Dick Pountain/Idealog 240/05 August 2014 10:34
So, 20 years of this column now, and so far I haven't run out of ideas. Of course one good way to achieve that is to keep banging on about the *same* ideas, and I plead guilty to that with a broad grin. My very first Idealog column in 1994 was entitled "OOPS Upside Your Head" (that prehistoric disco reference will be lost on the youth of today) and it expressed my faith in the long-term potential of object-oriented programming, along with my disgust at the misuse of the term by marketeers and the maladroit/cynical implementations by leading vendors like Microsoft. This 240th column is about object-oriented programming too, largely because I recently encountered a curious little OOP product that has renewed that faith.
The product is called Scratch and is intended to teach programming to very young children. At first sight it looks like Lego, thus neatly linking the topics of two of my other recent columns. You build programs by dragging different coloured "blocks" into a window, where they will only fit together in certain ways decided by their shapes, thus removing at a stroke the possibility of making syntax errors (probably the biggest source of frustration for beginners). Some of these blocks are control structures, some are data objects and some are actions, and what powerful actions they are: a complete multimedia toolkit that lets you create animations with sound (warning: requires Flash) remarkably simply.
You'll not be too surprised to learn that Scratch was developed at MIT's Media Lab - started in 2003 by a team under Mitchel Resnick - and was originally implemented in a modern dialect of Smalltalk. The latest version 2 is written in Adobe's ActionScript. It's free to download from scratch.mit.edu and you use it online via your browser (though you can store the source for your projects locally too). Scratch is most certainly not suitable for professional development as it can only handle small, visually-oriented projects, but what grabbed me forcefully are the principles it demonstrates.
Scratch is based around a little-used style of object-orientation that employs cloning instead of class-based inheritance. Everything you build is an object called a "sprite" which contains all its own code. Every sprite is executable (it doesn't have to be visual) and one of the things it can do is spawn clones of itself at run-time. Clones behave like their parent, but have their own identity and all run concurrently: they're automatically destroyed and garbage-collected once execution ends. Scratch is also event-driven and supports events like keypresses, mouse clicks, sound-level thresholds, and message-passing between different sprites and their clones.
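For readers who'd rather see what that feels like in text than in blocks, here's a minimal sketch in Ruby (my habitual tinkering language, and nothing to do with Scratch's own implementation) of prototype-style, clone-based objects; the "sprite" here is purely hypothetical, but note that behaviour hangs off a parent object rather than a class, each clone keeps its own identity, and the threads crudely mimic Scratch's concurrency:

# prototype-style objects: no class hierarchy, just a parent and its clones
parent = Object.new

def parent.step                           # behaviour attached directly to the object
  "sprite #{object_id} takes a step"
end

clones = Array.new(5) { parent.clone }    # spawn clones at run-time

# each clone shares the parent's behaviour but reports its own object_id,
# and each one runs in its own thread
threads = clones.map { |c| Thread.new { puts c.step } }
threads.each(&:join)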
My first impression of Scratch was so Toy Town that I almost walked away, but then the old Byte training kicked in and nagged me to write The Sieve of Eratosthenes benchmark. It took me half an hour to grasp the principles, the program stretched to all of 10 "lines", looked like a colourful Lego picture, and required a leisurely 40 seconds to find the first 10000 primes. I rapidly knocked out some other old chestnuts like Fibonacci and Factorial to convince myself Scratch could do maths, then had the brainwave of reviving an abandoned Ruby project I called Critters, an animated ecosystem in which various bacteria-like species swim around eating each other, recognising their preferred prey by colour. I'd scrapped my Ruby version when the graphics became too tedious, but Scratch got an impressive version working inside an evening, thanks to predefined blocks that detect whether one sprite or colour is touching another, and a built-in sprite editor to draw the critters. That done, my other old favourite - an animated real-time bank queue simulation - submitted equally gracefully.
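For anyone curious what that benchmark actually computes, here's the same job sketched in Ruby rather than blocks; the 105,000 limit is my own assumption, chosen to sit just above the 10,000th prime (104,729):

limit = 105_000
sieve = Array.new(limit + 1, true)
sieve[0] = sieve[1] = false
(2..Integer(Math.sqrt(limit))).each do |i|
  next unless sieve[i]
  (i * i).step(limit, i) { |j| sieve[j] = false }   # cross off multiples of each prime
end
primes = (2..limit).select { |i| sieve[i] }
puts primes.take(10_000).last                       # => 104729, the 10,000th prime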
Scratch has several, deliberate, limitations. It supports lists and parameterised procedures, but neither is a first-class object that you can pass as a parameter, which limits the level of abstraction you can achieve. Everything is tangible, thus overcoming another steep obstacle faced by novices (at the cost of failing to teach them abstraction). The only I/O is export and import of text files into lists (surprisingly usable) and the ability to talk to Arduino and Lego WeDo controller boards. While reading up about Scratch I discovered that a group at Berkeley has created a derivative called Snap! which extends Scratch by introducing first-class lists and local procedure variables. I duly tried it and it works well, but to my own amazement I actually prefer the challenges that Scratch poses to an experienced programmer! In our programming world of 2014, every development tool from C++ through JavaScript to Python employs OOP, and I no longer need to defend the technique, but Scratch looks to me like by far the most fun way to teach it.
[Dick Pountain had rather hoped that his second childhood might no longer involve computers - dammit!]
Sunday, 11 January 2015
THE NUMBER OF EVERYTHING
Dick Pountain/Idealog 240 /09 July 2014 11:45
My interest in computing has always been closely connected to my love of maths. I excelled in maths at school and could have studied it instead of chemistry (where would I be now?) My first experience of computing was in a 1960 school project to build an analog machine that could solve sixth-order differential equations. I used to look for patterns in the distribution of primes rather than collect football cards - you're probably getting the picture. I still occasionally get the urge to mess with maths, as for example when I recently discovered Mathlab's marvellous Graphing Calculator for Android, and I'll sometimes scribble some Ruby code to solve a problem that's popped into my head.
Of course I've been enormously pleased recently to witness the British establishment finally recognising the genius of Alan Turing, after a disgracefully long delay. It was Turing, in his 1936 paper on computable numbers, who more than anyone forged the link between mathematics and computing, though it's for his crucial wartime cryptography that he's remembered by a wider public. While Turing was working on computable numbers at King's College Cambridge, a college friend of his, David Champernowne, another mathematical prodigy, was working on something rather different that's recently come to fascinate me. Champernowne soon quit maths for economics; studied under John Maynard Keynes; helped organise aircraft production during WWII; in 1948 helped Turing write one of the first chess-playing programs; and then wrote the definitive book on income distribution and inequality (which happens to be another interest of mine and is how I found him). But what Champernowne did back in 1933 at college was to build a new number.
That number, called the Champernowne Constant, has some pretty remarkable properties, which I'll try to explain here fairly gently. The number is very easy to construct: you could write a few million decimal places of it this weekend if you're at a loose end. In base 10 it's just zero, a decimal point, followed by the decimal representations of each successive integer concatenated, hence:
0.12345678910111213141516171819202122232425262728293031....
It's an irrational real number whose representation goes on for ever, and it's also transcendental (like pi) which means it's not the root of any polynomial equation. What most interested Champernowne is that it's "normal", which means that each digit 0-9, and each pair, triple and so on of such digits appear in it equally often. That ensures that any number you can think of, of whatever length, will appear somewhere in its expansion (an infinite number of times actually). It's the number of everything, and it turns out to be far smaller (if somewhat longer) than Douglas Adams' famous 42.
Your phone number and bankcard PIN, and mine, are in there somewhere, so it's sort of like the NSA's database in that respect. Fortunately though, unlike the NSA, they're very, very hard to locate. The Unicode-encoded text of every book, play and poem ever written, in every language (plus an infinite number of versions with an infinite number of spelling mistakes) is in there somewhere too, as are the MPEG4 encodings of every film and TV programme ever made (don't bother looking). The names and addresses of everyone on earth, again in Unicode, are in there, along with those same names with the wrong addresses. Perhaps most disturbingly of all, every possible truncated approximation to Champernowne's constant itself should be in there, an infinite number of times, though I'll confess I haven't checked.
Aficionados of Latin-American fiction will immediately see that Champernowne's constant is the numeric equivalent of Jorge Luis Borges' famous short story "The Library of Babel", in which an infinite number of librarians traipse up and down an infinite spiral staircase connecting shelves of random texts, searching for a single sentence that makes sense. However Champernowne's is a rather more humane construct, since not only does it consume far less energy and shoe-leather, but it also avoids the frequent suicides -- by leaping down the stairwell -- that Borges imagined.
A quite different legend concerns an Indian temple at Kashi Vishwanath, where Brahmin priests were supposed to continually swap 64 golden disks of graded sizes between three pillars (following the rules of that puzzle better known to computer scientists as the "Tower of Hanoi"). When they complete the last move of this puzzle, it's said the world will end. It can be shown that, since the puzzle requires 2^64 - 1 moves, priests of average agility (call it one move per second) will take around 585 billion years, but we could remove even that small risk by persuading them to substitute instead a short Ruby program that builds Champernowne's constant (we'll need the BigDecimal module!) to be left running on a succession of PCs. Then we could be absolutely certain that while nothing gets missed out, the end will never arrive...
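For the record, here's a minimal sketch of the sort of program I have in mind, using plain strings to generate the digits (BigDecimal only becomes necessary if you want to treat the result as a number rather than text); deleting the break line gives the priests their never-ending task:

champernowne = "0."
n = 1
loop do
  champernowne << n.to_s            # bolt the next integer onto the expansion
  n += 1
  break if n > 31                   # delete this line for the never-ending version
end
puts champernowne                   # => 0.12345678910111213...3031
puts champernowne.index("2014")     # every number turns up eventually (nil if not generated yet)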
I, ROBOT?
Dick Pountain/ Idealog 239/ 06 June 2014 09:56
Like many males of my generation I grew up fairly well-disposed toward the robot. Robby the Robot filmstar was all the rage when I was 11, and Asimov's Laws of Robotics engaged my attention as a teenaged sci-fi reader. By the time I became involved in publishing underground comics in the early 1970s the cuteness was wearing off robots, but even so the threat was moderated by humour. The late Vaughn Bodé - nowadays beloved by all the world's graffiti artists - drew a strip called "Junkwaffel" that depicted a world cleansed of humans but gripped in permanent war between foul-mouthed, wise-cracking robot soldiers. In some ways these were the (far rougher) prototypes of R2-D2 and C-3PO.
Out in the real world robots started to appear on factory production lines, but they were doing those horrible jobs that humans shouldn't do, like spraying cellulose paint, and humans were still being employed to do the other stuff. When I got involved in computer programming myself I was drawn toward robotics thanks to an interest in Forth, a language originally invented to control observatory telescopes and ideally suited to robot programming. The problems of robots back then were all about *training* them to perform desired motions (as opposed to spelling those motions out in X, Y, Z coordinates) and building in enough intelligence to give them more and more autonomy. I still vividly remember my delight when a roboticist friend at Bristol Uni showed me robot ducklings they'd built that followed each other just like the real thing, using vision alone.
Given this background, it will come rather hard to have to change my disposition toward the robot, but events in today's world are conspiring to force me to do just that. While reading a recent issue of New Scientist (26 April 2014), I was struck by two wholly unrelated articles that provide a powerful incentive for such a change of attitude. The first of these involved the Russian Strategic Missile Force, which has for the first time deliberately violated Asimov's main law by building a fully-autonomous lethal robot that requires no permission from a human to kill.
The robot in question is a bit aesthetically disappointing in that it's not even vaguely humanoid-looking: it looks like, indeed *is*, a small armoured car on caterpillar tracks that wields a 12.7mm heavy machine gun under radar, camera and laser control. It's being deployed to guard missile sites, and will open fire if it sees someone it doesn't like the look of. I do hope it isn't using a Windows 8 app for a brain. Whatever your views on the morality of the US drone fleet, it's important to realise that this is something quite different. Drones are remotely controlled by humans, and can only fire their weapons on command from a human, who must make all the necessary tactical and moral decisions. The Russian robot employs an algorithm to make those decisions. Imagine being held up at gunpoint by Siri and you'll get the difference.
However it was the other article that profoundly upset my digestive system, an interview with Andrew McAfee, research scientist at MIT's Center for Digital Business. Asked by interviewer Niall Firth "Are robots really taking our jobs?", McAfee replied with three possible scenarios: first, that robots will take jobs in the short term, but a new equilibrium will be reached as it was after the first Industrial Revolution; second, that they'll replace more and more professions and massive retraining will be essential to keep up; third, the sci-fi-horror scenario where robots can perform almost all jobs and "you just don't need a lot of labour". He thinks we'll see scenario three in his lifetime (which I hope and trust will be longer than mine).
It was when he was then asked about any possible upside that my mind boggled and my gorge rose: the "bounty" he saw arising was a greater variety of stuff of higher quality at lower prices, and most importantly "you don't need money to buy access to Instagram, Facebook or Wikipedia". That's just as well really, since no-one except the 0.1% who own the robots will have any money. On that far-off day I foresee, when a guillotine (of 3D-printed stainless steel) has been erected outside Camden Town tube station, McAfee may still be remembered as a 21st-century Marie Antoinette for that line.
The bottom line is that robots are still really those engaging toys-for-boys that I fell for back in the 1950s, but economics and politics require the presence of grown-ups. Regrettably the supply of grown-ups has been dwindling alarmingly since John Maynard Keynes saved us from such imbecilities the last time around. If you're going to make stuff, you have to pay people enough to buy that stuff, simples.
THE JOY OF CODING?
Dick Pountain/ Idealog 238/ 08 May 2014 19:30
I've admitted many times in this column that I actually enjoy programming, and mostly do it for fun. In fact I far prefer programming to playing games. Given my other interests, people are often surprised that I don't enjoy chess, but the truth is that the sort of problems it creates don't interest me: I simply can't be bothered to hurt my brain thinking seven moves ahead when all that's at stake is beating the other guy. I did enjoy playing with Meccano as a kid, and did make that travelling gantry crane. I can even imagine the sort of satisfaction that might arise from building Chartres Cathedral out of Lego, though having children myself rendered me phobic about the sound of spilling Lego bricks (and the pain of stepping on one in bare feet). But programming is the ultimate construction game, where your opponent is neither person nor computer but the complexity of reality itself.
Object-oriented programming is especially rewarding that way. You can simulate anything you can imagine, describe its properties and its behaviours, then - by typing a single line of code - create a thousand (or a million) copies of it and set them all working. Then call that whole system an object and create a hundred copies of that. It's all stuff you can't do in the heavy, inertial, expensive world of matter: making plastic bits and pieces by 3D printing may be practical, even useful, but it lacks this Creator of Worlds buzz.
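To make that concrete, here's a throwaway sketch in Ruby (the Critter class is just an invention for illustration); the single line near the end really does conjure up a thousand of them and set them all working:

class Critter
  def initialize
    @x, @y = rand(100), rand(100)    # drop each critter somewhere random
  end

  def swim
    @x += rand(-1..1)                # drift one step in a random direction
    @y += rand(-1..1)
  end
end

swarm = Array.new(1_000) { Critter.new }   # a thousand copies from one line of code
swarm.each(&:swim)                         # set them all working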
Since I'm so besotted by programming as recreation, I must surely be very excited by our government's "Year of Code" initiative, which aims to teach all our children how to write programs - or about "coding" as the current irritating locution would have it? Actually, no I'm not. I'm perfectly well aware that my taste for programming as recreation is pretty unusual, very far from universal, perhaps even eccentric, a bit like Base Jumping or worm farming. The idea that every child in the country is going to develop such a taste is ludicrous, and that rules out coding for pleasure as a rationale. It will most likely prove as unpleasant as maths to a lot of kids, and put them off for life.
But what about "coding" as job skill, as vital life equipment for gaining employment in our new digital era? Well there's certainly a need for a lot of programmers, and the job does pay well above average. However you can say much the same about plumbers, electricians and motor mechanics, and no-one is suggesting that all children should be taught those skills. The aim is to train school teachers to teach coding, but it makes no more sense for every child to learn programming than it does to wire up a ring-main or install a cistern. Someone who decides to pursue programming as a profession needs solid tuition in maths and perhaps physics, plus the most basic principles of programming like iteration and conditionality which ought to be part of the maths curriculum anyway. Actual programming in real languages is for tertiary education, not for the age of five as the Year of Code seeks.
The whole affair reeks of the kind of gimmicky policy a bunch of arts and humanities graduates, clueless about technology, might think up after getting an iPad for Christmas and being bowled over by the wonderful new world of digital communications. Their kids, via self-tuition, probably already know more about "coding" than they do. However there are those who detect a more sinister odour in the air. For example Andrew Orlowski, curmudgeon-in-chief at The Register, has pointed out a network of training companies and consultants who stand to make big bucks out of the Year of Code, in much the same way firms did during the Y2K panic: they include venture capital company Index Ventures, which has Year of Code's chairman Rohan Silva as its "Entrepreneur in Residence", and US training company Codecademy. Organisations that are already reaching children who are actually interested in programming, like the Raspberry Pi foundation, appear to be sidelined and cold-shouldered by the hype merchants: the foundation's development director Clive Beale has claimed that "The word 'coding' has been hijacked and abused by politicians and media who don't understand stuff".
Personally I'd love to believe all children could be taught to gain as much pleasure from programming as I do, but it's unlikely. Like singing, dancing, drawing or playing football, some can do it and like it, others can't and won't, and the notion that everyone has to be able to do it for the sake of the national economy has unpleasantly Maoist undertones, with backyard code foundries instead of steelworks.
Tuesday, 23 September 2014
SOMETHING WENT WRONG
Dick Pountain/ Idealog 237/ 07 April 2014 11:48
I'm writing this column on a new computer on 8th April, which may or may not come to be known as Save Microsoft's Ass Day. For the benefit of any members of exotic tribes, like OSX or Linux users, it's the day on which the major update to Windows 8.1 is being released, the fate of which might determine the fate of Microsoft. I don't have the update myself, and will be waiting to see whether it bricks everyone's PCs before I indulge (in itself a testament to the esteem in which MS is currently held).
But, I hear you thinking, didn't he say just a few columns ago that he may never buy another Windows PC? He did indeed, but then he succumbed to an unforgivable fit of nostalgia and misplaced loyalty and did precisely that. No sooner had I written that previous column than the hard disk on my trusty 7-year-old Vaio began to show symptoms of approaching retirement, and it became wise to put my money where my mouth had been. I shopped around, was sorely tempted to "go commando" with an Asus Transformer, and devilishly tempted by the sheer beauty of Google's hi-def Chromebook, but in the end I stuck with Windows for a combination of pragmatic and sentimental reasons. I'm already beginning to regret it.
The pragmatic reason was that I don't completely trust the cloud. I'm happy enough to exploit it and have done business from foreign shores using only my Nexus 7 and an internet connection, but I want a local copy of all my significant data too. Android can sort of hack that, but it's not what it was designed for (the fact that a file manager is a third-party app gives you the clue). The sentimental reason was the 20 years I've spent fiddling with, tweaking, boosting, wrestling, writing software for, swearing at and rescuing Windows. Hate it, certainly, but it was also great mental exercise on a par with playing chess or planning a guerilla war. So I caved, bought a Lenovo Yoga 2 running Windows 8 and plunged ahead into the quagmire. I resolved to upgrade immediately to Windows 8.1 before getting too settled (ha!) so went to the Store, only to find no upgrade there. A traipse through the forums revealed that it won't be visible on some PCs until you manually apply two other upgrades called K123blah and K123blech. So far so hostile, in fact downright unprofessional, by both MS and Lenovo.
With 8.1 in place I started to investigate the Metrosexual interface and found I didn't mind it so much as many other commentators, since I'm now totally attuned to Android and touch. Tiles make quite a good substitute for the Start Menu I never used, having always preferred desktop icons. Things I do most - mail, calendar, writing, reading, Googling and note-taking - all fit onto the first Start screen, always available via the Windows key as I work in desktop view. But irritation set in once I discovered there aren't any official "Modern Interface" versions of most of the programs I use (like Gmail, Blogger, Flickr, YouTube, iPlayer). You can fiddle this by viewing them in a browser window and making tiles from their URLs, if you don't mind using Internet Explorer, which I do mind. Using Firefox, as I used to, you can't (and in any case it runs too slowly to be usable). Using Chrome, as I now do, it's hidden under a menu that another traipse through the forums revealed. Then one by one, those tiled Win8 apps I could find started to break down. It happened to the Facebook app, to a calendar app I paid for (admittedly only pence) which no longer logs into Google, and to an icon editor that no longer saves its files. What's really nice though is that to avoid giving anxiety and offence (a cardinal sin in the modern world) software vendors are adopting a new, softer and content-free error message: "Something Went Wrong". No shit Sherlock.
As I edit Jon Honeyball's column each month I quail before his volcanic wrath against the Windows 8 ecosystem, but I now realise Jon has actually been pretty moderate: the quality of apps in the Windows Store is so abysmal it actually makes me nervous, like wandering down the wrong alley in an unfamiliar city and seeing people lurk in dark doorways. Win8 apps now cause me the same unease I feel whenever forced to use a Macintosh, a loss of control and bewilderment at its sheer opacity. Fortunately the Google Apps button in Chrome lets me turn my cute little Yoga into something resembling the Chromebook I almost bought! Windows 8.1 update will need to be *really, really* something...
Sunday, 10 August 2014
THOSE PESKY ATOMS
Dick Pountain/PC Pro/Idealog 236 06/03/2014
Foot-to-ground contact is pretty important to us motorcyclists so we get picky about our boots. I favour an Australian stockman's style that can pass for a fashionable Chelsea Boot in polite society. Having worn out my second pair of R.M.Williams after 15 years, yesterday I went shopping for new ones. I checked Russell & Bromley and John Lewis on the web, then set off to town to try some on. Why didn't I buy online? Because I need to try boots on, and you can neither upload your feet nor download boots over the web, which still only handles bits, not pesky atoms. I'd never consider buying boots from Amazon, though I did go there after my purchase to snivel quietly about the £8 I could have saved...
Russell B's lovely boots were too narrow for my broad feet and John Lewis didn't have the ones advertised on their website, so I ended up buying Blundstones (which are fab and half the price of R.M.Williams) from the Natural Shoe Store. Later that day I realised there's a moral to this gripping tale, as I was reading John Lewis's announcement of its massive new IT project: an Oracle-based ERP (Enterprise Resource Planning) system that will completely integrate the firm's online and physical stores, including all the Waitrose grocers. No cost was quoted for this four-year project - scheduled to run in 2017 - but it will certainly be both expensive and risky.
Manufacturers have only a slightly better record than public-sector institutions when it comes to screwing up big IT: in recent years major corporations from Avon to Hershey and Levi's Jeans have lost fortunes botching or cancelling ERP projects. If anyone can pull it off it might be John Lewis, whose overall competence is renowned. It first pioneered Click & Collect, where you choose a product on the website and then collect it from your nearest store, though all its competitors now do the same. But C&C is only one permutation people use to bridge the bits/atoms gap. Some folk research products in the bricks-and-mortar store, then go home and order from the website. Some fear online shopping and prefer to order by phone from a human. As for browsing the site, they might use a mobile, tablet or a PC. Hence the new buzzword is "omni-channel", and it matters enormously because all of these modes of e-commerce will fail - like my boot purchase - if stock in stores isn't accurately reflected on the website. That demands a whole new level of integration of stock-control and delivery systems, which for a grocery operation like Waitrose that delivers perishable foodstuffs will be ferociously hard. The new project is ambitious indeed.
This is clearly the new frontline of online retailing. There are more and more items like TV sets, clothes, shoes, high-end acoustic musical instruments, possibly furniture and fabrics, that people won't be satisfied to buy from a purely online outlet like Amazon but need to see and touch before choosing. Admittedly a lot of people go to bricks-and-mortar stores to browse, then go home and buy from Amazon, but the stores are getting wise to this. I imagine that John Lewis's new system, assuming it works, is intended to make it so easy to buy-as-you-handle that you won't want Amazon. Meanwhile Amazon and Google are both leaking weirdly futuristic plans for delivering atoms rather than bits independently of the Post Office or courier service. Amazon's vision involves flocks of quadcopter drones, delivering your purchases down the chimney like the storks of legend. Google, with its feet more firmly on the ground, buys up robotics firms: I particularly like their galloping headless-heifer robot, which would make quite a stir as it rumbled round the corner into our street towing a sofa (especially if chased by a squawking flock of quadcopters... )
Omens are gathering that the power of Silicon Valley giants has peaked, just as the oil, coal and railway barons' power did in the 1900s: even the US Right is getting chippy about the amount of tax they avoid (which means taking more tax from civilians); among Democrats there are populist stirrings about their almost-jobless business model and exploitation of interns; and the San Francisco bus protests are seriously tarnishing their public image. And all that Kurzweilian Transhumanist/Matrix/Singularity nonsense looks more and more like a religious cult, a post-modern reinvention of a Protestant Millennium. We might spend a lot of time watching movies and listening to music in bit-land but we're never going to live there full-time because we're made of atoms, we eat atoms, breathe atoms and wear atoms. And bricks-and-mortar shops have a head start when it comes to distributing atoms in bulk: just watch them start the fight back.