Tuesday, 3 July 2012

GO NERDS, GO

Dick Pountain/PC Pro/Idealog 202/11/05/2011

I've been a nerd, and proud of it, more or less since I could speak. I decided I wanted to be a scientist at around 9 years old, and loathed sport at school with a deep passion. (During the 1960s I naively believed the counterculture was an alliance of everyone who hated sport, until all my friends came out as closet football fans in 1966). However my true nerdly status was only properly recognised a week ago when, totally frazzled by wrestling with Windows 7 drivers, for a diversion I clicked on a link to www.nerdtests.com and filled in the questionnaire. It granted me the rank of Uber Cool Nerd King, and no award has ever pleased me more (a Nobel might, but I fear I've left that a bit too late).

So what exactly constitutes nerdhood? To Hollywood a nerd is anyone in thick-rimmed spectacles with a vocabulary of more than 1000 words, some of which have more than three syllables - the opposite of a jock or frat-boy. What a feeble stereotype. In the IT world a nerd is anyone who knows what a .INF file is for and can use a command prompt without their hands shaking, but that's still a bit populist for an Uber Cool Nerd King. "Developers" who know at least four programming languages might be admitted to the lower echelons, but true nerd aristocracy belongs only to those with a deep knowledge and love of programming language *design*. If you've ever arm-wrestled someone to settle a dispute over late versus early binding, you might just be a candidate.

In the late 1980s and early '90s I steeped myself in programming language design. I could write 14 different languages, some leading edge like Oberon, Occam, Prolog and POP-11. I wrote books on object-orientation and coded up my own recursive-descent parsers. I truly believed we were on the verge of making programming as easy and fun as building Lego models, if only we could combine Windows' graphical abilities with grown-up languages that supported lists, closures, concurrency, garbage collection and so on. That never happened because along came the Web and the Dot Com boom, and between them they dumbed everything down. HTML was a step backward into the dark ages so far as program modularity and security were concerned, but it was simple and democratic and opened up the closed, esoteric world of the programmer to everybody else. If you'd had to code all websites in C++ then the Web would be about one millionth the size it is today. I could only applaud this democratic revolution and renounce my nerdish elitism, perhaps for ever. Progress did continue in an anaemic sort of way with Java, C#, plus interpreted scripting languages like Python and Ruby that modernised the expressive power of Lisp (though neither ever acquired a tolerable graphics interface).

But over the last few years something wonderful has happened, a rebirth of true nerdhood sparked by genuine practical need. Web programming was originally concerned with page layout (HTML, Javascript) and serving (CGI, Perl, ASP.NET, ColdFusion), then evolved toward interactivity (Flash, Ajax) and dynamic content from databases (MySQL, PostgreSQL, Drupal, Joomla). This evolution spanned some fifteen-odd years, the same years that new giant corporations like Google, eBay and Amazon began to face data processing problems never encountered before because of their sheer scale. Where once the supercomputer symbolised the bleeding edge of computing - thousands of parallel processors on a super-fast bus racing for the Teraflop trophy - nowadays the problem has become to deploy hundreds of thousands of processors, not close-coupled but scattered around the planet, to retrieve terabytes of distributed data fast enough to satisfy millions of interactive customers. A new class of programming languages is needed, designed to control mind-bogglingly huge networks of distributed processors in efficient and fault-tolerant fashion, and that need has spawned some profoundly nerdish research.

Languages like Erlang and Scala have resurrected the declarative programming style pioneered by Lisp and Haskell, which sidesteps many programming errors by deprecating variable assignment. Google has been working on the Go language to control its own huge processor farms. Designed by father-of-Unix Ken Thompson, Rob Pike and Java-machine specialist Robert Griesemer, Go strips away the bloat that's accumulated around object-oriented languages: it employs strong static typing and compiles to native code, so is extremely efficient for system programming. Go has garbage collection for security against memory leaks but its star feature is concurrency via self-synchronising "channels", derived (like Occam's) from Tony Hoare's seminal work on CSP (Communicating Sequential Processes). Channels are pipes that pass data between lightweight concurrent processes called "goroutines", and because they're first-class objects you can send channels through channels, enabling huge network programs to reconfigure their topology on the fly - for example to route around a failed processor or to balance a sudden load spike. Google has declared Go open-source and recently released a new version of its App Engine that offers a Go runtime in addition to supporting Java and Python. I had sworn that Ruby would be my final conquest, but an itching under my crown signals the approach of a nerd attack...
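For the terminally curious, here's roughly what that looks like in practice - a minimal sketch of my own devising, not Google's code: a few goroutines pulling work from a shared channel, each job carrying its own reply channel, so a channel literally travels through a channel.

```go
// A toy illustration of goroutines and first-class channels.
// All names here (job, worker) are my own, for illustration only.
package main

import "fmt"

// job carries a number to square plus the channel to send the answer back on -
// a channel passed through a channel, the feature that lets big Go programs
// rewire their own topology on the fly.
type job struct {
	n     int
	reply chan int
}

// worker squares whatever arrives on jobs and answers on each job's own reply channel.
func worker(jobs <-chan job) {
	for j := range jobs {
		j.reply <- j.n * j.n
	}
}

func main() {
	jobs := make(chan job)

	// Spin up three lightweight goroutines sharing one jobs channel.
	for i := 0; i < 3; i++ {
		go worker(jobs)
	}

	// Send ten jobs, each with its own reply channel, and collect the results.
	for n := 1; n <= 10; n++ {
		reply := make(chan int)
		jobs <- job{n: n, reply: reply}
		fmt.Println(n, "squared is", <-reply)
	}
	close(jobs)
}
```

Scale that pattern up from three goroutines to a few hundred thousand scattered across data centres and you have a flavour of what Go was designed for.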

[Dick Pountain decrees that all readers must wear white gloves and shades while handling his column, and must depart walking backwards upon finishing it.]
