Saturday 12 October 2019

IT’S COMPLICATED


Dick Pountain/ Idealog 296/ 4th March 2019 13:41:22

I’ve often professed here my liking for simplicity, particularly in the design of software, but more generally in the design of objects. I’m just a Bauhaus sort of guy. That doesn’t mean that I hate, or fear, complexity. On the contrary, I know the universe is complex and I consider complexity theory an important area of mathematical philosophy, though one neither widely nor well understood.

It’s important for us in the IT business because, used properly, it can save us from wasting effort trying to solve some problem with an algorithm that doesn’t stand a chance. But it’s also philosophically important because it places bounds on what it’s possible to know, and can rule out certain kinds of nonsense a priori. The thing is, complexity theory is quite hard both to grasp and to explain, involving as it does the concepts of true randomness and infinity. I know this because I’ve recently tried to explain it, and found it really hard work.

One of my simple pleasures is walking on Hampstead Heath, occasionally accompanied by an old friend, a semi-retired professor in a social science. We typically end up in deep analytical chats, over pints, about the state of the world, and recently he asked out of the blue for an explanation of complexity theory. His colleagues and students had begun using ‘complexity’ as a buzzword in their publications, in much the way chaos theory was thrown around a few decades ago, and he suspected this might be poorly-digested bullshit.

I recommended a couple of books, but back home I looked and found the first hopelessly out-of-date, while the other I barely understood myself, though I thought I did 30 years ago. So I set about trying afresh. One mathematical kind of complexity, studied as algorithmic information theory, is about the resources required for the execution of algorithms. This was pioneered in the 1960s in crucial papers by Andrey Kolmogorov and Gregory Chaitin, and at its heart is the definition of randomness. You can’t predict exactly the next character in a random string like “asdwebqwgastytinfdebfbwwvefwramk”, so there’s no description shorter than quoting the whole string. On the other hand a string like “okokokokokokokokokokokokokokok” has a shorter definition, namely “repeat ‘ok’ 15 times”. Such shorter definitions are algorithms, but this isn’t really what my friend was looking for.
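Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a rough upper bound on it, which makes the point concrete. A minimal sketch in Python (using zlib as a stand-in compressor; the two strings are the column's own examples):

```python
import zlib

def compressed_size(s: str) -> int:
    """Length of the zlib-compressed string: a crude upper bound
    on its Kolmogorov complexity."""
    return len(zlib.compress(s.encode()))

random_like = "asdwebqwgastytinfdebfbwwvefwramk"  # no obvious pattern
repetitive = "ok" * 15                            # "repeat 'ok' 15 times"

# The repetitive string squeezes well below its raw length, because the
# compressor finds the short description; the random-looking one does not.
print(len(random_like), compressed_size(random_like))
print(len(repetitive), compressed_size(repetitive))
```

The compressor is only a proxy: a truly random string admits no shorter description at all, whereas zlib merely fails to find one.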

A less abstract approach to complexity might be physical, via cause and effect. If I let go of this glass vase it will fall on the hard kitchen floor and shatter, thanks to gravity, a fairly simple cause. I like the vase though, so a second causal chain might start in the neural circuits that make up my brain, forming the intention “I’m going to repair it”. This sends messages to my limbs to sweep up the bits and deposit them on the kitchen table. Lots of them, small, more-or-less triangular shards. Can I in principle fit them all back together like a jigsaw and super-glue them? I don’t know the answer, and it might in fact be unknowable, because the number of comparisons that need to be made, and the time they would consume, grows exponentially with the number of shards.
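The growth the column appeals to can be made concrete with a toy calculation of my own (not the author's): just counting the possible orders in which n shards could be tried gives n!, which outruns any fixed exponential:

```python
import math

def orderings(n: int) -> int:
    """Number of distinct sequences in which n shards could be tried:
    n!, which grows faster than any fixed exponential."""
    return math.factorial(n)

# A handful of shards is tractable; a dustpan-full is not, even in principle.
for n in (4, 10, 20, 40):
    print(n, orderings(n))
```

This ignores geometry entirely (real shards rule most pairings out at a glance), so it overstates the search space, but the shape of the growth is the point.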

Or here’s another stab. You’re given a string of bits and told to guess whether the next bit will be a 0 or a 1. Let’s distinguish three different scenarios:

a) The string might consist of all 0s, so that after a few hundred you suspect the next will be 0 too. This scenario might arise because some electronic device generating the bits is broken or turned off.
b) The string may be truly random, generated by repeated tosses of a fair coin, so however many bits you examine the chances of guessing the next correctly remain exactly 50:50.
c) The string might represent something, for example the letter “A” in this digital font. The distribution of 1s is no longer random: long stretches of all 1s correspond to the black parts, stretches of all 0s correspond to white space between adjacent characters.

One hundred bits of each string contain the same amount of Shannon information, but for everyday purposes a) and b) are less informative than c). String a) typifies nothingness, brokenness, non-existence. String b) typifies dissolution or decomposition. You could see both as different ways to represent formlessness or death. The most interesting things in the universe, living beings, can be represented and reproduced by type c) strings, stretches of repetition that convey form put there by the expenditure of energy to temporarily and locally lower their entropy. Complexity arises when a living creature, a bundle of proteins, nucleotides, carbohydrates, uses its sensory organs to sample the outside world and try to figure out what to do next.
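The three scenarios can be made concrete with toy strings of my own (not the column's). A first-order entropy estimate distinguishes them, even though each is 100 bits long:

```python
import math
import random
from collections import Counter

def shannon_bits_per_symbol(s: str) -> float:
    """Empirical first-order Shannon entropy of a string, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

all_zeros = "0" * 100                                          # scenario (a)
coin_flips = "".join(random.choice("01") for _ in range(100))  # scenario (b)
glyph_like = ("1" * 15 + "0" * 5) * 5  # scenario (c): hypothetical run
                                       # pattern, loosely like scan lines

print(shannon_bits_per_symbol(all_zeros))   # 0.0: perfectly predictable
print(shannon_bits_per_symbol(coin_flips))  # close to 1 bit per symbol
print(shannon_bits_per_symbol(glyph_like))  # between the two
```

A first-order estimate only counts symbol frequencies; it misses the run structure of scenario (c), which is exactly what a compressor, or a reader, would exploit.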

[Dick Pountain actually refuses to super-glue broken crockery in more than four pieces]

