Thursday 9 June 2016

COMMAND AND CONTROL

Dick Pountain/ Idealog 257/ 04 December 2015 09:34

During those awful last weeks of November 2015, with the bombing of a Russian passenger jet, the Paris shootings and the acrimonious debate over UK airstrikes in Syria, it was very easy to overlook a small but important news story, an Indonesian report into the loss of AirAsia flight QZ8501. In December 2014 that plane crashed into the Java Sea with the loss of all 162 passengers and crew, and recovered "black box" data has enabled investigators to come to a firm, but disturbing, conclusion about the cause. It was what you might call software-assisted human error.

A broken solder joint in the Airbus A320-200's rudder travel limit sensor (probably frost damage) sent a series of error signals that caused the autopilot to turn itself off and forced the crew to take back manual control. Flustered by this stream of messages the flight crew made a fatal error: the captain ordered the co-pilot, who had the controls, to "pull down", intending to reduce altitude. It was an ambiguous command, which the co-pilot misinterpreted by *pulling* back on the stick, sending QZ8501 soaring up to its maximum height of 38,000 feet followed by a fatally irretrievable stall. The report recommends that international aviation authorities issue a new terminology guide to regularise commands in such emergencies, but I reckon this was a problem of more than just the words. Like all modern jets the Airbus A320 is fly-by-wire, with only electronic links from stick to control surfaces, and in current designs there's little mechanical feedback through the stick ("haptic" feedback is planned for the next generation, due from 2017). I'd guess the co-pilot received few cues to the enormity of his error by way of his hand on the stick. It's not enough to *be* in control, you have to *feel* in control.

I've been intrigued by the psychology of man-machine interaction ever since I saw ergonomic research by IBM, back in the '80s, which showed that any computer process that takes longer than four seconds to complete without visual feedback makes a user fear it's broken. That insight led to all the progress-bars, hour-glasses and spinning hoops we've become so tediously familiar with since. An obscure and controversial branch of psychology called Perceptual Control Theory (PCT) can explain such phenomena by contesting the principal dogma of modern psychology, namely that we control our behaviour by responding directly to external stimuli (an assumption fundamental to both old-style Behaviourism and newer Cognitive versions). PCT says we don't directly control our behaviour at all: we modify our *perceptions* of external stimuli through subconscious negative feedback loops that then indirectly modify our behaviour.

A classic example might be riding a bike: you don't estimate the visual angle of the frame from vertical and then adjust your posture to fit, you minimise your feeling of falling by continuously and unconsciously adjusting posture. Similar mechanisms apply to all kinds of actions and learning processes, and I was easy to convince because from childhood I've always hated skating (roller or ice) and skiing, but I still love riding motorbikes at speed. There's no contradiction: when skating my feet feel out of control, whereas when biking they don't. Just some quirk of my inner ear. However this all has some fairly important consequences for current debates about robotics, and driverless cars in particular.
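For the curious, here's what a PCT-style loop looks like when written down as code: a hypothetical rider who never computes the "correct" posture, but simply keeps nudging it so the felt lean stays near zero. This is a minimal sketch in Python; the gain, the reference value and the one-line physics are invented for illustration, not taken from the PCT literature.

import random

# Invented illustration of a PCT-style negative feedback loop (bike balancing).
REFERENCE_LEAN = 0.0   # the perception the rider wants to maintain: "not falling"
GAIN = 0.4             # how strongly the felt error drives the postural correction

def update_lean(lean, correction, disturbance):
    """Toy 'world': the lean drifts with disturbances, corrections push it back."""
    return lean + disturbance - correction

lean = 5.0  # start a few degrees off vertical
for step in range(20):
    felt_error = lean - REFERENCE_LEAN           # what the rider perceives
    correction = GAIN * felt_error               # the unconscious postural nudge
    disturbance = random.uniform(-0.5, 0.5)      # gusts, bumps, road camber
    lean = update_lean(lean, correction, disturbance)
    print(f"step {step:2d}: lean = {lean:5.2f} degrees")

The point of the sketch is that nothing in the loop represents "posture" as a goal: only the perceived error is being controlled, which is PCT's whole claim.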

The recent spate of celebrity doom-warnings about AI and robot domination is directed against the current assumption that fully autonomous machines and vehicles are both desirable and inevitable. But maybe they're neither? The sad fate of AirAsia QZ8501 suggests both that over-reliance on the autopilot is severely reducing the ability of human crews to respond to emergencies, and that it would be good to simulate the sort of mechanical feedback that pilots of old received through the stick, so they instinctively feel when they're steering into danger. All autonomous machines should be fitted, by law, with a full manual override that permits actual (or, grudgingly, simulated) mechanical control. Boosters of driverless cars will retort that computers react far faster than humans and can be programmed to drive more responsibly, which is quite true until they go wrong, which they will. Perhaps we need at last to augment Isaac Asimov's three laws of robotics:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
4. A robot must, if instructed, drop dead, even where such an order conflicts with the First, Second, and Third Laws.

[Dick Pountain believes that any application vendor whose progress-bar sticks at 99% for more than four seconds should suffer videoed beheading]
