Dick Pountain/Idealog 253/06 August 2015 14:58
Perhaps you were as disturbed as I was by that report, back in May, that a US hacker travelling on a Boeing airliner claimed to have penetrated its flight control system via the entertainment system's Wi-Fi and made the plane climb and turn from his laptop. Aviation experts have since rubbished his claim (but then Mandy Rice-Davies would definitely apply). It did however concentrate my mind, in a most unwelcome fashion, on the fact that all the planes we fly in nowadays employ fly-by-wire under software control, and that my confidence in software engineers falls some way short of my confidence in mechanical engineers.
This nagging anxiety was rubbed in further by the fatal crash of a military Airbus A400M, also back in May, after its engine control software shut down three of its four engines just after take-off. It appears that accidental erasure of some config files during installation had deprived the software of certain torque calibration parameters that it needed to monitor engine power. These (literally) vital numbers were being loaded from an external file, so in other words the safety of this aircraft was being governed by a programming practice on a par with installing Windows updates. Nice.
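Just to illustrate the point - the real engine-control code is of course not public, and every name here is my own invention - a more defensive loader would refuse to run at all when a vital calibration value is absent or implausible, something like this in C:

    #include <stdio.h>
    #include <stdlib.h>

    /* Purely illustrative: all names and limits here are hypothetical. */
    #define TORQUE_CAL_MIN 0.1
    #define TORQUE_CAL_MAX 10.0

    /* Read one calibration value; return 0 on success, -1 on any failure. */
    static int read_torque_calibration(const char *path, double *out)
    {
        FILE *f = fopen(path, "r");
        if (f == NULL)
            return -1;                               /* file missing */
        int ok = (fscanf(f, "%lf", out) == 1);
        fclose(f);
        if (!ok || *out < TORQUE_CAL_MIN || *out > TORQUE_CAL_MAX)
            return -1;                               /* absent or implausible value */
        return 0;
    }

    int main(void)
    {
        double torque_cal;
        if (read_torque_calibration("torque.cfg", &torque_cal) != 0) {
            fputs("torque calibration missing - refusing to start\n", stderr);
            return EXIT_FAILURE;                     /* fail loudly, on the ground */
        }
        printf("torque calibration = %f\n", torque_cal);
        return EXIT_SUCCESS;
    }

The point isn't this particular code, it's that a missing parameter ought to be caught before take-off rather than discovered by the engines in flight.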
To me safety-critical software is about more than just fear for my own ass: it's been of concern for many years. I started out as a Forth programmer, at a time when that language was widely used in embedded control systems, and attended conferences on the subject of safety in both software and hardware architecture. Then I graduated, via Pascal and Modula-2, to becoming an admirer of Niklaus Wirth's ideas on good programming practice, and finally on to object-oriented programming as the way to wrest control over the structure of really large programs. Object-orientation is now of course the rule, supported by every language from JavaScript to Scratch, but I sometimes wonder whether it still means anything very much, or whether it has become a mere style to which lip-service is paid. Loading critical data from unreliable external files violates the principle of encapsulation in more ways than I can count.
I did a bit of Googling and found lots of papers about safety-critical architectures and redundant hardware systems. Redundancy is a key safety concept: you build three separate computers, with CPUs from different manufacturers running different software written by different teams - which have been demonstrated to produce the same outputs from the same inputs - then go with the majority verdict, the idea being that the *same* software or hardware bug is very, very unlikely to arise in all three. Interestingly enough, the latest of these papers seemed to be dated around 2008.
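As a rough sketch of the voting idea - not a description of any real avionics system, and with the three channel functions standing in for the independently built computers - a two-out-of-three voter in C looks something like this:

    #include <stdio.h>

    /* Hypothetical stand-ins for three independently developed channels
       computing the same control output from the same sensor inputs.   */
    static int channel_a(int input) { return input * 2; }
    static int channel_b(int input) { return input * 2; }
    static int channel_c(int input) { return input * 2; }

    /* Two-out-of-three vote: accept any value on which at least two
       channels agree; report failure if all three disagree.           */
    static int vote(int a, int b, int c, int *out)
    {
        if (a == b || a == c) { *out = a; return 0; }
        if (b == c)           { *out = b; return 0; }
        return -1;   /* no majority: fall back to a defined safe state */
    }

    int main(void)
    {
        int result;
        if (vote(channel_a(21), channel_b(21), channel_c(21), &result) == 0)
            printf("majority output: %d\n", result);
        else
            puts("channels disagree - entering safe mode");
        return 0;
    }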
Surely it can't be that, as in so many other spheres (like, say, banking), the optimists have taken over the farm and started trusting too much? Then I stumbled across NASA's 10 rules for developing safety-critical code. Now NASA tends to work with computer hardware that's several decades behind the state of the art but - give or take a Hubble or two - it's had fairly few disasters that were down to software. Here are its rules, severely abbreviated by me:
1: All code to have simple control flow constructs: no GOTO, direct or indirect recursion.
2: All loops to have a fixed upper bound, which can be statically proved never to be exceeded.
3: No dynamic memory allocation after initialization.
4: No function longer than 60 lines of code.
5: The assertion density of the code to average a minimum of two assertions per function.
6: Data objects must be declared at the smallest possible level of scope.
7: Calling functions must check non-void function return values and the validity of all parameters.
8: Preprocessor to be restricted to headers and simple macros. Token pasting, variable argument lists (ellipses) and recursive macro calls all forbidden.
9: Pointers to be restricted to one level of dereferencing, which mustn't be hidden inside macros or typedefs. Function pointers forbidden.
10: All code must be compiled with all warnings enabled at their most pedantic setting, and must compile at these settings without any warnings. All code to be checked daily with at least one state-of-the-art static source code analyser - preferably several - and pass with zero warnings.
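To give a flavour of how a few of these read in practice, here's a small fragment of my own devising (the sensor-reading function is invented, not NASA's) that follows rules 2, 3 and 7: a fixed, statically known loop bound, no memory allocated after start-up, and every parameter and return value checked:

    #include <stdio.h>

    #define MAX_SENSORS 8   /* rule 2: fixed, statically provable upper bound */

    static int sensor_reading[MAX_SENSORS];   /* rule 3: no malloc after init */

    /* Hypothetical sensor read; returns 0 on success, -1 on failure. */
    static int read_sensor(int index, int *value)
    {
        if (index < 0 || index >= MAX_SENSORS || value == NULL)
            return -1;                               /* rule 7: validate parameters */
        *value = 42;                                 /* stand-in for real hardware access */
        return 0;
    }

    int main(void)
    {
        for (int i = 0; i < MAX_SENSORS; i++) {          /* bounded loop */
            if (read_sensor(i, &sensor_reading[i]) != 0) /* rule 7: check the return */
                return 1;                                /* fail safe, don't plough on */
        }
        printf("first reading: %d\n", sensor_reading[0]);
        return 0;
    }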
These rules are, realistically enough, aimed at plain old C programmers rather than at trendy new languages, but they impose a degree of rigour comparable to that of most object-oriented languages. Their recommended heavy use of assertions is interesting. Assertions are supported directly in Eiffel, Ada and some other languages, and can be added to C via the header "assert.h". They specify the desired value or range of some variable at some point in program execution and raise a runtime error when the condition isn't met: an example might be "assert(TorqueCalibrationParameter > 0)".
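A minimal sketch of that in plain C, reusing my hypothetical parameter name from the A400M story:

    #include <assert.h>
    #include <stdio.h>

    int main(void)
    {
        /* Imagine this was meant to be loaded from a config file but wasn't. */
        double TorqueCalibrationParameter = 0.0;

        /* With this deliberately bogus value the assertion fails, printing a
           diagnostic and aborting; define NDEBUG and asserts compile away.   */
        assert(TorqueCalibrationParameter > 0);

        printf("calibration ok: %f\n", TorqueCalibrationParameter);
        return 0;
    }

In real flight code a failed assertion would have to trigger a defined recovery action rather than a simple abort, but the principle - state your assumptions and check them at runtime - is the same.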