When Computers Go Wrong
Barence writes "PC Pro's Stewart Mitchell has charted the world's ten most calamitous computer cock-ups. They include the Russians' stealing software that resulted in their gas pipeline exploding, the Mars Orbiter that went missing because the programmers got their imperial and metric measurements mixed up, the Soviet early-warning system that confused the sun for a missile and almost triggered World War III, plus the Windows anti-piracy measure that resulted in millions of legitimate customers being branded software thieves."
Wow ! (Score:2, Insightful)
I can't imagine how the well-known and documented story of the U.S. exploding the gas pipeline could be put in such a backward way.
Next in the news: the U.S.'s thoughtful placement of Manhattan skyscrapers dealt a heavy blow to international terrorism, two terrorist planes down.
K.L.M.
Re:Computers do what they are told to (Score:5, Insightful)
Another aspect to this is a common property of most "digital" computations. I've seen it expressed as "Digital errors have no order of magnitude". Another phrasing is "Getting one bit wrong is generally indistinguishable from randomizing all of memory". So when a digital calculation goes wrong, a tiny, inconsequential error is just about as likely as a total meltdown of the entire system.
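The "no order of magnitude" point can be illustrated directly: flipping a single bit of an IEEE-754 double changes the value by an amount that depends entirely on which bit it is. A minimal sketch (hypothetical, not tied to any incident in the story):

```python
import struct

def flip_bit(x: float, bit: int) -> float:
    """Return x with one bit of its IEEE-754 binary64 representation flipped."""
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    (y,) = struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))
    return y

print(flip_bit(1.0, 0))   # low mantissa bit: 1.0000000000000002 -- error of ~2e-16
print(flip_bit(1.0, 62))  # high exponent bit: inf -- a total meltdown
```

Same single-bit error, wildly different consequences; which outcome you get is essentially luck.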
Programmers tend to get familiar with this phenomenon very early in their careers. They write a small chunk of code that does a simple calculation, and the result is orders of magnitude wrong. When they investigate, they discover it was caused by a one-character typo, perhaps an "off by one" error such as using '<' instead of '<=', or vice versa. This quickly leads to what many "normal" people consider the major character failure of software geeks: the insistence that everything be exactly right, no matter what, and the willingness to spend long hours discussing insignificant minutiae as if they mattered. In their work, it's usually such insignificant minutiae that bring the whole house of cards tumbling down.
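The '<' versus '<=' mistake is easy to demonstrate. A minimal sketch (the function names are made up for illustration):

```python
def sum_first_n(values, n):
    """Sum the first n values -- correct version."""
    total = 0
    i = 0
    while i < n:        # runs exactly n times
        total += values[i]
        i += 1
    return total

def sum_first_n_buggy(values, n):
    """One character different -- and silently wrong."""
    total = 0
    i = 0
    while i <= n:       # off by one: also reads values[n]
        total += values[i]
        i += 1
    return total

data = [10, 20, 30, 40]
print(sum_first_n(data, 3))        # 60
print(sum_first_n_buggy(data, 3))  # 100 -- silently includes values[3]
```

No crash, no warning; the buggy version just returns a number that is quietly wrong.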
If you're unwilling to take the difference between a comma and a semicolon seriously, you have no future as a software developer. This is often why something goes badly wrong and we get events like those described in this story.
OTOH, it is interesting that, despite all the software disasters like the metric/imperial-units story, the software world has never insisted that programming languages include units as part of variables' values. It's not like this is anything difficult, and it has been done in a number of languages. But none of the common languages have such a feature. It is a bit bizarre that we can get into long discussions of complex, obscure concepts such as type checking or class inheritance, when our calculations are all susceptible to unchecked unit mismatches (without even a warning from the compiler or interpreter). There's a lot of poor logic when the topic is the relative importance of various sources of bogus calculations.
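Carrying units alongside values really is straightforward, even at the library level. A minimal sketch (the `Quantity` class and the conversion factor here are purely illustrative; real third-party libraries such as `pint` do this far more completely):

```python
class Quantity:
    """A number tagged with a unit; mismatched units raise instead of silently mixing."""
    def __init__(self, value, unit):
        self.value = value
        self.unit = unit

    def __add__(self, other):
        if self.unit != other.unit:
            raise TypeError(f"cannot add {self.unit} to {other.unit}")
        return Quantity(self.value + other.value, self.unit)

    def to(self, unit, factor):
        """Explicit conversion with a stated factor."""
        return Quantity(self.value * factor, unit)

thrust_si = Quantity(4.45, "newton")
thrust_imp = Quantity(1.0, "pound-force")

# thrust_si + thrust_imp  ->  TypeError: the mismatch is caught, not hidden
total = thrust_si + thrust_imp.to("newton", 4.448)
print(round(total.value, 2))  # 8.9
```

A metric/imperial mix-up becomes a loud error at the point of the mistake instead of a wrong number downstream.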
Re:Imperial - Metric (Score:5, Insightful)
From your post it sounds like you've been living somewhere that used to belong to the British Empire. Those people still tend to think of their weight in "stones" and various other oddball measurements, but there are definitely countries where imperial units are barely used.
Here in Sweden the only people who use imperial units seem to be carpenters, who call a 5x10 cm piece of wood a "tvåtumfyra" ("two-inch-four"). But even they don't actually assume its size is 5.08x10.16 cm; it's just that "tvåtumfyra" is faster to say than "fem gånger tio centimeter" ("five by ten centimeters").
As for degrees, most people tend to use degrees in everyday conversation (when it comes up), but the degree is not an "imperial" measurement; it predates most imperial units by centuries. And most people I've met who have taken "advanced" high-school-level or college-level math tend to use radians when actually doing any kind of math related to angles.
Also, tell someone here in Scandinavia that you're 5'10" tall and weigh 176 lbs, and they're likely to either not understand you or go "So, a foot is like, 30 cm, right? And how many inches are there in a foot? I know it's not ten, but like, fifteen or something, right? And a pound's like 0.5 kg? Or was it less? Maybe more? And aren't there two types of pound? Or was that pints?".
Basically, if you tell someone around here that something is "n <imperial unit>", they will have no clue, no matter how "natural" you think it is just because you happened to grow up with it.
Also, as for easy unit conversions, people do use them, just not in the uncommon ways you described. Most people just aren't familiar with some of the less common prefixes, but milli-, centi-, deci-, hecto- and kilo- are all commonly used. (Most people also know that mega and giga mean millions and billions; they just don't have much use for them, so rather than saying 1.5 megameters you say 1500 kilometers.)
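For what it's worth, the conversions in question are simple arithmetic; both factors below are exact by definition (1 inch = 2.54 cm, 1 lb = 0.45359237 kg):

```python
CM_PER_INCH = 2.54        # exact, by international definition of the inch
KG_PER_POUND = 0.45359237 # exact, by definition of the avoirdupois pound

def feet_inches_to_cm(feet, inches):
    """Convert a height like 5'10" to centimeters."""
    return (feet * 12 + inches) * CM_PER_INCH

def pounds_to_kg(pounds):
    return pounds * KG_PER_POUND

print(round(feet_inches_to_cm(5, 10)))  # 178 -- so 5'10" is about 178 cm
print(round(pounds_to_kg(176)))         # 80  -- and 176 lbs is about 80 kg
```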
"Black day for power programmers" Windows virus (Score:5, Insightful)
The second comment I have on this is about missing the LAX communications system software crash, which caused multiple near misses on the tarmac and in the air when air traffic controllers could not communicate with pilots. The cause of the crash was that a UNIX system had been replaced with a Windows-based system which had a known flaw: the OS could not run for more than 39 days, no matter what was running on it. The system and software were still approved and put in place, with a maintenance instruction to reboot the computer every 30 days. In comes a new employee who sees that things are working fine, so he/she doesn't reboot the computer, and 9 days later the system crashes. The backup does the same, both are unable to recover, and it takes hours to get the system running again. That should have been in the list IMO.
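Widely published accounts of that 2004 LAX radio outage attribute the fixed-uptime limit to a 32-bit millisecond countdown timer, which runs out after 2^32 ms, roughly 49.7 days. A sketch of how such a counter silently wraps (the function is hypothetical):

```python
# A counter stored in 32 unsigned bits wraps after 2**32 milliseconds.
MS_32BIT = 2**32
DAY_MS = 24 * 60 * 60 * 1000

def uptime_ms(raw_ticks):
    """Simulate a millisecond tick counter stored in 32 bits: it silently wraps."""
    return raw_ticks % MS_32BIT

print(MS_32BIT / DAY_MS)       # ~49.71 days until wraparound
print(uptime_ms(50 * DAY_MS))  # after 50 days the counter has wrapped to a small value
```

Nothing crashes at the wraparound itself; any code that trusts the counter (timeouts, watchdogs, schedulers) just starts seeing time run backwards.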
There was also the CSX Railway situation, when many of its signals went offline because they were run by Windows and the Windows computers got a virus.
It would be nice to see a more complete and more accurate list of these kinds of computer software failures.
LoB
Re:Ariane 5 missing on the list (Score:5, Insightful)
Actually, those kinds of conversions should be banned from any managed programming environment. It's fine that you sometimes need to work with bytes, shorts etc., or heck, maybe even machine words, but let's only do that when absolutely required, shall we?
It amazes me that many programming languages still don't let you define acceptable ranges, still accept null pointers, and still use wraparound two's-complement arithmetic, etc. It's just asking for errors like these. Sure, these features have their uses in lower-level code, but I would certainly like something better for APIs and general business logic. They are just another pointer arithmetic or GOTO waiting to be erased from mainstream programming (and in many newer languages, they indeed are).
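Ariane 5's famous failure was exactly this kind of conversion: a 64-bit floating-point value stuffed into a 16-bit signed integer that couldn't hold it. A sketch of unchecked versus checked narrowing (the function names are illustrative, and the unchecked version merely simulates two's-complement wraparound):

```python
import struct

def to_int16_unchecked(x: float) -> int:
    """Simulate an unchecked float -> 16-bit signed conversion: it silently wraps."""
    return struct.unpack("<h", struct.pack("<H", int(x) & 0xFFFF))[0]

def to_int16_checked(x: float) -> int:
    """The alternative: refuse values that don't fit, loudly."""
    if not -32768 <= x <= 32767:
        raise OverflowError(f"{x} does not fit in 16 bits")
    return int(x)

print(to_int16_unchecked(30000.0))  # 30000  -- fine
print(to_int16_unchecked(40000.0))  # -25536 -- silent wraparound garbage
```

The unchecked version hands downstream code a plausible-looking but meaningless number; the checked version turns the same mistake into an immediate, diagnosable error.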
Re:Not always (Score:1, Insightful)
No, you obviously didn't, as you show again. The point is that it's as much a truism as saying that the failure of anything man-made can ultimately be explained as the failure of a human. That's no reason not to call it a computer error, just as you wouldn't replace the term "human error" with "physics error" just because human behaviour is ultimately the behaviour of a physical system following the laws of physics.
Re:The best parts of the article were... (Score:5, Insightful)