
Richard Feynman, the Challenger, and Engineering

An anonymous reader writes "When Richard Feynman investigated the Challenger disaster as a member of the Rogers Commission, he issued a scathing report containing brilliant, insightful commentary on the nature of engineering. This short essay relates Feynman's commentary to modern software development."
This discussion has been archived. No new comments can be posted.


  • by eldavojohn ( 898314 ) * <eldavojohn@gma[ ]com ['il.' in gap]> on Wednesday February 20, 2008 @12:30PM (#22489314) Journal
    I'm a software developer. I would like to think of myself as an engineer but to me that's a higher title that belongs to people who actually engineer original ideas.

    The problem with the shuttle disaster (both of them, really) is external pressures that are not in any way scientific. The pressure from your manager at Morton Thiokol to perform better, faster and cheaper. The pressure from the government to beat those damned Russkies into space at all costs.

    So this is really a case of engineering ethics, when do you push back? As a software developer, I never push back. Me: "There's a bug that happens once every 1,000 uses of this web survey but it would take me a week to pin it down and fix it." My Boss: "Screw it--the user will blame that on the intarweb, just keep moving forward." But could I consciously say the same thing about a shuttle with people's lives at stake? No, I could not.

    So when an engineer at Morton Thiokol said that they hadn't tested the O-Ring at that weather temperature that fateful day and that information was either not relayed or lost all the way up to the people at NASA who were about to launch--it wasn't a failure of engineering, it was a failure of ethics. External forces had mutated engineering into a liability, not an asset.

    And there's a whole slew of them [wikipedia.org] I studied in college:

    * Space Shuttle Columbia disaster (2003)
    * Space Shuttle Challenger disaster (1986)
    * Chernobyl disaster (1986)
    * Bhopal disaster (1984)
    * Kansas City Hyatt Regency walkway collapse (1981)
    * Love Canal (1980), Lois Gibbs
    * Three Mile Island accident (1979)
    * Citigroup Center (1978), William LeMessurier
    * Ford Pinto safety problems (1970s)
    * Minamata disease (1908-1973)
    * Chevrolet Corvair safety problems (1960s), Ralph Nader, and Unsafe at Any Speed
    * Boston molasses disaster (1919)
    * Quebec Bridge collapse (1907), Theodore Cooper
    * Johnstown Flood (1889), South Fork Fishing and Hunting Club
    * Tay Bridge Disaster (1879), Thomas Bouch, William Henry Barlow, and William Yolland
    * Ashtabula River Railroad Disaster (1876), Amasa Stone
    So I agree with Feynman's comments in relation to engineering and the further comments on software development. But I don't find them to be a fault in the nature of engineering, just a fault in our ethics. What do capitalism and competitiveness drive us to do? Cut corners, often.
  • by Protonk ( 599901 ) on Wednesday February 20, 2008 @12:44PM (#22489538) Homepage
    To be fair, the Challenger disaster actually preceded NASA's slogan and procurement policy of "faster, better, cheaper" by a bit. More to the point, Feynman's article should be a cautionary tale to ANYONE in an engineering field. It isn't a matter of one field being subject to unscientific pressures and another field being immune. No technology or industry is immune from the pressures and problems that caused the Challenger disaster. Anyone who claims to be so well adapted to safety concerns that they needn't spend lots of time and effort on them is foolish. The nuclear industry still has to practice strong QC on parts, procedures and maintenance, and CONTINUE that practice. Same with commercial aviation, acute medical care, etc. Constant vigilance is rewarded only with another uneventful day. That is the fundamental problem. Vigilance is expensive and time consuming. These are not pressures from the profit motive. They apply to government as well as civilian ventures.
  • by Vicious Penguin ( 168888 ) on Wednesday February 20, 2008 @12:48PM (#22489604)
    > What do capitalism and competitiveness drive us to do? Cut corners, often.

    Maybe, but remember what your own example shows -> What is the cost/benefit of fixing/preventing an error? Is a week of debug time worth missing your target ship date? Maybe, maybe not - depends on the error.

    A blanket indictment of capitalism is quite unfair. You would still have the same cost/benefit analysis regardless of economic system you toiled under.

    It is not possible to engineer against all eventualities; trying to do so will usually keep you from ever getting off the ground.
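    The cost/benefit question raised above can be made concrete with an expected-value sketch. Every number here (usage volume, cost per failure, hourly rate) is a made-up illustration, not data from the thread:

    ```python
    # Back-of-envelope cost/benefit of fixing a bug vs. shipping with it.
    # All figures are hypothetical, chosen only to illustrate the tradeoff.

    def expected_cost_of_shipping(failure_rate, uses, cost_per_failure):
        """Expected cost of leaving the bug in over the product's lifetime."""
        return failure_rate * uses * cost_per_failure

    # The web-survey bug from the earlier comment: fails once per 1,000 uses.
    failure_rate = 1 / 1000
    uses = 50_000            # assumed lifetime usage
    cost_per_failure = 2.0   # assumed cost per failure (support time, annoyance)

    ship_cost = expected_cost_of_shipping(failure_rate, uses, cost_per_failure)
    fix_cost = 40 * 75.0     # one developer-week at an assumed $75/hour

    print(f"expected cost of shipping the bug: ${ship_cost:,.0f}")
    print(f"cost of fixing it now:             ${fix_cost:,.0f}")
    print("fix it" if fix_cost < ship_cost else "ship it")
    ```

    With these numbers the boss's call to ship is defensible; the whole analysis flips as soon as cost_per_failure includes lives, which is the O-ring case.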
  • Hm. (Score:5, Insightful)

    by gardyloo ( 512791 ) on Wednesday February 20, 2008 @12:51PM (#22489630)
    The blog post makes a nice contribution by linking to Feynman's original thoughts (for example, here: http://www.ranum.com/security/computer_security/editorials/dumb/feynman.html [ranum.com] ), ones I haven't read for a long time (and was happy to be reminded of). However, the author makes the mistake of thinking that the original thoughts need to be interpreted and summarized for the reader. Feynman's words by themselves are simple to understand, are concise, and contain just the tone for which geeks go gaga. Anyone interested in the subject will be able to make his or her own judgements about the engineering and politics involved in the Shuttle development, engineering in general, and the extensions to software development.
  • Re:Hm. (Score:3, Insightful)

    by Protonk ( 599901 ) on Wednesday February 20, 2008 @12:55PM (#22489706) Homepage
    This is a very good point. Feynman has the unique quality of startling intelligence, curiosity, and straightforwardness. Some authors need to be summarized. Feynman just needs to be trotted out every generation or so.
  • by Protonk ( 599901 ) on Wednesday February 20, 2008 @01:01PM (#22489800) Homepage
    This is true to an extent, but safety concerns can and should be engineered for. You are absolutely right that there exists no direct parallel between software debugging for some non-critical application and meeting safety margins for a critical product. However, some software IS critical. Flight software (this portion of Feynman's essay about NASA's flight software is amazing), software for hospital applications (pharmacy, PCAs, microsurgery), ABS/suspension control software. Those are applications with VERY critical outcomes. Safety concerns need to be built into the process.

    But I do agree that tradeoffs occur under any system. Under capitalism those tradeoffs are at least priced, so the information they carry can inform better decisions; in a socialist system that economic information isn't available to guide us.

  • by Anonymous Coward on Wednesday February 20, 2008 @01:10PM (#22489962)

    But could I consciously say the same thing about a shuttle with people's lives at stake? No, I could not.


    Absolutely.

    Progress requires risk. The astronauts are aware of their risk; it's not a big deal.

    Yes, it would be nice to take 500 years to flawlessly engineer a tool, but in reality, you don't have that long. Engineering is sometimes about making educated guesses in order to build something in a reasonable period of time, and you learn something from your errors. Obviously everyday commercial tools require more safety margin than experimental ones - you expect more risk going to the moon than going to 7-11. (Oddly, there's probably more risk per mile going to 7-11, so the space engineers are doing a pretty decent job.)

    The problem with the shuttles wasn't poor engineering - it was that when someone spotted an issue, it was squelched. This is a social/management problem, not an engineering problem.
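    The risk-per-mile aside above can be sanity-checked with a rough back-of-envelope calculation. Every figure below is an assumption chosen for illustration (loss rate roughly matching the Shuttle record at the time of Challenger, a week-long mission at orbital speed, and a ballpark US road-fatality rate), not a citation:

    ```python
    # Rough risk-per-mile comparison: Shuttle flight vs. everyday driving.
    # All numbers are illustrative assumptions, not authoritative statistics.

    # Shuttle: assume 1 loss in 25 flights and a week-long mission
    # flown at roughly orbital speed (~17,500 mph).
    shuttle_loss_per_mission = 1 / 25
    miles_per_mission = 17_500 * 24 * 7   # mph * hours in a week
    shuttle_risk_per_mile = shuttle_loss_per_mission / miles_per_mission

    # Driving: assume ~1.5 fatalities per 100 million vehicle-miles.
    driving_risk_per_mile = 1.5 / 100_000_000

    print(f"shuttle: {shuttle_risk_per_mile:.2e} losses per mile")
    print(f"driving: {driving_risk_per_mile:.2e} fatalities per mile")
    print("driving is riskier per mile"
          if driving_risk_per_mile > shuttle_risk_per_mile
          else "shuttle is riskier per mile")
    ```

    Under these assumptions the two rates land within a factor of two of each other, with driving slightly worse per mile - which is the commenter's point, though per *trip* the shuttle is of course vastly more dangerous.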
  • by Martin Spamer ( 244245 ) on Wednesday February 20, 2008 @01:18PM (#22490080) Homepage Journal
    The biggest problem is that most software developers are NOT chartered professional software engineers, so they have no personal, professional or legal responsibility for their work. That is why IT is full of cowboys and trust is nearly nonexistent. Software engineering must become a chartered-only profession, so that people who are not chartered are not allowed to practice.

    To qualify as a Professional Engineer we should place good practice above short-term gains. Professional Engineers should be truthful and objective and have no tolerance for deception or corruption. Professional Engineers only work in areas where they are competent. Professional Engineers build their reputation on merit, their skills through continual learning, and the skills of their charges through ongoing mentoring.

    We wouldn't have to put up with the shoddy work of cowboys, because they wouldn't be allowed to practice. We wouldn't have to put up with orders that counteract professional ethics or good practice, because legal responsibility trumps commercial pressures. The profession wouldn't be undermined by fast-to-market but poor-quality work. We could place trust in third-party tools, software & services, and we would not have to put up with EULAs that disavowed responsibility for damage.
  • by somersault ( 912633 ) on Wednesday February 20, 2008 @01:31PM (#22490280) Homepage Journal

    I'm a software developer. I would like to think of myself as an engineer but to me that's a higher title that belongs to people who actually engineer original ideas.
    Well I know I'm missing the point of your post with this, but a quick google comes up with this description of an engineer:

    a person who uses scientific knowledge to solve practical problems
    I think your higher title should be an 'inventor'. Engineers are the guys that generally plod away using well tested mechanical or other scientific knowledge to get everyday jobs done (just like a software engineer really?). I work as IT support/coder for a bunch of engineers here and while they sometimes may be using old ideas in new ways, most of their work is just that plodding away using tried and tested methods, with occasional refinements in stuff like blade geometry for example (we do a lot of work with turbines). I think there is actually a lot more invention done in the field of software development than in engineering. Having said that, I studied 'Computer Science' and not 'Software Engineering', so apparently I'm a scientist :p
  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Wednesday February 20, 2008 @01:45PM (#22490540)
    Comment removed based on user account deletion
  • by Anonymous Coward on Wednesday February 20, 2008 @01:50PM (#22490642)
    Do you stamp/sign every line of code you let go? Are you personally liable for anything you approve with your sign/stamp? If not, don't use the term professional engineer. If a "software engineer" screws up and is fired, they go somewhere else to work. If an engineer screws up and is fired, they most likely won't be able to find somewhere else to work in their field.
  • Blaming the shuttle disaster on capitalism is erroneous. I do not necessarily disagree with your assessment in general, but capitalism was not at fault in that particular instance. What was at fault was bureaucrats trying to look good to their superiors and present a positive public image at the cost of real engineering.

    I would say that, in general, that is the meta-problem, not capitalism. In its current form in the US, capitalism has caused the existence of many large entities that use hierarchical systems of command and control. These hierarchical systems frequently make sub-optimal decisions because individual actors within the system act for their own benefit but against the benefit of the larger system they are a part of. Particularly egregious examples of this can be found, and they tend to be highlighted as aberrations, but they aren't. They are merely extrema of a problem that is widespread.

    Bureaucracy in general serves to insulate actors from responsibility for the results of their actions. As I recall we didn't see any of the middle management of NASA held accountable for the disaster they caused by attempting to look good for their superiors and the public. And this failure of accountability is endemic to the kinds of hierarchical systems you see in most bureaucracies.
