Computers Key To Air France Crash (911 comments)

Michael_Curator writes "It's no secret that commercial airplanes are heavily computerized, but as the mystery of Air France Flight 447 unfolds, we need to come to grips with the fact that in many cases, airline pilots' hands are tied when it comes to responding effectively to an emergency situation. Boeing planes allow pilots to take over from computers during emergency situations; Airbus planes do not. It's not a design flaw — it's a philosophical divide. It's essentially a question of what do you trust most: a human being's ingenuity or a computer's infinitely faster access and reaction to information. It's not surprising that an American company errs on the side of individual freedom while a European company is more inclined to favor an approach that relies on systems. As passengers, we should have the right to ask whether we're putting our lives in the hands of a computer rather than the battle-tested pilot sitting up front, and we should have the right to deplane if we don't like the answer."
This discussion has been archived. No new comments can be posted.

  • by toby ( 759 ) * on Monday June 08, 2009 @10:06PM (#28260143) Homepage Journal

    It's not surprising that an American company errs on the side of individual freedom while a European company is more inclined to favor an approach that relies on systems.

    How fond Americans are of reductionist dualities that are unhelpful, misleading and frequently downright dangerous: American pilot with The Right Stuff in an American plane would have saved everyone; dangerous European plane and computer killed hundreds. Oversimplified sniping, or childish fantasy?

    If I want real facts on flying, instead of wild-assed pseudo-political trollery, I'll go read Peter Ladkin or Patrick Smith [salon.com]: "The gist of the accident appears pretty clear: Air France Flight 447 was victimized by a terrible storm."

  • by Ethanol-fueled ( 1125189 ) on Monday June 08, 2009 @10:08PM (#28260167) Homepage Journal
    When I read TFA I had a knee-jerk reaction to hate on Airbus, as I believe that everything should have a manual override.

    Then I thought of Terrain-following radar [wikipedia.org] and realized that things are not always that simple. Quote:

    Under these conditions terrain-following radar is a necessity, since a human pilot cannot react quickly enough to changing terrain heights, and is much more likely to cause a crash than an automated system in the same circumstances.

  • What a dumb phrase. Do you only want former Air Force pilots who've actually seen combat flying commercial planes? How exactly is that going to keep you up in the air in a civilian airliner experiencing an electronic or mechanical malfunction?

    And if what you really mean is experienced pilots, what about a pilot who's been flying for years, has never had an emergency situation, and then makes a judgement error in a critical situation? Are you then going to call for the iron calm of a computer rather than a fallible human pilot?

    No, the answer is statistics. What's safer and more reliable in the long run? How many crashes have we had due to computer error rather than human error given x hours flown by each?
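
    A minimal sketch of that per-hour comparison (the crash counts and fleet hours below are invented placeholders, not real accident statistics):

        # Crashes per flight hour, the statistic the paragraph above asks for.
        # The figures are made-up placeholders, NOT real accident data.
        def crash_rate(crashes, flight_hours):
            """Crashes per million flight hours."""
            return crashes / flight_hours * 1_000_000

        human_error = crash_rate(crashes=40, flight_hours=500_000_000)
        computer_error = crash_rate(crashes=5, flight_hours=500_000_000)

        print(f"human error:    {human_error:.3f} per million hours")
        print(f"computer error: {computer_error:.3f} per million hours")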

    The very wording of this ridiculous post presupposes an answer. And in the future it is very likely the wrong answer. Sure computers will make errors. But in general people will make them more often, and computers are just going to get better.

    And casting this as some kind of bizarre collectivist vs. individualist ideology debate is ridiculous as well. What does toeing some ideological line have to do with safely getting to your destination in an airplane?

    This Slashdot article is full of simplistic drivel designed to provoke ideologically based knee-jerk responses instead of any kind of reasoned debate.

    The linked-to text is much, much better, even though offering people a choice is problematic given how the whole non-refundable ticket system and airline logistics systems currently work, not to mention that making a choice at the gate when you get on the plane will throw off your schedule.

  • by Adrian Lopez ( 2615 ) on Monday June 08, 2009 @10:13PM (#28260213) Homepage

    It's not surprising that an American company errs on the side of individual freedom while a European company is more inclined to favor an approach that relies on systems. As passengers, we should have the right to ask whether we're putting our lives in the hands of a computer rather than the battle-tested pilot sitting up front, and we should have the right to deplane if we don't like the answer.

    Lemme' guess... you're an American.

  • Pick your poison (Score:5, Insightful)

    by Anonymous Coward on Monday June 08, 2009 @10:15PM (#28260243)

    The Continental flight that crashed in Buffalo on the 12th of February crashed because the inexperienced pilot pulled up when the plane stalled. A computer controlled system might have nosed down to get airspeed and saved 50 lives. Of course I doubt a computer controlled system would be able to make a flawless landing in the Hudson.
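
    A toy sketch of the kind of protection being described, with invented thresholds rather than any real control law:

        # Toy angle-of-attack (alpha) protection: near the stall, the computer fades out
        # the pilot's pull-up command and finally demands nose-down. Numbers are invented.
        ALPHA_PROT = 12.0   # degrees: protection starts here
        ALPHA_MAX = 15.0    # degrees: never allow alpha beyond this

        def protected_pitch_command(pilot_pitch_cmd, current_alpha):
            if current_alpha >= ALPHA_MAX:
                return -5.0                       # force nose-down, ignore the pull-up
            if current_alpha >= ALPHA_PROT:
                # fade the pilot's pull-up authority to zero as alpha approaches the limit
                authority = (ALPHA_MAX - current_alpha) / (ALPHA_MAX - ALPHA_PROT)
                return min(pilot_pitch_cmd, 0.0) + authority * max(pilot_pitch_cmd, 0.0)
            return pilot_pitch_cmd                # normal flight: command passes through

        print(protected_pitch_command(8.0, 14.0))   # pull-up mostly suppressed
        print(protected_pitch_command(8.0, 16.0))   # forced nose-down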

  • by W.Mandamus ( 536033 ) on Monday June 08, 2009 @10:15PM (#28260247)
    Well it's quite simple really. Boeing doesn't expect anybody to be flying one of their big jets without years of experience. If you have a mechanical failure, do you really want a machine that may be getting fed bad data trying to figure out what to do next? (Also, it doesn't help Airbus that they seem to be having many more crashes than Boeing over the last five years.)
  • Experience (Score:3, Insightful)

    by Kell Bengal ( 711123 ) on Monday June 08, 2009 @10:17PM (#28260283)
    I trust an engineer's years of study and careful planning over a pilot's hastily considered last-second decisions. It's not that I don't trust the pilots, it's just that an engineer has had more time to put together a solution and implement it in the computer. They know the limits of their craft intimately and I trust them to know how to keep them in the air.
  • by HangingChad ( 677530 ) on Monday June 08, 2009 @10:19PM (#28260307) Homepage

    It's not surprising that an American company errs on the side of individual freedom...

    Eh? You mean the freedom to work under-paid pilots 14-16 hours a day like Colgan Air? And the FAA let them slide because Colgan had friends in that office? Some of their pilots could make more flipping burgers. Like the pair that were tired, under-paid and not paying attention who turned Continental flight 3407 into a giant lawn dart.

    This isn't political. I don't care if it's human, machine or a trained goat. Whatever gets the aircraft down in one piece is what I want managing the control surfaces.

  • So what? (Score:2, Insightful)

    by Anonymous Coward on Monday June 08, 2009 @10:19PM (#28260317)

    Boeing's [wikipedia.org] manual mode causes 100% fatality when the pitot tubes are blocked, too.

    What a shitty article, considering that it took me 30 seconds to research and it wasn't mentioned.

  • Re:Experience (Score:1, Insightful)

    by Anonymous Coward on Monday June 08, 2009 @10:23PM (#28260375)

    You cannot account for all the variables that might exist in a situation. No matter how many scenarios you can dream up, there will be that one situation you haven't thought of. That's when you want an experienced pilot that can quickly adapt and react.

  • by sodul ( 833177 ) on Monday June 08, 2009 @10:25PM (#28260393) Homepage

    And remember that the recent plane crash in NY was caused by human error: the autopilot responded to the ice buildup by diving to maintain speed, the pilot 'corrected' what he thought was an error, and the plane fell to the ground like a stone.

    The truth is, modern computers can be much, much better pilots than 95% of the pilots out there. I don't think the autopilot would have even attempted the landing in the Hudson river; here the pilot was clearly one of the top pilots, the kind I want on every single flight I take. Also I'm pretty sure that good pilot was not overworked and was well rested before his flight. Whatever good training you have, humans will always make mistakes, and they get worse with fatigue. The computer does not get tired, or emotional.

    So with an average pilot, I think the autopilot is much more trustworthy. In an exceptional emergency, a truly outstanding pilot might pull it off where the computer will not. I'm not sure the data (if it exists) favor the humans though.

  • by sounddude ( 60624 ) on Monday June 08, 2009 @10:25PM (#28260405) Homepage

    ummmm Flight 1549 was an Airbus 320.

  • I would argue that instead of it being one or the other, it would be better if the inputs could be merged. Humans are generally better at ingenuity (unless the heuristics are really good) and computers are generally better at speed of reaction (unless there's a deadlock between threads), but there's no universal rule.

    What's really needed is a way for the pilot and the computer to cooperatively function, such that the failure of either at a task is not a catastrophic failure that could destroy the aircraft.

    (I can just hear Boeing and Airbus chiming in: "Yeah, yeah, socialists and their cooperatives! Gimmie a good, old-fashioned dictatorship!")

  • by Falconhell ( 1289630 ) on Monday June 08, 2009 @10:30PM (#28260459) Journal

    I doubt a computer controlled system would be able to make a flawless landing in the Hudson.

    Quite so, but your average pilot couldn't either.

    Sully was a very experienced glider pilot (including a CFI instructor rating), as was the captain of the Gimli Glider.

    When the engines stop, just hope the pilot is experienced in flying without power.

  • by Manip ( 656104 ) on Monday June 08, 2009 @10:31PM (#28260471)

    American aircraft don't always have manual overrides, and EU (UK, German, French) aircraft often do have them. In fact Airbus is its own company and as such follows its own principles as far as design goes. Right now they're designing their aircraft to be as simple as possible and want to eliminate a lot of the human element.

    I don't agree with a lot of the decisions Airbus has made over the years:
      - Low strength materials in key areas
      - No warning alarm when auto-pilot is disengaged
      - Less manual control in case of system failure

    But then again Boeing has made some HUGE errors and has updated their 747 thousands of times to fix design flaws. People forget that not only is Boeing an older company but a lot of their aircraft designs are up to 40 years old and have been evolving constantly.

    American Vs. EU is complete bs but whatever helps Americans sleep at night.

  • by fermion ( 181285 ) on Monday June 08, 2009 @10:31PM (#28260473) Homepage Journal
    It seems to me that someone is trying to push their dogma through fear. I am not saying the computer did not cause the plane to crash, or that the pilot might not have been able to do something to stop it if there was an option 'to have full control of the plane', whatever that means. What I am saying is that we really do not know all the circumstances, and it might be a bit early to start pointing fingers.

    First, I would say it is naive to think that computers are somehow at fault, and that they do not have a net benefit. The main reason to use digital solid state computers is that they often reduce discrete component count, which usually increases reliability. In a system that is supposed to have nearly 100% reliability, like an aircraft, component count must be kept to a minimum. That has traditionally meant fly by wire, and the more fly by wire, the better. My understanding is that Airbus reduces complexity significantly by assuming a complete fly by wire profile. One could, for instance, install backup hydraulics, which I assume is not done, but this would reduce reliability.

    There is no simple solution. Things do not increase security and reliability simply because we feel better. For instance, many people feel safer in big trucks, but many studies have shown that one is safer in a full size sedan. Likewise, one thing that makes a large truck, especially an SUV, safe is the electronic stability control, which can countermand any driver instruction. Large planes are already computer controlled. Long haul flying of large planes is in no way a trivial task. I agree with the blog mentioned in the article that people who have no experience have no basis to make any useful comment.

  • by adzima ( 1315619 ) on Monday June 08, 2009 @10:31PM (#28260477)
    "But it's time the airline industry stopped treating passengers like children and began informing us of what airplanes we're flying on and how they're flown--and allowing us to decide how we're taking our lives in our hands." Really? These are complex systems with multiple levels of functionality and are difficult to understand. From the article, the author clearly lacks knowledge on the subject. Furthermore, I don't think the average person really wants to know how the plane works anymore than they want to know how CAN communication makes the EFI system in their car work by integrating ECU communication. As a consumer, I just want the car to start when I turn the key without it blowing up in my face.
  • by Behrooz ( 302401 ) on Monday June 08, 2009 @10:32PM (#28260493)

    If the Gimli Glider or Flight 1549 had been on an Airbus, there would have been a lot of dead people. When something goes wrong, Rule 1 is FLY THE FUCKING PLANE. Well, if the computers fail on an Airbus, good luck flying it!

    Flight 1549 was an Airbus A320 [wikipedia.org]. Don't fall for the FUD, any large passenger airliner is going to be designed to be as survivable as possible in the event of power loss. This whole article is just another example of irrational hysteria.

  • by abigor ( 540274 ) on Monday June 08, 2009 @10:33PM (#28260501)

    Bloggers need to say stupid shit like that in order to drive traffic via provocation. kdawson, you should be ashamed of yourself for posting this tripe.

  • by Anonymous Coward on Monday June 08, 2009 @10:36PM (#28260551)

    The problem is, the computer will not push the plane past its design parameters, but anyone who understands basic engineering knows that the design parameters always err on the side of safety. The issue comes when you are in a scenario where your choices are:
    1. Push the plane past its limits and hope it survives the process
    2. Crash for sure

  • Re:Experience (Score:3, Insightful)

    by d474 ( 695126 ) on Monday June 08, 2009 @10:41PM (#28260595)

    I trust an engineer's years of study and careful planning over a pilot's hastily considered last-second decisions. It's not that I don't trust the pilots, it's just that an engineer has had more time to put together a solution and implement it in the computer. They know the limits of their craft intimately and I trust them to know how to keep them in the air.

    That's all well and good, but engineers aren't gods. They can't anticipate everything, nor can they design systems that are foolproof (Air France 447, case in point). And when their systems fail, the pilot should have the option of taking over control of the aircraft. To not provide that to the pilot is nothing short of hubris on the engineer's part, and people died because of it.

  • by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Monday June 08, 2009 @10:41PM (#28260601) Homepage Journal

    Boeing and Airbus have had roughly identical numbers of crashes in recent years. Boeing has had just a fraction more. If one method of flying was better than the other, there would be a difference, right? Since there is no measurable difference, it follows that the differences in a crisis balance out. What is good for one sort of crisis is a disaster in another.

  • by wasted ( 94866 ) on Monday June 08, 2009 @10:42PM (#28260617)

    What a dumb phrase. Do you only want former Air Force pilots who've actually seen combat flying commercial planes? How exactly is that going to keep you up in the air in a civilian airliner experiencing an electronic or mechanical malfunction?

    "Battle tested" may have been used in this context to refer to the long history of human pilots compared to the shorter history of using computers to control aircraft. If it refers to actual combat flight, flying military aircraft teaches one to expect something to break and know how to determine what is broke and what needs to be done to land safely. Military aircraft experience a lot more stresses than civilian aircraft, and thus tend to break more.

    And if what you really mean is experienced pilots, what about a pilot who's been flying for years, has never had an emergency situation, and then makes a judgement error in a critical situation? Are you then going to call for the iron calm of a computer rather than a fallible human pilot?

    In a perfect world, the pilot would recognize a computer mis-evaluation if one occurs due to his simulator training, and over-ride the computer to land safely. In practice, this does not always occur - crashes have resulted from both non-overrides and incorrect overrides.

    No, the answer is statistics. What's safer and more reliable in the long run? How many crashes have we had due to computer error rather than human error given x hours flown by each?

    Although the computer may be statistically safer, if the pilot is able to over-ride obvious computer errors and is trained to recognize those errors, isn't that the best of both worlds?

    ...The linked-to text is much, much better, even though offering people a choice is problematic given how the whole non-refundable ticket system and airline logistics systems currently work, not to mention that making a choice at the gate when you get on the plane will throw off your schedule.

    I always check which type of aircraft will be used on my flight prior to committing to purchase the ticket, and do not fly Airbus. I live near an airline hub, though, so it is easy for me to decide which aircraft to avoid. If a person's local airport has limited service, that choice may not be available to them.

  • The pilot must always have the option of manual override. *PERIOD*

    Well, that depends. Do the humans or the computers have a proven history of fucking up more often?

    Sure, the computer could malfunction. But how frequent is this compared to situations where it does something unexpected and the pilot thinks it's malfunctioning when it actually isn't, and that "something unexpected" is actually needed to keep the plane in the air?

  • by greyhueofdoubt ( 1159527 ) on Monday June 08, 2009 @10:51PM (#28260729) Homepage Journal

    I'm sorry, I didn't think TFS actually presented one side as better than the other. Maybe it's because I work in aerospace and take more things as given than you, but to me TFS raised a very interesting philosophical question. The summary even says that it's not a design flaw - it's a philosophical divide.

    I know several people who HATE flying because- even though they understand intellectually that flying is much safer than driving- the idea of falling to earth, out of control, with many seconds or even minutes to be aware of the terrible situation is much worse than feeling able to control their path in a car.

    I myself feel like I've outgrown that feeling. I've literally entrusted my life to other people so many times that when I get on a plane the idea of dying or crashing doesn't even cross my mind. It is a conscious flipping of a mental switch: I am not in control. This plane is being flown by someone who also does not want to die and that person knows what they're doing.

    And on the other hand I've read enough after-action briefs of computer glitches crashing planes that I don't entirely trust computers to fly. Yes, they have faster response times, yes they can look out for the airplane better than a human. Usually. Usually, they can do those things better than a human. Why WOULDN'T you allow a human into that chain of control? If the computer is going nose down into a mountain because of a frozen AOA probe, the pilot should just sit there and start praying? If the computer starts shutting down engines because of faulty fuel indicators, the pilot should just sit back and say, "Hey, we took off 45 minutes ago with 5000 gallons of fuel and barring an open fuel cap, there's no way we're actually out, but whatever, I'll accept a cold death in the North Atlantic if it saves the engines from a potential flameout"?

    Here's where I sit: Computers should fly planes. Humans should solve problems. Computers are not perfect; if they were, we wouldn't need IT or pilots or astronauts or mathematicians. Someday when AI is improved and flight control computers absolutely do not cause stupid accidents, then I'll allow an empty cockpit.

    What I propose is a compromise, just like the American company, and it has nothing to do with John Wayne or Ayn Rand or any other stupid emotionally-weighted crap.

    Hi, my name is ben, and it's my job to keep people from dying in airplanes, and I'm in favor of pilot intervention to avert crisis.

    Any typos were the computer's fault ;)

    -b

  • by M0b1u5 ( 569472 ) on Monday June 08, 2009 @10:53PM (#28260757) Homepage

    Generally airlines stopped hiring ex-military pilots as they tend to crash too often killing hundreds of people at a time.

    Military pilots find it hard to change from "Achieve objective; fly hard and kill bad guys" to "Land passengers safely at all costs" mentality.

    It's a huge oversimplification to say that US maker Boeing provides the freedom for pilots to fly. By the same token, you might well say that the US is the most over-regulated country on the planet, so why are pilots allowed to fly with such freedom?

    I think that in general, you are arguably better off when the pilots are connected to the flight surfaces via manual controls. Even if the power and hydraulics go out, with enough strength you may move some control surfaces a little - perhaps enough to keep a plane in level flight, maybe even land it.

    But if FBW shits itself - you are TOAST.

    And for every crash caused by pilots not being able to take the control of a plane, there's probably another crash averted by the computer.

    The biggest problem of course is that flying a wide-bodied jet is 99.9999999% pure boredom followed by 0.0000000001% when you live or die because of a series of bad circumstances piling on top of each other.

    If the hardware fails for any reason (pilots get wrong information) then they can't expect to live for long - especially if the computers are flying it. At least if sensors start failing, humans are flexible enough to know something is wrong, and work around it.

    In general, I would prefer to be flying on a wide bodied jet that has the computer fly the entire flight, but with a pilot on board who is exceptionally good at looking at the computer non-stop to decide if it is working right. I expect that pilot to be so good, that he understands the point at which he needs to kick the auto-pilot into touch, and take control of the plane.

    See my signature. It's standard, not put here for this post.

  • by Achromatic1978 ( 916097 ) <robert@@@chromablue...net> on Monday June 08, 2009 @11:04PM (#28260861)
    What the hell airline has you flying as at least an FO with both Boeing and Airbus type certs with only 3,000 hours? It takes 1,500 hours to get an ATP alone, let alone with certs on multiple turbofan birds.

    I would say I want to avoid it, but it sounds like "Virtual Fantasy Airlines".

    Protips:

    • MSFS X is not an actual plane that you are flying.
    • The A330 that flies with propellers is a product only of Howard Hughes grandchildren's wildest imagination.
    • "near vertical dive"? No. Just because it feels that way to you as a passenger, don't make it so. A descent that exceeded 20 degrees would result in reports being filed with the FAA, at the least.

    Thanks for the laughs. Great way to end the day.

  • by dinther ( 738910 ) on Monday June 08, 2009 @11:10PM (#28260913) Homepage

    As an ex airline pilot and current software developer I would say that an override must be available in any system. Of course computers are much better in quick decision making and collecting all the facts than humans are. In fact with a glass cockpit, the computer knows the data before the pilot does anyway. But there is the occasion that software fucks up. Plain and simple.

    From my own personal experience:

    1 - Autopilot with suicide attempt

    Boeing 737-400 cruising at FL310, everything happy, clear skies. I'm pilot flying and the captain suggests I have lunch. With the tray on my lap I eat while glancing at the instruments every once in a while. The captain was supposed to have control. So after a particularly tasty piece of chicken I look up only to see the horizon at an angle and way too high. I glance across and see the captain reading the newspaper. Look at the instruments, which indicate a gentle diving turn. The VNAV path on the displays indicates nothing out of the ordinary, but this autopilot decided to go for a turn and descent anyway. The whole thing would have only lasted a few seconds but there was absolutely no reason for the computer to do this manoeuvre. AP disconnect and reconnect sorted it all out.

    2 - Lazy plane

    Yeah, again during a meal, and again I had handed control over to the captain while eating. This time at night. Cruising FL330 when the auto throttle decides to close the throttles to idle. Autopilot maintains altitude. WTF? I push the throttles back up. They stay up for a few seconds and yet again move to idle. Got rid of my food and disconnected the auto throttle. Set cruising power manually and checked everything. Nothing wrong. Re-engaged the auto throttle and things were fine.

    3 - Dutch roll gone bad

    Climbing through 10,000 feet on autopilot, the plane begins a slight rocking left and right. No more than a few degrees. As we continue to climb the rocking gets worse: 5 degrees of bank either way. The autopilot is working hard to compensate, or so it seems, because the control column moves noticeably. Again my luck to be PF. We thought the autopilot had gone mad, so after strapping ourselves in tightly we disconnected the AP. I tried to hand fly and stabilise but things got out of control rapidly as the plane started to buck left and right well past 10 degrees of bank. I was obviously losing control. Nah, let's face it, I had no control, and told the captain. He took over and at least was able to not let it get worse. Glad I was with this guy, because he flicked off the yaw damper, the automatic control system that counters an aerodynamic effect called Dutch roll. The plane steadied immediately, although we were left with the Dutch roll effect, but that was not too bad.

    So there you go. In all three cases it was not a matter of pilots being better than computers. Overrides are required when the computer goes mad. I always valued having the mechanical controls as a backup in the 737. I have travelled in Airbus aircraft, and although I no longer fly, I would still hesitate to be a servant to a fly-by-wire system.

  • by Anonymous Coward on Monday June 08, 2009 @11:16PM (#28260983)

    Wow, if this is true, this is dumb. The plane is only going to fly as well as the data provided by its sensors. If the airspeed sensors were acting wonky (which Airbus thinks they might have been), then a manual override is the ONLY way to save the plane, as the plane is flying with bad data. Pilots train for all sorts of disasters, and if the sensors are providing bad data, they MUST be able to take over.

  • by Joce640k ( 829181 ) on Monday June 08, 2009 @11:18PM (#28260999) Homepage

    It was a "battle hardened" human who flew the 'plane into the middle of a massive thundercloud in the first place.

  • by Ethanol-fueled ( 1125189 ) on Monday June 08, 2009 @11:19PM (#28261007) Homepage Journal

    well, unless the plane nose dives and the computer proves/indicates it is unreliable.

    Good point. Disclaimer: I am a former Air Force avionics tech, F-15 TISS. Military fighters and civilian airliners are different beasts, but I understand that the F-15 had a quad-redundant (trivia: the transporters in Star Trek: TNG have quad-redundant buffers) flight control computer.

    Google searches reveal that Airbus' flight control computers are quintuple-redundant (two primary and three secondary flight control computers).

    Another factor to take into consideration is that not all airline pilots are experienced. I don't like to dichotomize (like the poor summary of the article, dammit KDawson) but a pilot's first storm could bring hardening experience or crushing defeat.

  • by Roger W Moore ( 538166 ) on Monday June 08, 2009 @11:22PM (#28261029) Journal
    Speaking as a European, it is not an irresponsible headline because, if you read the whole summary, it does present a balanced case: human ingenuity vs. computer speed and multi-tasking. For example there was a mid-air collision (over Brazil?) several years ago caused by a human air traffic controller overriding the automatic collision avoidance instructions, so human ingenuity is not always helpful! The fact that you got upset by this suggests that you think human ingenuity is always the best choice and you are unhappy that Airbus chose not to rely on it - which is your prejudice, not the author's.

    However the snippet is wrong in that it is extremely surprising, given the comparison between US and European cars, where the situation is completely reversed. Drive a US car with a manual transmission and the damn thing won't let you start the engine without having BOTH the clutch depressed AND the gearbox in neutral, which is plain stupid since either is sufficient; I usually just depressed the clutch to start the engine. Not to mention the number of times the stupid thing pings at you: put your keys in the ignition without turning on the engine *PING, PING, PING*, turn off the engine but don't take your keys out fast enough *PING, PING, PING*, put some luggage on the passenger seat *PING, PING, PING* (no seatbelt!), driver not yet irritated enough *PING, PING, PING*. Of course it also pings at you if you leave your lights on, which is useful, but by this time most people have reached under the dashboard and forcibly removed the device which goes *PING* in order to retain their sanity. This makes it about as useful as those stupid dialogue boxes that ask you "Are you sure you want to do that?".
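
    (The interlock logic being complained about, in boolean form; a throwaway sketch, not any manufacturer's actual wiring:)

        # US car as described: BOTH conditions required to close the starter circuit.
        def us_car_starter(clutch_depressed, in_neutral):
            return clutch_depressed and in_neutral

        # The point above: either condition alone already keeps the car from lurching.
        def either_is_enough(clutch_depressed, in_neutral):
            return clutch_depressed or in_neutral

        print(us_car_starter(True, False))     # False: won't start with just the clutch in
        print(either_is_enough(True, False))   # True: already safe to start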

    So given this experience I am extremely surprised that it is the opposite way around with aeroplanes.
  • by NeverVotedBush ( 1041088 ) on Monday June 08, 2009 @11:22PM (#28261031)
    The problem is that humans program the computers and design the systems that provide the sensory input to the computers and the actuating output.

    When you then put those human-designed systems in complete control, you risk the hidden design flaws and software logic errors coming out to play at some of the most inopportune times - basically bounds testing but with real human lives in the balance.
  • by Joce640k ( 829181 ) on Monday June 08, 2009 @11:26PM (#28261057) Homepage

    >"When something isn't right..." ...that part being "stabilizing controls damaged", followed three minutes later by "system that monitors speed, altitude and direction, main flight computer and wing spoilers all failed". And ... for some reason neither the pilot nor the co-pilot managed to send a radio message during that time.

    ref [yahoo.com]

    Yep. I reckon an American pilot in a Boeing could have just flipped a switch and fixed all that. They'd all be relaxing with cold ones as we speak.

  • by aXis100 ( 690904 ) on Monday June 08, 2009 @11:27PM (#28261069)

    Crossing the street and driving a car are both "decided risks" too. It's just that infrequent but large scale fatalities generate more paranoia and subsequently bigger headlines than daily individual fatalities.

  • by gad_zuki! ( 70830 ) on Monday June 08, 2009 @11:35PM (#28261143)

    >I do not think there is much of a conflict among people familiar with the operation and implementation.

    Yes, but this is slashdot where kiddies quote Ayn Rand and Ron Paul for instant +5 insightful and quote generalizations about Americans vs Europeans which would make the worst 80s hack comedian blush. Sadly, there's probably a good thread to be made about computer controls in avionics, but instead we get flamebait from the slashdot editors at the get go.

  • by shaitand ( 626655 ) on Monday June 08, 2009 @11:53PM (#28261277) Journal

    Not to mention that the mentality that an evil computer overlord or glitch will destroy us resulted in launch codes of 0000000 on all US nuclear missiles for how many years, because the stupid humans couldn't remember real codes? Hell, for all we know they might still be 0's and someone is patting himself on the back about how clever he is: now that they officially discovered it, nobody would think the codes are still 0's!

    Are there situations where a human can save the day and a well designed computer can't? You bet. But I guarantee you for every one of those there are a thousand situations where a well designed computer will outperform a human being. Any casino will tell you, play the odds.

  • by NoobixCube ( 1133473 ) on Monday June 08, 2009 @11:56PM (#28261291) Journal

    Every time I see a Wikipedia article flagged for containing "weasel words" I think "God, give me a break, even 'weasel word' is a weasel word", but this summary should be held up as a shining example to all of exactly what a weasel word is and how much they can slant the entire tone.

  • Re:Nagoya crash (Score:5, Insightful)

    by timeOday ( 582209 ) on Monday June 08, 2009 @11:59PM (#28261321)
    Yeah, I've seen those particular examples of deadly bugs. So what? How much trouble would I have finding two examples of pilot error that killed people? The recent regional carrier crash in the US (Colgan) being an obvious example.

    A big difference is, when you fix an engineering bug, you fix it forever, and can replicate the improvement across the whole fleet. When a pilot makes a nonfatal mistake and learns from it, it adds to his experience. But that all walks out the door when he or she retires.

  • by MaskedSlacker ( 911878 ) on Tuesday June 09, 2009 @12:00AM (#28261333)

    It isn't that simple.

    Of course autopilot is safer than a pilot. Has been since they were introduced, and that will only be more and more true as time goes on.

    However, computers are really good at certain things, and really bad at others. In particular, they're good at what you program them to do (assuming the programming is low on bugs, but I'm going to assume that for argument). They are very bad at dealing with situations their programmers did not anticipate.

    Humans, compared to computers at least, are very good at taking old experience and applying it to completely new situations that have never been encountered before. They certainly don't always make the right choice in that situation, but they are at least capable of making a choice.

    Are those scenarios going to be rare? Bet your ass. Take the rarity of plane crashes, and go that rare once more (I'm purely handwaving, but I just want to express how rare I think those situations actually are--astronomically rare). But they do happen.

  • by shaitand ( 626655 ) on Tuesday June 09, 2009 @12:09AM (#28261381) Journal

    They have these magic things called simulators and they are damn near perfect these days. In fact, they are so good that pilots learn to fly in them instead of in real planes now. Simulators are really just a computer program with some fancy hardware on top; flight computers can fly them, and those NTSB crash reports you mentioned... Well, the pocket-protector-wearing weenies who invented, designed, and built this shit we are talking about can load them up as test scenarios to make sure the computer can handle the types of failure that have occurred. They also load up everything their anal geek minds can think of while they are at it. Hell, maybe they even bring in some of the best pilots in the world (it isn't like they are short on bread here) and load up everything THEY can think of... but they don't stop there.

    No no no. Actually, they program the computer to think up shit that could go wrong on its own and test itself. Of course, that's great for today, but what about when shit goes wrong tomorrow? They flash the plane.
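
    A rough sketch of what that kind of automated scenario testing looks like (the failure list and the simulation stub are placeholders, not anyone's real test harness):

        # Exhaustively inject combinations of failures into a (stub) simulation and
        # report the ones the control logic can't handle. Everything here is a placeholder.
        import itertools

        FAILURES = ["pitot_blocked", "engine_out", "hydraulics_low", "aoa_vane_stuck"]

        def simulate(failures):
            """Stand-in for a flight-dynamics run; True means the aircraft stays controllable."""
            return not ({"pitot_blocked", "aoa_vane_stuck"} <= failures)   # toy rule

        def find_uncontrollable_combinations():
            bad = []
            for n in range(len(FAILURES) + 1):
                for combo in itertools.combinations(FAILURES, n):
                    if not simulate(set(combo)):
                        bad.append(combo)
            return bad

        print(find_uncontrollable_combinations())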

  • by identity0 ( 77976 ) on Tuesday June 09, 2009 @12:17AM (#28261445) Journal
    Too bad the trolling/ignorant summary ruined this discussion. However it's based partly on fact. It's common knowledge among pilots that Boeing planes generally cater to pilots' wishes for control more than Airbus, but that has more to do with company attitudes than country. From this article on the crash of US Airways 1549 (an Airbus 320) and the history behind Airbus [vanityfair.com]: "...a charismatic French test and fighter pilot named Bernard Ziegler, now retired, who must stand as one of the great engineers of our time. He was (and is) despised within the French airline-pilots' union, because he openly discussed designing an airplane so easy to fly and crash-resistant that it would be nearly pilot-proof. He did not say 'idiot-proof,' but his attitude was undiplomatic in a country where pilots still wear their uniforms proudly, and it was also unwise, because, as the record has repeatedly shown, if you emphasize to pilots that they are flying a safe design, they will go to great lengths to prove you wrong. In any case, Ziegler had to live under police protection because emotions grew so strong." So clearly, the French take the idea of pilot control just as seriously as Americans do, but Airbus opted to go a different route. I have no idea what the other American and French companies (some now defunct) like Lockheed, Aerospatiale, etc. are like.
  • by |>>? ( 157144 ) on Tuesday June 09, 2009 @12:44AM (#28261609) Homepage

    A well trained pilot would know when to trust the computers and when not to. They would also know how to maneuver and react in situations. It's like the pilot that landed his plane in the river after losing an engine to birds. I don't think a computer would have taken that option, and not only would all the passengers likely have been killed, but bystanders too, as the plane's computer attempted to correct and eventually went down in a populated street.

    This comment looks sensible on the face of it, but I have to disagree with you. I have a pilot license and am familiar with the process of flying. I've never flown a fly-by-wire aircraft, but I've automated a radio broadcast desk - which might not look like it's relevant, but it taught me that "knowing when to trust the computer" is not an obvious state, not in a radio station and I seriously doubt in a cockpit.

    For me the final "aha moment" came when the computer was attempting to tell me something useful, but because I was concentrating on a completely different aspect of interacting with it, I completely missed the information. In my case it caused a few seconds of dead air on a radio station, nothing life threatening, but not human obvious either.

    The challenge is not "when to trust a computer and when not to" - the challenge is "how do you get the information that the computer is using to the human in such a way that they can manage that input stream in a timely fashion?" Stick shakers are an example of making use of an extra input channel.

    Accidents in planes are rarely just one thing going wrong, they generally are a whole string of things. A computer in the mix just exacerbates the issue.

  • by sam0737 ( 648914 ) <samNO@SPAMchowchi.com> on Tuesday June 09, 2009 @12:45AM (#28261613)

    Especially, which limitation would you like to override? It's unclear whether the accident was due to the pilots being unable to override the computers' limits.

    How about this? http://en.wikipedia.org/wiki/China_Airlines_Flight_006 [wikipedia.org]. A Boeing flight, where the pilot's manoeuvre resulted in a 5G load and nearly destroyed the horizontal stabilizers. With less luck, CA006 might have lost the stabilizers entirely, and could have ended up like JAL 123 - http://en.wikipedia.org/wiki/Japan_Airlines_Flight_123 [wikipedia.org].

    Maybe without the 5G manoeuvre the plane would have been lost, but I doubt that an Airbus could have reached a >66 degree bank in the first place (due to the protection of another computer limitation), and hence such a 5G manoeuvre would not have been needed.

    From the article,

    riding around on autopilot all the time pushing buttons does nothing to sharpen your hand flying skills for a possible situation like this when you will need it the most

    But I prefer pilots to sharpen and practice their emergency handling skills in a simulator... not the real thing.

    Flying manually without autopilot in turbulence is like driving at 100mph on an icy road without electronic traction control. I still think the computer is better placed to handle that.

  • by avxo ( 861854 ) on Tuesday June 09, 2009 @12:52AM (#28261647)

    An Airbus is a flawed system created by flawed humans.

    And what does this retarded tautology prove? That imperfect engineers designed imperfect planes flown by imperfect pilots? Humans are imperfect and therefore susceptible to making arithmetic mistakes too. But that doesn't mean that when your grade-school teacher teaches you next week that 2 + 2 is 4, you can stand up and say: "Well, maybe it's not, because you are imperfect and you could be making a mistake."

    When an unexpected bug manifests itself you want the computational device with the most functional capabilities (that is, the human machine which is a superset of the Turing machine) in control.

    As an engineer I know that bugs will manifest themselves -- whether that bug is a pilot having a stroke, a computer botching an FDIV, a ball bearing failing or whatever else. It will, eventually, happen. I won't make blanket statements that a pilot is the device with the most functional capabilities. All I know is what I, as an engineer, can do. So at the design phase, I strive to ensure the highest possible degree of fault-tolerance that I can possibly work into the device, and I carefully work through everything to ensure that I haven't missed something. I'll oversee the implementation to ensure that my specs are followed. And then I will take a handful of finished devices and run them through torture tests. It's all this flawed human can do.

    You blindly assert that the pilot is the one with the most functional capabilities. I call bullshit. Computers, in general, are much better at flying planes. Barring a malfunction (something that the pilot himself is susceptible to) they will always generate the same output given the same input; their actions are the result of the calculations of aeronautical engineers and the distilled experience of hundreds of thousands of flight hours by pilots all over the world; they will integrate many more variables than a human can possibly integrate, and still their reaction time will be considerably faster than even the fastest human pilot. Computers also help pilots by automatically compensating for damage (e.g. the loss of an aileron), leaving the pilot free to actually get the plane safely to the ground. In 2005 Airbus flight control computers compensated for significant damage to an aircraft, allowing the pilots enough control to safely land a plane without a rudder. I'm sure there are other cases I can't think of right this instant.

    Consider the tragedy that was Helios Airways Flight 522 [wikipedia.org]. Our understanding of that accident is that the pilots became disabled after the cabin depressurized. They seemed to disregard warnings from onboard computers and failed to recognize the effects of hypoxia or take steps to correct the problem. The computer was the more functional computational device in this case. Arguably, it should have detected the depressurization, issued automated warnings to ATC, and, if unable to interact with the pilots, it should have reduced altitude, provided it was safe to do so.

    As long as a pilot needs to be in the cockpit, a pilot needs to be able to shut the computer off.

    The pilot and the computer work synergistically and with modern FBW planes there's no way to shut the computer off, but pilots already have override authority and are able to cause the airplane to exceed its performance envelope if necessary. The only difference is that Airbus flight control computers are more aggressive in what they will allow than their Boeing counterparts.

  • by theycallmeB ( 606963 ) on Tuesday June 09, 2009 @12:55AM (#28261661)
    And the article author and the summary are both full of it.

    Perhaps not the most diplomatic response, but it is true enough.

    First, there is absolutely nothing conclusive to say about Air France 447 at this point other than it did indeed crash (thus ruling out alien abduction and time travel). There are no conclusions, or even anything that could really be called a theory, just guesses and hunches ranging from informed to wild-arsed. At this point, nobody can even be certain as to whether the mismatches in indicated airspeed happened before or after the aircraft started to break up. As a WAG-level example, if lightning had damaged the radome at the nose of the aircraft (it has been known to happen), then the three pitot probes could report different velocities not because the probes failed, but rather because the aircraft no longer conforms to the aerodynamic profile the pitot probes are calibrated for.

    Also, the difference between Boeing and Airbus is not as stark as the author would like to think. On both manufacturers' most recent aircraft, in normal flight the computers will automatically do a variety of nifty things (like auto-mixing the use of aileron/rudder inputs, vertical gust load alleviation, etc., to increase efficiency and comfort) in ways entirely transparent to the crew. The differences are at the extreme limits of the flight control laws. There, if the pilots pull on the controls hard enough, a Boeing plane should accept the input even when the computer thinks the input will cause permanent, or even fatal, damage to the aircraft (it will warn the crew, loudly). An Airbus plane will limit the input so as to avoid such damage (and notify the crew it is doing so). There are legitimate arguments for both configurations, and America vs Europe has nothing to do with it (old dog vs new pup might, if you could go so far as to call Airbus a new pup). At the extreme limits it is not a matter of ingenuity versus information, but more a matter of protecting what you have left right now (an unbroken airplane in danger of crashing), or allowing risks that might let you get to a better place (a damaged, but perhaps un-crashed (for now) airplane).
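
    (The two behaviours at the limit can be caricatured in a few lines; the G limit is invented for illustration and neither function is either manufacturer's actual control law:)

        G_LIMIT = 2.5   # invented load-factor limit, for illustration only

        def airbus_style(commanded_g):
            """Hard protection: clamp the command at the limit and tell the crew."""
            if commanded_g > G_LIMIT:
                print("crew alert: load factor limited")
                return G_LIMIT
            return commanded_g

        def boeing_style(commanded_g):
            """Soft protection: warn loudly and add control resistance, but accept the input."""
            if commanded_g > G_LIMIT:
                print("crew alert: exceeding load factor limit")
            return commanded_g

        print(airbus_style(3.2))   # 2.5
        print(boeing_style(3.2))   # 3.2, with the warning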

    In either case, by the time a flight crew encounters the philosophical differences between Boeing's and Airbus' respective control laws, they are already frakked, and in a damned if you do, damned if you don't scenario.

    In both cases, part of the flight computers' programming is there to monitor itself, and its sensors, for failures that would compromise its function. In a situation where the airspeed indicators no longer agree with each other, the computer should automatically reduce any limiting role it has, because the computers' input data is no longer reliable. And as current commercial airliners are reasonably stable in the aerodynamic sense, they can continue to fly even in the event of a total computer failure. Look carefully at cockpit pictures of the shiny new Airbus A380 and you will see a small cluster of old-fashioned instruments amongst all the flat panel displays. The computer can fail, and of all the things on an airliner, the computer is the item most aware of this.
  • Re:Nagoya crash (Score:5, Insightful)

    by timeOday ( 582209 ) on Tuesday June 09, 2009 @01:11AM (#28261731)
    China Airlines Flight 140, cited above, is an example of pilot error overriding autopilot causing a crash. The plane crashed because one pilot pressed the takeoff/go-around button, then the other pilot fought the autopilot, driving the plane into the ground. Apparently the plane would have been fine had they simply let it do what they told it to do.

    That alone makes the anecdotal score 1 to 1.

    Almost any incident of controlled flight into terrain also counts, since autopilots are very good at not absent-mindedly flying into the ground. Eastern Airlines flight 401, which crashed into the Everglades in 1972, is an example of this. The pilot accidentally turned off the altitude hold autopilot and then continued to let the plane fly right into the ground.

  • GIGO (Score:3, Insightful)

    by hoggoth ( 414195 ) on Tuesday June 09, 2009 @01:13AM (#28261749) Journal

    > It's essentially a question of what do you trust most: a human being's ingenuity or a computer's infinitely faster access and reaction to information.

    No, it's who do you trust most: a human pilot's ingenuity in reacting to a novel situation or a human programmer's foresight in accounting for every possible situation.

  • Poorly Researched (Score:1, Insightful)

    by Anonymous Coward on Tuesday June 09, 2009 @01:27AM (#28261843)

    This article is very poorly researched. Even the briefest bit of research would show that although, yes, Airbus planes do generally operate under full fly-by-wire envelope protection, that is not the full story at all. Airbus planes have several fallback modes whose level of computer intervention varies depending on the number of system failures, or by choice of the pilot, meaning that the pilot is free to select any of these modes at will. One of these, called "Direct Law," gives full and direct control of the flight control surfaces to the pilot, with no computer intervention. There is one additional mode, "Mechanical Law," which gives even lower level control to the pilot.

    In the case of the AF crash, the airplane had automatically downgraded its FBW level to "Alternate Law," which is somewhere in between "Normal Law" (full FBW) and "Direct Law."

    For more information (this happened to be the first Google search result.. there are plenty more) :
    http://www.airbusdriver.net/airbus_fltlaws.htm
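
    A rough sketch of that fallback ladder (the trigger conditions here are invented for illustration; the linked page describes the real reconfiguration rules):

        from enum import Enum

        class ControlLaw(Enum):
            NORMAL = "Normal Law: full fly-by-wire envelope protection"
            ALTERNATE = "Alternate Law: reduced protections"
            DIRECT = "Direct Law: stick maps directly to the control surfaces"
            MECHANICAL = "Mechanical: trim wheel and rudder only"

        def reconfigure(airspeed_sources_agree, working_flight_computers):
            # Invented triggers, for illustration only.
            if working_flight_computers == 0:
                return ControlLaw.MECHANICAL
            if working_flight_computers == 1:
                return ControlLaw.DIRECT
            if not airspeed_sources_agree:
                return ControlLaw.ALTERNATE   # the downgrade reported for AF447
            return ControlLaw.NORMAL

        print(reconfigure(airspeed_sources_agree=False, working_flight_computers=5))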

  • by Culture20 ( 968837 ) on Tuesday June 09, 2009 @01:38AM (#28261905)
    quintuple-redundant flight control computers, but how many sensors? If all five computers get the same wrong data, they'll all agree on the same wrong course of action.
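
    That point in a few lines: voting between redundant units only defends against independent failures (a toy example, not a real voting scheme):

        from statistics import median

        def voted_airspeed(readings):
            """Each redundant channel contributes a reading; vote by taking the median."""
            return median(readings)

        print(voted_airspeed([272.0, 271.0, 273.0]))   # healthy sensors: sensible answer
        print(voted_airspeed([52.0, 51.0, 53.0]))      # all pitot tubes iced: confidently wrong
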
  • by syousef ( 465911 ) on Tuesday June 09, 2009 @01:40AM (#28261913) Journal

    On the other hand, the flight computer has the experience of every simulated and real emergency any plane has ever been through. Sure, humans can practice in the simulator as well, but the reality is that costs mean that no individual gets that much time in the simulator

    What utter nonsense! All the computer has is a set of heuristics derived from various situations that have been selected by its human programmers to represent the set of scenarios likely to be encountered. The heuristics aren't perfect. The choices made by the programmers aren't perfect. The computer has no magic database of all accidents that you describe. How the FUCK does this lame bullshit get modded up?

    Due to the magic of software when one flight computer knows how to handle some situation, they all do.

    Are you even paying attention to what you're typing? You're trying to be clever by using the term "magic" to encompass all the knowledge the computers encapsulate, but you've done so in such a way that it makes you sound like a fool who believes there's literally something magical about the software.

    Computers can ONLY do what they're programmed to do. If the situation encountered is not one that was planned for and tested, the computer can make stupid nonsensical judgements that no human of sound mind would ever contemplate making. There's no sophisticated AI flying the computer that understands the context of the flight (even if there are "AI" components in the flight programming).

  • by mrcleaver ( 738705 ) on Tuesday June 09, 2009 @02:03AM (#28262031) Journal
    The Soviet army didn't have medals or anything else during the early stages of WWII. You can do your own research on the topic, but overall it didn't work well for them (along with a lot of other things). It turns out people like being rewarded (even if it's just a colorful emblem) for doing their job well. At the end of the war the Soviets had a whole bunch of different medals for their soldiers and realized that sometimes it's good to not treat everyone as equals. Anyway, just because others could have done what Sully did doesn't mean Sully isn't a hero. Sully ACTUALLY did it, he actually saved hundreds of lives doing what he did, and water landings are not a common occurrence at all. And just because I have the capability of pulling a little girl away from the train tracks doesn't mean I should get the same attention as the guy who actually pulled it off, or that he should get less credit for doing a good deed.
  • by Suzuran ( 163234 ) on Tuesday June 09, 2009 @02:18AM (#28262113)
    No.

    It fails #1 because in Normal Law there is no protection against allowing the airplane to fly into the ground. The computer knows its height above terrain (radar altimeter) and will provide warnings, but it won't stop descending. Of course, they could patch that in...
    There is also the case of allowing the aircraft to run out of fuel. Maybe program it to find the nearest suitable airport and set up an autoland? This would be an interesting experiment.

    That's just off the top of my head. Laws two and three are fine since the computer won't generate commands of its own accord anyway (the humans DID input the flight plan and FMS parameters, after all) and doesn't have weapons with which to kill people, but the "inaction" part of rule 1 is a bear.
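
    The "patch" being mused about is easy to sketch (thresholds invented; real ground-proximity and autoland logic is far more involved):

        def terrain_protection(radar_altitude_ft, descent_rate_fpm, gear_down):
            # Thresholds are invented for illustration.
            if descent_rate_fpm <= 0:
                return "no action"
            seconds_to_impact = radar_altitude_ft / (descent_rate_fpm / 60.0)
            if gear_down:
                return "no action (assume an intentional landing)"
            if seconds_to_impact < 15:
                return "automatic pull-up"
            if seconds_to_impact < 30:
                return "TERRAIN, PULL UP warning"
            return "no action"

        print(terrain_protection(radar_altitude_ft=400, descent_rate_fpm=2400, gear_down=False))
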
  • I would argue that "simple fact". IMHO any pilot who decides to fly directly into a large thunderstorm when going over it is a viable alternative has already committed pilot error, the computer probably let him fly further before crashing than he would have solo.

    I would rather trust my life to a computer whose bugs can be ironed out and which will always perform the same way in a situation than a pilot who may or may not have gotten enough sleep, be drunk, or somehow be distracted. I've seen enough car crashes to know that humans are not the godlike infallible beings that the anti-computer controlled planes group seems to be preaching. Pilot error was blamed for almost 80% of crashes in '04...why would you want to trust your life to something that statistics alone dictate to be more likely to crash?

  • by Joce640k ( 829181 ) on Tuesday June 09, 2009 @02:48AM (#28262279) Homepage

    So.... are we saying that a human pilot should be allowed to fly a plane in a 25 knot window?

    I hope not.

  • by Joce640k ( 829181 ) on Tuesday June 09, 2009 @03:03AM (#28262355) Homepage

    >"we don't know if the Brazilian crash has anything to do with this."

    Well, looking at the automated messages sent by the computer (nb. we didn't hear anything from either the 'infallible' pilot or copilot) I'd say it was damage to the plane, not computer error.

    >I'd like to see a computer know to, and successfully land in the Hudson though!

    Computers don't take decisions on *where* to fly, only how to set the controls. It was a human who flew into the middle of a giant thundercloud, not a computer.

    And, errrr the Hudson landing thing was done in an Airbus. Somehow the pilot managed to steer and land an Airbus with no engines even though the computers were fighting him and obstructing his every move.

    Or maybe they weren't.

  • by Anonymous Coward on Tuesday June 09, 2009 @03:28AM (#28262485)

    No warning alarm? Bollocks...go read the Airbus FCOM.

    Whenever the a/p disengages both pilots receive an alarm... it's a bloody great big red flashing light plus an aural alarm

    FFS!

  • by rew ( 6140 ) <r.e.wolff@BitWizard.nl> on Tuesday June 09, 2009 @03:40AM (#28262541) Homepage

    Although I agree that giving computers some control about the plane is a good idea, there is a tendency to blame everything on pilot error. In theory it is possible to fly the plane to the destination, so if the pilots get the plane in trouble, it's always pilot error.

    Take Turkish Airlines this February near Schiphol, the Netherlands: altimeter fails -> plane decides to do weird things -> pilots react too late. (Actually the pilot-in-training reacts on time; the captain takes over and doesn't expect the autopilot to engage again and give the plane the wrong commands again!)

    It starts with a failure in the plane, but the pilots end up screwing up. If you scream "pilot error", airbus, boeing and the airlines don't have to engage in expensive redesigns/fixes. Just a note to send to all the pilots that they shouldn't do whatever brought down the latest crash.

  • by MadnessASAP ( 1052274 ) <madnessasap@gmail.com> on Tuesday June 09, 2009 @04:25AM (#28262769)

    There's a good reason for this, I'm sure you can guess it so I'll give you a few seconds to come up with it.
    .
    .
    .
    .
    .
    .
    .
    Got it? No? Oh well then let me tell you. It's because you're not supposed to disable it... ever.... at all... under any circumstance.
    The computer is designed to let you push the plane to the limits of its safe operational envelope and no further. In the very unlikely case that the computer/hardware faults, it will automatically switch itself to a series of fallbacks all the way to the closest you can reasonably get to a manual direct-link control, and it will inform the pilot that it has done so. Believe me, these systems are designed by large teams of engineers who have studied this far more than you, I or just about anybody else on Slashdot, and they do not fuck around.

  • by Toreo asesino ( 951231 ) on Tuesday June 09, 2009 @05:17AM (#28263017) Journal

    From where I'm sitting, it seems Boeings fall out of the sky more often and with more devastating results than Airbuses - http://news.bbc.co.uk/2/hi/in_depth/2008892.stm [bbc.co.uk]

    I particularly liked how, when the A320 came down in the Hudson, it was "all thanks to the pilot"... and yes, in part it was, but the minute another Airbus falls out of the sky and it's fatal this time (as crashes often are), it's clearly because of poor design philosophy?

    Meh, this whole thing stinks of US vs EU chest-bashing.

  • Habsheim crash (Score:2, Insightful)

    by AlecC ( 512609 ) <aleccawley@gmail.com> on Tuesday June 09, 2009 @05:25AM (#28263055)

    This is not correct. The engines naturally take a finite time to spool up from low power to climb - about nine seconds. What the computer refused to do was allow the pilot to pull the nose up into what would, at the low speed the plane was at, have been a stalling attitude. It is arguable that, had it allowed it, the plane just might have been able to "bunny hop" the trees and recover as the engines spooled up. More likely, it would have lurched up, stalled, and crashed more violently than it actually did.

    This was a classic case of computer-induced overconfidence. The pilot assumed that the computer would not let him make a mistake, and set the controls to fly as slowly as the computers would let it. Which gave it no spare energy to climb out of trouble. But the computer could not "see" the trees at the end of the runway. As one commentator put it, the pilot flew the plane into a hole in the ground, trusting vainly in the computer to get him out of the impossible state he had put the plane into.

  • by stjobe ( 78285 ) on Tuesday June 09, 2009 @05:46AM (#28263161) Homepage

    (Also, it doesn't help Airbus that they seem to be having many more crashes than Boeing over the last five years.)

    It might SEEM that way, but the FACTS state that Boeing have had quite a few more crashes than Airbus over the last five years and a lot more if we go back even further.

    A simple Google query would have told you this.

  • by IamTheRealMike ( 537420 ) on Tuesday June 09, 2009 @06:52AM (#28263459)

    Hmm. Sounds like the real problem here is that autopilots are not built to explain their decisions? I mean, what if there was a reason for the auto-pilot to be doing those things that made sense, you just didn't know what they were? Wasn't one of the air disasters mentioned earlier in the discussion where the auto-pilot dived to maintain speed after there was ice on the plane, and the pilot overrode it because he didn't understand why?
