Airline Pilots Rely Too Much On Automation, Says Safety Panel

Hugh Pickens DOT Com writes "Nearly everyone connected to the aviation industry agrees that automation has helped dramatically improve airline safety over the past 30 years, but Tom Costello reports at NBC News that, according to a new Federal Aviation Administration report, commercial airline pilots rely too much on automation in the cockpit and are losing basic flying skills. Relying too heavily on computer-driven flight decks now poses the biggest threat to airliner safety worldwide, the study concluded. The results can range from degraded manual-flying skills to poor decision-making to possible erosion of confidence among some aviators when automation abruptly malfunctions or disconnects during an emergency. 'Pilots sometimes rely too much on automated systems,' says the report, adding that some pilots 'lack sufficient or in-depth knowledge and skills' to properly control their plane's trajectory. Basic piloting errors are thought to have contributed to the crash of an Air France Airbus A330 over the Atlantic in 2009, which killed all 228 aboard, as well as a commuter plane crash in Buffalo, NY, that same year. Tom Casey, a retired airline pilot who flew the giant Boeing 777, said he once kept track of how rarely he had to touch the controls on an autopilot flight from New York to London. From takeoff to landing, he said, he only had to touch the controls seven times. 'There were seven moments when I actually touched the airplane — and the plane flew beautifully,' he said. 'Now that is being in command of a system, of wonderful computers that do a great job — but that isn't flying.' Real flying is exemplified by Capt. Chesley Sullenberger, says Casey, who famously landed his US Airways plane without engines on the Hudson River and saved all the passengers in what came to be known as the 'Miracle on the Hudson.' The new report calls for more manual flying by pilots — in the cockpit and in simulations.
The FAA says the agency and industry representatives will work on next steps to make training programs stronger in the interest of safety."
This discussion has been archived. No new comments can be posted.

  • It goes both ways (Score:5, Interesting)

    by dunkelfalke ( 91624 ) on Thursday November 21, 2013 @09:16AM (#45480435)

    http://en.wikipedia.org/wiki/Aeroflot_Flight_593 [wikipedia.org]

    "Despite the struggles of both pilots to save the aircraft, it was later concluded that if they had just let go of the control column, the autopilot would have automatically taken action to prevent stalling, thus avoiding the accident"

    And reading this:
    http://en.wikipedia.org/wiki/Pulkovo_Aviation_Enterprise_Flight_612 [wikipedia.org]

    I'd rather have a computer flying the airplane I am sitting in than a hairless ape.

  • by Chrisq ( 894406 ) on Thursday November 21, 2013 @09:22AM (#45480477)
    In a crash, the fact that it is an Airbus has to be mentioned. When an Airbus behaves well under dramatic conditions, it becomes a "US Airways plane"!
  • Re:In the SIMULATOR? (Score:3, Interesting)

    by Shinobi ( 19308 ) on Thursday November 21, 2013 @09:26AM (#45480499)

    A human can get an appreciation of velocity even without working pitot tubes, in the middle of a weather system where GPS doesn't work. The flight computer can't handle that, which is why it disconnected and sounded a warning in the case of the Air France flight.

    In the case of the Hudson River landing, bird strikes took out both engines simultaneously, killing power. The pilot manually switched over to the APU. Ironically, however, in that case the computer helped the pilot ditch the plane safely once it had power again. With just the pilot, or just the flight computers, there would most likely have been dead people in the water.

  • by TheloniousToady ( 3343045 ) on Thursday November 21, 2013 @09:30AM (#45480515)

    Likewise, the automation is not designed to handle extreme failures of the aircraft. The situation many years ago in Iowa, where the hydraulics failed and the pilot had to steer the plane using only the engine throttles, is an example of something no computer system is designed to do. Yet a veteran pilot managed to pull it off.

  • by Loundry ( 4143 ) on Thursday November 21, 2013 @09:30AM (#45480517) Journal

    My son is 13 years old and has been training to be a pilot since he was 11. He has taken off and landed a small airplane (with the PIC in the airplane with him, of course) quite a few times. It just goes to show that landing an airplane isn't as difficult as some people think it is ... it just requires focus and passion. Both of which my son has in spades when he's flying an airplane.

    This news story struck me as wonderful news. My son has wanted to be a pilot since he was three years old. If you are one of the lucky few (I am not) who knew what he wanted to be for his whole life, then I envy you as much as I envy my son for having a singular great dream. The notion of drones and computerized pilots scares me because it threatens that dream. Stories in which autopilots and drones are slandered make me happy.

  • The Problems (Score:5, Interesting)

    by Anonymous Coward on Thursday November 21, 2013 @09:37AM (#45480557)
    I'm a manager at a world-leading flight training company serving major airlines around the world. We train cadets from scratch on small aircraft and flight simulators to develop these basic skills and beyond (e.g. ATPL and HPAT, type-specific training, etc.). I assist with developing syllabi and ensuring their compliance with numerous safety authorities all over the world. We looked into the Air France disaster to see how we could improve our syllabi to give students the skills to handle these atypical situations. To make a long story short, the growing trend of airlines wanting to cut training costs and even remove what they call "unnecessary" training from syllabi is what is leading to this problem. The MPL is the prime example. This is my solution:
    - Stop treating us like a factory. Each student is different and can take longer to learn certain concepts. Fixed-length integrated courses don't work if they don't have good margins for this.
    - English is the language of aviation. If you bring us cadets who can't speak it, we have to teach them English within your timetable, which degrades outcomes.
    - Redo the MPL and bring back spinning, hand-and-feet skills, etc.
    - While the MPL has a heavy focus on simulators, simulators need to be a much bigger part of pilots' renewals and professional development once they start working, to reinforce what they learnt during the early stages of their training and career.
    - Some airlines have poor quality control in their recruitment phases, are susceptible to corruption, or have too many "token" cadets. Some people just aren't cut out to be pilots; identify this early, not late.
    - Airline and safety authority audits are a joke; Standards/QA Manager(s) should be mandatory. I've seen our competitors teach students very bad techniques because of a bad instructor or two, and it poisons entire batches of students. Auditing needs to be proactive and integrated into systems and workflows, not just a visit a few times a year to look through paper records, or merely reactive after a safety incident.
    - Remember, the training doesn't stop when the student finishes their course. Operators and manufacturers (Airbus, I'm looking at you) need to stop treating pilots like bus drivers and stop focusing only on fuel optimisation.
    - This is minor but still important: shock material. We aren't allowed to show students imagery of air disasters any more. It can be, and usually is, gruesome, but it is statistically effective: safety incidents among classes shown this material were half those of classes that weren't.

    This opinion is my own and doesn't reflect that of my employer; I'm posting anonymously because our media policy prohibits these types of comments. I'd love to hear people's feedback on how training could be further improved. It's what gets me up in the morning, trying to fight the system.
  • Re:It goes both ways (Score:5, Interesting)

    by nedlohs ( 1335013 ) on Thursday November 21, 2013 @09:45AM (#45480611)

    But then you have things like http://en.wikipedia.org/wiki/Turkish_Airlines_Flight_1951 [wikipedia.org] in which the autopilot decided that 2000 feet high was a good place to do a landing flare, shortly followed by the expected plummet to the ground.

    What you should have instead is the computer flying the plane, with a competent human pilot to save the day when something goes wrong (usually with the various sensors the computer is using). But of course, and it's what the article is about, if the plane is almost always under computer control, how do you keep the human pilots competent? As your examples and mine both point out, incompetent crews make things worse and don't save the day when the computer has issues either.

  • Re:In the SIMULATOR? (Score:5, Interesting)

    by Shinobi ( 19308 ) on Thursday November 21, 2013 @10:05AM (#45480739)

    "The flight computer can't handle that yet. I mean, where does the human's appreciation of velocity come from? Well, three sources: experience (which is just collected data), knowledge of physical relations (that's the easiest thing to program in), and input from the senses (which is essentially sensor data). Nothing which could not be replicated in software. The point is that the computer would have to be programmed to estimate missing data from one sensor using available data from other sensors (and also simple check routines to estimate the reliability of data; but I guess those are already built in, to know when to give up control to the pilot). The more sensors are available, the better."

    The computer is already programmed to use multiple sensors, such as multiple pitot tubes. Despite research, pitot tubes are still the most reliable sensors we have for this application; GPS is way too unreliable. And in the case of Air France, all 3 pitot tubes froze over, making the flight computer completely blind (and forget about GPS or other radio-based navigational aids in the weather they were in, in the region they were in...)
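    To make that redundancy point concrete, here is a toy median-voting sketch over three airspeed readings. Everything in it (the names, the 20-knot threshold) is invented for illustration, not taken from any real avionics, but it shows the trap the parent describes: when all the sensors fail the same way, they still agree, and voting alone cannot detect the fault.

```python
# Toy sketch of majority voting over redundant airspeed sensors.
# All names and thresholds are invented; this is not how any real
# flight computer is implemented.
from statistics import median

DISAGREE_KNOTS = 20.0  # hypothetical miscompare threshold

def voted_airspeed(readings):
    """Return (airspeed, valid) from redundant readings.

    If any reading strays too far from the median, the whole set is
    declared unreliable and control should revert to the pilot.
    """
    mid = median(readings)
    if any(abs(r - mid) > DISAGREE_KNOTS for r in readings):
        return mid, False  # miscompare -> autopilot disconnect
    return mid, True

# One sensor drifting: the miscompare is caught and flagged invalid.
print(voted_airspeed([255.0, 257.0, 100.0]))
# All three iced over and agreeing on a wrong value: voting passes.
# Redundancy protects against independent failures, not common-mode
# ones like simultaneous pitot icing.
print(voted_airspeed([90.0, 91.0, 92.0]))
```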

    Also, experience is not just collected data. Experience is the knowledge extracted through sifting and analysis of the collected data, perhaps generalised and abstracted as well, so it can be adapted in whole or in part to other situations. A rookie trooper who's gone through training has collected lots of data, but is still completely inexperienced until he or she has been through the real deal and seen what worked, what didn't, how it worked, and what can be learned from it. Same thing with pilots. To equal a pilot's decision-making, you'd need a beefy cluster to handle the expert system, the image recognition, and the processing of all the sensor data to give better spatial awareness and to recognize, for example, a suitable improvised landing strip.

  • by Anonymous Coward on Thursday November 21, 2013 @10:23AM (#45480883)

    More automation already means that pilots gain less experience, including in unforeseen circumstances. That is exactly what the AF447 crew ran into. The juniors didn't catch on, and when the old man finally got back, he didn't regain oversight in time either. A veteran pilot would have been able to pull the thing out of its deadly course, provided he'd known what was going on. Worse, more automation will mean fewer pilots of such veteran ability around.

    So this is a bit of a turning point. Either more automation, in which case we'd better work hard on making it handle as many situations as possible, not just the common ones, and then give it full authority. Or more emphasis on pilot training, and having pilots fly often so they stay current. The middle way would be both, which is probably harder to do well.

  • by tibit ( 1762298 ) on Thursday November 21, 2013 @11:01AM (#45481237)

    It's not about staying in practice. The problem is much more immediate. To interact with any system in which you're to be part of the control loop, your brain needs to be preset for control. That means you need to know and feel exactly what state the system you're about to take control of is in. It's very hard to maintain this awareness if you're not actually controlling the process. You need to be ahead of the plane, so to speak.

    This very same problem is present in all man-machine interaction when control tasks are involved. This is the reason, for example, that "taking over" a self-driving car while it is underway is pointless: you need to be pretty much driving the car without actually driving it, so you might as well be the driver without the self-driving brouhaha. Otherwise, by the time you figure out what's going on, you'll be dead. You can only take over a self-driving car when it's stopped. Even then you'll be quite likely to get lost or to execute a wrong turn or maneuver, since you're unlikely to know where you are, unless you're on a road you frequent.

    What it really boils down to is something else entirely: people use "common sense" to judge things they have zero experience with. If you ask "common sense", it would be "cool" to have self-flying planes, self-driving cars, etc. But common sense is precisely the wrong tool for judging such things. Reality is quite far from common sense until you've had a chance to experience it firsthand. The widespread, non-specialist, common-sense thinking about self-controlling systems is usually wildly off-base. Reality is under no obligation to make sense to anyone, so to speak. Thus some things that should be "common-sense-easy" are very far from being so. Self-controlling systems often bring with them a whole lot of extra issues that nobody had any idea of until they faced them. The aviation industry has only recently come out of its automation-related self-denial, 20 years after it was all understood. That's the risk of relying on common sense over facts.

  • Re:In the SIMULATOR? (Score:5, Interesting)

    by bickerdyke ( 670000 ) on Thursday November 21, 2013 @11:05AM (#45481269)

    I think automated cars would have to cope with far MORE variables and complications.

    Planes receive a unique flight plan and detailed instructions for takeoff and landing, steered by central traffic control to make sure there won't be any other planes nearby. That's possible because EVERY plane has to receive instructions from them.

    So basically, automated planes would hardly need to consider other planes. They do in a rather simple way (TCAS), but only as a last line of defense. And even if that emergency system triggers, it sends one plane "up" and the other "down", which obviously aren't evasion options for cars. (And blindly going "left" or "right" isn't an option either, since on roads you usually have to expect curbs, ditches, more cars in more lanes, or pedestrians.)
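    As a toy illustration of that last-line-of-defense idea (this is an invented sketch, not the actual TCAS II logic, which coordinates advisories over the Mode S data link and weighs closure rates and altitudes): the two conflicting aircraft are simply given complementary vertical directions, so the advisories can never steer the pair toward each other.

```python
# Invented sketch of a coordinated resolution advisory; not real TCAS II.
def resolution_advisories(alt_a_ft, alt_b_ft):
    """Send the higher aircraft up and the lower one down,
    guaranteeing the two escape maneuvers are complementary."""
    if alt_a_ft >= alt_b_ft:
        return "CLIMB", "DESCEND"
    return "DESCEND", "CLIMB"

# Aircraft A at 34,000 ft, B at 33,500 ft: A climbs, B descends.
print(resolution_advisories(34000, 33500))
```

    A car has no equivalent pair of always-clear escape directions, which is the parent's point.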

    Additionally, all the information a plane needs is already available in electronic maps. Pilots hardly ever have to react to speed limits posted on traffic signs (which may be dirty or partly obscured and all that stuff).

    The final proof is right in the summary: we already have commercial airplanes that fly almost completely automated! (Having to touch the actual controls no more than seven times between New York and London is almost completely automatic.) Whereas automated cars were unthinkable until a few years ago, and today they've only just moved from the "experimental" to the "testing" stage.

    But there is ONE THING that makes autonomous cars safer than planes: cutting off the engine is a safe failure mode. (Especially if it can be propagated to surrounding cars by radio, so blindly jumping out of your exploding Tesla onto a busy highway is rather safe when information about an emergency stop has been broadcast to the cars around you and they stop, too.)

  • Disagree (Score:4, Interesting)

    by DaMattster ( 977781 ) on Thursday November 21, 2013 @11:16AM (#45481367)
    Actually, I have several friends who are Airbus 330 and 321 captains and who would like to have more control over the plane and less automation. To some degree, they are hamstrung by the manufacturer, Airbus Industrie, which is relegating pilots to "flight management" duties instead of actual "stick and rudder" flying. Most pilots I know lean towards type A and would much rather have control over their plane than hand it over to avionics and flight management systems.
  • by SomeoneFromBelgium ( 3420851 ) on Thursday November 21, 2013 @11:38AM (#45481633)

    In a rare surge of honesty, the Discovery Channel reconstruction of the Hudson miracle concedes that the fact that the pilot had activated the APU, meaning the A320's advanced anti-stall protection was active, clearly contributed to the miracle.
    At the time of hitting the water, the plane was flying slower and at a higher angle of attack than a human would safely be able to manage...

  • by Savage-Rabbit ( 308260 ) on Thursday November 21, 2013 @11:46AM (#45481725)

    Automation fails from time to time, and when it does, pilots are the failsafe. But to be able to do that, they need to stay in practice, and that's the problem being highlighted here: they're getting so little time in control that they're getting out of shape.

    Right: build or contract a small fleet of trainers (perhaps twin turboprops or two-seat trainers like PC-7s or Tucanos, or even set aside an old 737 or something in that class) and make these people fly their asses off once in a while. I'm sure simulators are great learning tools, but there is no substitute for taking a plane up and actually practicing things like engine restarts, flying on one engine, simulating an emergency descent after a rapid decompression, or just boning up on basic aerobatics (the value of practical experience is one of a number of reasons the military hasn't replaced exercises like Maple Flag with simulator-only LAN parties). That should take care of any 'bureaucratification' problems your pilots are suffering from.

  • by jcdr ( 178250 ) on Thursday November 21, 2013 @12:57PM (#45482437)

    First sorry for my English.

    Automation works very well in fully tested conditions and brings many advantages in terms of safety, cost and comfort. The problem is that real life is not always contained within the fully tested conditions, and even a massive, continuous development effort will never prove that assertion false.

    The current state of aircraft operation is that the human bears full responsibility for engaging the automation, monitoring its work, disabling it when it is not appropriate, and manually operating the aircraft. This is mainly because today's automation doesn't include the capability to handle those meta-tasks itself. There is technically no reason not to include them, though, and I believe future automation will take this direction. The consequence is that humans will have even fewer opportunities to operate an aircraft hands-on, and the remaining out-of-tested-condition cases will be such unmanageable situations that only a few exceptional pilots will be able to survive them. Until that extreme level of automation is in operation, we will inevitably see pilot errors due to untrained operation, as with AF447, as in Kazan a few days ago, and as in many other accidents...

    What is important to understand here is that the concept of "untrained operation" (or not trained enough) for a human is not so different from "untested conditions" for an automatic system. For the aircraft's essential physics (aerodynamics and engines), it makes no difference whether the action (or inaction) comes from a human or a computer. The point is knowing the right action to take at each moment of the operation. The only solution is very deep knowledge in a lot of specific fields, a massive quantity of information to choose from, and a very quick reaction time to analyze it all. The human brain can achieve fantastic things, but it still has several huge limitations: it is unable to focus on a task for a long time, sensitive to external stress, limited in precision and repeatability, and usually slow and error-prone in untrained operation. Automation yields better results on most of those metrics, but is completely unable to handle untrained operations (outside tested conditions).

    Did you get the idea? Having a slow and error-prone human trying to resolve an untrained situation is better than having only automation that will do nothing relevant at all. This is what we commonly call intelligence: trying to solve something new. (A note: while our bodies have evolved some basic survival reflexes for emergencies, they are really not effective for operating a modern aircraft; don't confuse them with the required intelligence.) At this stage you can perhaps feel the problem: outside the automation's tested conditions, the automation is useless and the human is a mediocre performer, but we have no other choice yet. Training the pilot to replace automation that is working within tested conditions is not the solution, because the real problem doesn't lie inside the tested conditions, but outside them.

    Now a level higher. Training a pilot for an unexpected situation is a long process. From a very general point of view, you can decompose it into some basic parts: 1) recognize the situation; 2) select the appropriate action; 3) perform the selected action. In practice this is implemented as a written procedure that the pilot trains on. What is important to understand here is that training this way makes an unexpected situation managed more by experience than by intelligence, because experience is fast, while intelligence is slow. We essentially try to extend the "tested conditions" manageable by the brain, much like we can extend the tested conditions of an automaton. I predict that in the future, computers will be less limited than the human brain in the extension of the teste
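    The three-step decomposition above can be sketched as code (a toy model; the situations and steps are invented, not from any real checklist): a trained procedure is a fast lookup, which is experience, while an unrecognized situation falls through to slow, error-prone improvisation, which is intelligence.

```python
# Toy model: experience as a fast lookup table, intelligence as a
# slow fallback. Situations and steps are invented for illustration.
TRAINED_PROCEDURES = {
    "engine_fire": ["throttle idle", "fuel cutoff", "discharge fire bottle"],
    "rapid_decompression": ["don oxygen masks", "begin emergency descent"],
}

def respond(situation):
    # 1) recognize the situation, 2) select the action, 3) perform it
    steps = TRAINED_PROCEDURES.get(situation)
    if steps is not None:
        return "procedure", steps  # fast: pattern-matched experience
    # Slow path: reason about the problem from first principles.
    return "improvise", ["diagnose from raw data", "reason it out"]
```

    Extending the lookup table is exactly what both simulator training (for humans) and certification testing (for automation) do; neither helps with situations outside it.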

  • by NoImNotNineVolt ( 832851 ) on Thursday November 21, 2013 @02:07PM (#45483027) Homepage

    a heavily modified 747 Boeing uses for cargo hauling it is manufacturing process.

    Okay, incorrect usage of "it's" in place of "its" is irritating enough, but expanding it to "it is"?!?!?
