The Plane Paradox: More Automation Should Mean More Training

Shortly after a Smartlynx Estonia Airbus A320 took off on February 28, 2018, all four of the plane's flight control computers stopped working. Each performed exactly as designed, taking itself offline after (incorrectly) sensing a fault. The culprit, later discovered, was an actuator that had been serviced with oil that was too viscous. A design created to prevent a problem created a problem. Only the skill of the instructor pilot on board prevented a deadly crash.

Now, as the Boeing 737 MAX returns to the skies worldwide following a 21-month grounding, flight training and design are in the crosshairs. Ensuring a safe future for aviation ultimately requires an entirely new approach to automation design, using methods grounded in systems theory, but planes built with that approach are 10 to 15 years off. For now, we need to train pilots to better respond to automation's many inevitable quirks.

Captain Shem Malmquist, a veteran safety and aviation accident investigator and current B-777 captain, teaches at the Florida Institute of Technology and is coauthor of Angle of Attack and Grounded (Lexographic Press) with Roger Rapoport, producer of the feature film Pilot Error.

In researching the MAX, Air France 447, and other crashes, we have spoken with hundreds of pilots, as well as experts at regulatory agencies, manufacturers, and top aviation universities. They agree that the best way to prevent accidents in the short term is to teach pilots how to creatively handle more surprises.

Slow response to overdue pilot training and design reform is a persistent problem. In 2016, a full seven years after Air France 447 went down in the South Atlantic, airlines worldwide began retraining pilots on a new approach to handling high-altitude aerodynamic stalls. Simulator training that Boeing convinced regulators was unnecessary for 737 MAX crews began only after the MAX's second crash, in 2019.

These remedies address only those two specific scenarios. Hundreds of other unforeseen automation-related challenges may be out there that can't be anticipated using traditional risk-analysis methods; past examples include a computer blocking the use of thrust reversers because it "thought" the airplane had not landed. An effective solution must go beyond the limitations of aircraft designers, who cannot create the perfect fail-safe jet. As Captain Chesley Sullenberger points out, automation will never be a panacea for novel situations unanticipated in training.

Paradoxically, as Sullenberger correctly noted in a recent interview with us, "it requires much more training and experience, not less, to fly highly automated planes." Pilots must have a mental model of both the aircraft and its major systems, as well as of how the flight automation works.

Contrary to popular myth, pilot error isn't the cause of most accidents. That belief is a manifestation of hindsight bias and a false faith in linear causality. It is more accurate to say that pilots often find themselves in scenarios that overwhelm them. More automation may very well mean more overwhelming scenarios. This may be one reason why the rate of fatal large commercial airplane crashes per million flights in 2020 was up over 2019.

Pilot training today tends to be scripted and based on known and likely scenarios. Unfortunately, in many recent crashes experienced pilots had zero systems or simulator training for the unexpected challenges they encountered. Why can't designers anticipate the kinds of anomalies that almost took down the Smartlynx aircraft? One problem is that they use obsolete models created before the advent of computers. This approach to anticipating scenarios that could present risk in flight is limited. Currently, the only available model that accounts for novel situations like these is System Theoretic Process Analysis, created by Nancy Leveson at MIT.

Modern jet aircraft developed using classical methods harbor failure scenarios that lie dormant, awaiting the right combination of events. Unlike legacy aircraft built from basic electrical and mechanical components, the automation in these modern jets evaluates a complex series of conditions to "decide" how to behave.

In most modern aircraft, the software that governs how the controls respond behaves differently depending on airspeed, on whether the plane is on the ground or in flight, on whether the flaps are up, and on whether the landing gear is up. Each mode can impose a different set of rules on the software and can lead to unexpected outcomes if the software isn't receiving accurate information.
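The mode-switching logic described above can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration, not the logic of any real avionics system: the mode names, thresholds, and sensor fields are all invented for the example. It shows how a single faulty sensor reading can send mode-based software down an unexpected branch, as in the thrust-reverser incident mentioned earlier.

```python
from dataclasses import dataclass

@dataclass
class SensorData:
    airspeed_kts: float
    on_ground: bool   # e.g. a weight-on-wheels sensor
    flaps_up: bool
    gear_up: bool

def select_control_mode(s: SensorData) -> str:
    """Pick a control-law mode from sensed state; each mode applies
    a different rule set to the pilot's control inputs."""
    if s.on_ground:
        return "GROUND"      # ground logic: thrust reversers permitted
    if not s.flaps_up and s.airspeed_kts < 200:
        return "APPROACH"    # low-speed, high-lift configuration
    return "CRUISE"          # clean configuration

# Correct sensing: the aircraft has landed, so GROUND mode is selected.
landed = SensorData(airspeed_kts=120, on_ground=True,
                    flaps_up=False, gear_up=False)
print(select_control_mode(landed))   # GROUND

# Faulty weight-on-wheels sensor: the physical situation is identical,
# but the software "thinks" the plane is still airborne, stays in an
# in-flight mode, and would inhibit the thrust reversers.
faulty = SensorData(airspeed_kts=120, on_ground=False,
                    flaps_up=False, gear_up=False)
print(select_control_mode(faulty))   # APPROACH
```

The software is doing exactly what it was designed to do in both cases; the failure lies in the gap between the sensed state and the real one, which is why pilots need a mental model of the mode logic itself.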

A pilot who understands these nuances might, for example, consider avoiding a mode change by not retracting the flaps. In the case of the MAX crashes, pilots found themselves in confusing situations: the automation worked perfectly, just not as expected. The software was fed bad information.
