Future Tense

The Crash of the Boeing 737 Max Is a Warning to Drivers, Too

Pilots have to understand their autonomous planes. We should understand our autonomous cars.

Photo illustration by Slate. Photos by Jon Flobrant/Unsplash and Alex Jumper/Unsplash.

Recently, I asked my colleagues if they had ever been startled by a robotic driving feature. One described the unpleasant sensation of automatic braking in an Infiniti Q50 Hybrid, which would suddenly slow her down ahead of a tight turn, nearly getting her rear-ended by the New Jersey drivers behind her. Another had been jolted by the aggressive beeping of his Hyundai rental as it warned him not to stray into a nearby lane. The father of a third routinely started his Toyota Camry while the key fob was still inside the house, drove to work, and found he couldn’t start the car when it was time to drive home. Some 40 percent of American drivers have had a similar experience with what the industry calls “advanced driver assistance systems,” or ADAS, according to a survey by Daniel McGehee, director of the National Advanced Driving Simulator at the University of Iowa.

Airplane pilots, who have been working closely with self-driving software for nearly four decades, have a term for such incidents: automation surprises.

One particularly tragic instance of this may have occurred in October, when a Lion Air 737 Max crashed off the coast of Indonesia, killing 189 people. The crash is still under investigation, but speculation quickly turned to a piece of Boeing flight-control software called MCAS that was designed to push the plane’s nose down. Installed as a corrective to the model’s heavy, forward-mounted engines, which can pitch the nose upward, a malfunctioning MCAS might have directed the plane into a nosedive.

Then, on Sunday, an Ethiopian Airlines 737 Max crashed after takeoff from Addis Ababa, killing everyone on board. Once again, all eyes are on MCAS. On Tuesday, the Dallas Morning News uncovered a series of pilot complaints about the plane’s autopilot function. The European Union, China, and others have grounded all 737 Max planes in response to the crash. Boeing has said it will update the plane’s flight-control software.

Boeing had installed MCAS on all its new-model 737s, which debuted in 2017, but insisted the changes were small enough that pilots need not be retrained or even alerted to the new software, including its new override controls. The world’s biggest regulatory agencies, in the United States and the European Union, sided with the company.

The crashes are a reminder that even as commercial flight has become heavily automated, pilots still largely understand the automated systems that fly their planes and know how and when to override them (and, as the Dallas Morning News scoop shows, when those systems aren’t working). The 737 Max crashes might be the exceptions that prove the rule. Generally, this man-and-machine symbiosis has made air travel extremely safe: The United States has had only one airline fatality in the past decade.*

But automation has not made pilots’ jobs easier, says Steve Casner, a pilot and research psychologist at NASA’s Ames Research Center: “You’d think it would dumb down the role of the pilot. Contrary to expectation, you have to know more than ever.”

Casner is one of a number of pilots and analysts who see a parallel between the introduction of automation in airplanes more than 30 years ago and its arrival in cars today, as drivers prepare to relinquish the burdens of navigating the blacktop.

“It’s like 1983 all over again,” Casner told me Monday. Where airlines by and large got it right, he thinks carmakers may be too eager to put humans in cars with unfamiliar technologies. “I’m very concerned that even though aviation has shown us how to do it, we’re about to make a big mistake with cars. Sitting there waiting like a potted plant for the lights to blink is not one of our fortes.”

Together with the cognitive psychologist Edwin Hutchins, Casner is the author of a new paper, “What Do We Tell the Drivers? Toward Minimum Driver Training Standards for Partially Automated Cars.” One of their main points is that automation would not have made commercial flight as safe as it is today without pilots who understood how the systems worked.

“Regardless of what comes out [of the 737 Max crashes], the human-machine interface is going to be a factor,” says Missy Cummings, a former fighter pilot and head of Duke’s Humans and Autonomy Laboratory (HAL!). She has been worried about that issue with autonomous vehicles for years. Not only do drivers get distracted by self-driving tech, misunderstand its capabilities, and ignore its alerts; the technology also gradually erodes their skills, leaving them unprepared to take the wheel when the time comes. (Pilots using automation largely retained their motor skills, but their decision-making skills eroded.)

“They lull people into a false sense of security, and this is a lesson we’ve learned the hard way in aviation,” Cummings explained. “People basically get their driver’s license because they’re alive. The bar is really low. You can enforce that people go through some kind of standardized training, but we just won’t do it, we’re lazy. That’s why I’m against Level 3 autonomy. You just can’t guarantee that humans—not getting checked by the FAA for an annual check ride—are going to do their job.”

But that gets to a major difference between air and car travel. Flying is safer than walking. Automobile travel remains so dangerous, for both drivers and pedestrians, that the bar for new technologies is considerably lower. So if a feature like push-button ignition has killed two dozen people through carbon monoxide poisoning (because drivers leave the car running in the garage), that’s not grounds for a recall. Nor has Tesla been particularly damaged by the deaths of two owners whose cars drove into trucks while the “Autopilot” function was engaged.

Moreover, the self-driving technologies being widely introduced in 2019 models—emergency braking, lane-change alerts, adaptive cruise control—are likely reducing crashes in the aggregate, says McGehee. (Federal regulations don’t require data about advanced driver-assistance tech to be included in crash recordings, so it’s hard to be certain about its role.) It’s a bit Machiavellian, but maybe you have to break a few windshields to make a self-driving utopia.

Most autonomous vehicle scholars say that if the end result of AV research is to cut out humans entirely (as Google’s Waymo aims to do), that will be very good for safety. It’s the interim period that will be awkward as humans adjust to a range of technologies. Few are perfect, some require supervision, none require training.

“Robots make excellent backup drivers to humans. Humans make terrible backup drivers to robots,” says Costa Samaras, an AV specialist at Carnegie Mellon. But with thousands of annual deaths from drunk driving and thousands more from unworn seat belts, it’s hard to see educating drivers about autonomous technology becoming a priority.

To some extent, this has always been the automotive model: Chrysler introduced “autopilot” back in 1958, a technology that would soon be advertised to female drivers as “footless driving.” Today we know it as cruise control. Explain what it is—your car goes 70 mph at the touch of a button!—and cruise control sounds like a death trap. In reality, it’s hard to imagine a long highway drive without it. Sixty years later, who the hell knows how it works?

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.

Correction, March 13, 2019: This article initially stated that U.S. airlines have not experienced a passenger fatality in more than a decade. Jennifer Riordan was killed by an exploding engine on a Southwest Airlines flight in April 2018.
