This week, the Biden administration confirmed a Reuters report that it plans to appoint Missy Cummings, an engineering professor at Duke University and a former fighter pilot, as the senior adviser for safety at the National Highway Traffic Safety Administration. The head of Duke’s Humans and Autonomy Lab, Cummings is an expert in human factors, a field examining interactions between people and machines. That’s an important skill set for the development of advanced driver-assistance systems, or ADAS, emerging automotive technologies that rely on safe handoffs between car and driver on the roadway. It’s the crucial bridge between the mostly dumb vehicles we drive today and self-driving cars.
Cummings has studied ADAS for years, and she has been a vocal critic of Tesla’s deployment of its Autopilot feature, which enables a vehicle to moderate speed, make turns, and respond to traffic signals on its own (though, contra the feature’s name, the driver must remain vigilant and ready to intervene). Cummings’ research suggested that Autopilot frequently fails to work as intended, and in 2020 she criticized a robotaxi service that Tesla CEO Elon Musk promised, tweeting, “My lab has been running controlled experiments on Tesla Autopilot & I can say with certainty that they are not even close to being ready. My student on this project should get hazardous duty pay.”
Many technologists and automotive experts are cheering a Cummings appointment to NHTSA. But an extremely online community of Tesla fans is furious. A couple of hours after the news broke, Omar Qazi, a Tesla booster with a large online following, tweeted, “If they try and take Autopilot away from us we will riot so hard January 6 will look like a day at Disneyland,” concluding with a laughing emoji. Qazi later deleted the tweet, issuing an apology and claiming it was a joke.
That may be true, but much of the online Tesla community seemed to be having a meltdown (including more than a few people who employed disturbing and misogynistic language). Within hours, a petition on Change.org called on the Biden administration to reconsider Cummings’ appointment, collecting more than 18,000 signatures in two days. Elon Musk himself tweeted, “Objectively, her track record is extremely biased against Tesla,” and then jokingly responded to a fake account created in Cummings’ name. On Thursday evening, after enduring two days of online harassment, Cummings seemingly deleted her Twitter account.
The hyperventilating reaction shouldn’t come as a surprise, given the cultlike loyalty that Tesla has inculcated in its fans, especially those active on social media (who, to be fair, do not reflect all Tesla supporters). In reality, any senior adviser’s ability to set policy is constrained by the rigidities of the Department of Transportation’s org chart as well as the byzantine federal regulatory process. No one should expect a recall of Autopilot anytime soon, even if such steps appear warranted on safety grounds, as I’ve argued previously. (In a nutshell: Autopilot should have stronger driver-monitoring systems, be given a less misleading name, and only be accessible in safe highway environments.)
But could the Biden administration ultimately force Tesla to pull Autopilot or place constraints on its use? That seems increasingly plausible. Five-year-old guidance from NHTSA articulates the agency’s authority to intervene if autonomous driving systems show evidence of “predictable abuse,” a reasonable charge to levy at Tesla given the array of YouTube videos of drivers asleep or playing games in the driver’s seat, despite warnings in Tesla’s manual. Over the summer NHTSA launched an investigation into a pattern of Teslas striking stationary emergency vehicles, and the agency has challenged the automaker to explain why it didn’t issue a recall for a recent software update. Meanwhile, a growing number of fatalities have been tied to Autopilot, including one in California in which a Tesla Model 3 traveling at 60 mph crashed into a pickup truck and killed one of its occupants (the victim’s family has sued the company). Tesla’s defenders often point to the nearly 40,000 annual traffic fatalities in the United States, suggesting that Autopilot is safer than human drivers, but evidence for that claim is lacking.
Pressure is coming from other federal directions as well: The National Transportation Safety Board’s new chair, Jennifer Homendy, has criticized Tesla for letting untrained owners use Full Self-Driving (the company’s ADAS for urban environments) on public roads, while Democratic Sens. Edward Markey and Richard Blumenthal have asked the Federal Trade Commission to investigate Tesla for false advertising (by the company’s own admission, Full Self-Driving doesn’t actually allow the car to be self-driven).
If there was ever any doubt, the online firestorm over Cummings’ appointment shows that a federal crackdown on Tesla will meet fierce opposition—much of it coming from the hundreds of thousands of Americans who own one of the company’s vehicles (all Teslas purchased after October 2016 come equipped with Autopilot). And that pushback now seems inevitable, despite ample evidence that Tesla’s deployment of Autopilot and Full Self-Driving is endangering road users.
If NHTSA ultimately forces Tesla to remove or constrain the use of Autopilot, thousands of Tesla owners will rebel against being deprived of something they thought they owned—even though they never should have had access to it in the first place.
Qazi may have been joking about storming the Capitol if “they try and take Autopilot from us,” but he was tapping into a powerful psychological truth: People really don’t like losing something they already possess. In a seminal 1990 paper, behavioral economists Daniel Kahneman, Jack Knetsch, and Richard Thaler found that, contrary to traditional economic theory, test subjects placed a higher value on a coffee mug if they were given it at the outset of an experiment instead of having a chance to acquire it later. Kahneman and Thaler went on to receive the Nobel Prize, and the phenomenon they described is now known as the endowment effect.
I myself have watched a hard-charging transportation company exploit the endowment effect to convert customer enthusiasm into political power. In 2011, I was working in the D.C. mayor’s office when Uber brought its ride-hailing service to the city. Rather than seek regulators’ permission to operate, Uber went straight to potential users, throwing parties, distributing free trip vouchers, and quickly building an enthusiastic customer base.
When the expected government crackdown came—in the form of a sting operation mounted by the chair of the D.C. Taxicab Commission—Uber was ready. The company mobilized its fans, telling them that overzealous regulators could rob them of their beloved service. Fans of Uber were directed to send emails, write letters, and call city officials. I received some of these angry complaints, but not as many as one of my colleagues, who came to bemoan “the Uber zombie horde.” The company’s approach didn’t win many friends in City Hall, but it worked: Within a few months D.C. codified ride-hailing’s legality. Similar stories played out in cities nationwide, many of whose leaders would probably like to have a do-over now that ride-hailing’s immense societal costs have become clear.
While Uber strategically cultivated a base of popular support to shield it from regulators, Tesla may or may not have had similar intentions with its permissive deployment of Autopilot (the company might have been seeking to win over investors or appear ahead of competitors).
Regardless of the reason, Tesla has taken a far more lax approach to deploying its ADAS than other carmakers, brushing aside calls from the National Transportation Safety Board to install better driver-monitoring systems and limit Autopilot’s use to the highway environments for which it is designed. By refusing to apply such constraints, Tesla has increased the risk of drivers dangerously misusing Autopilot—but it has also made activating it as easy as pressing a button, giving owners the sense that they possess an unfettered tool.
If NHTSA forces Tesla to limit Autopilot—and regardless of the strength of NHTSA’s claims—many owners will feel they are being deprived of something that was “theirs,” much like Uber customers in D.C. a decade ago. Tesla could harness the resulting anger for political power. The company might direct owners to focus their ire on the Biden administration, demanding a regulatory reversal.
Or, quite possibly, Tesla could turn to Republicans. Musk already seems to be playing footsie with the party, complaining about being left out of a White House event about electric vehicles and referring to Biden as “sleeping,” a nod to one of Donald Trump’s favorite epithets. Texas Gov. Greg Abbott, who signed into law a bounty program on abortion providers, claims that “Elon consistently tells me that he likes the social policies in the state of Texas,” a statement that Musk did not rebut.
However the coming months unfold, four years of lax oversight at the Department of Transportation under Trump have allowed Tesla to distribute Autopilot to hundreds of thousands of owners, despite clear safety risks. The horse has left the barn; the value these owners place on Autopilot is now amplified by the endowment effect. That gives Tesla a unique power compared with other automakers that have taken a more responsible, safety-conscious approach to their ADAS development. Such companies have behaved more ethically, but they’ve received no competitive benefit. Tesla has dangerously exploited the Trump administration’s failure to set guardrails in the fast-evolving ADAS market, and the Biden administration is now forced to clean up the mess.
You don’t have to look far into the past to find an instance where federal regulators brought the hammer down on a leading-edge motor vehicle company. In 2020, an autonomous shuttle from EasyMile stopped unexpectedly in Columbus, Ohio, injuring a passenger. NHTSA responded by temporarily ordering the company to cease operations nationwide. Few people noticed, in part because only a handful had ridden in an autonomous shuttle and almost no one did so habitually, which negates any endowment effect. There was thus a negligible sense of loss when the service ended.
It will be another story entirely if the Biden administration throws the book at Tesla, demanding that the company finally address Autopilot’s safety risks. Many Tesla owners will be up in arms—even if they don’t storm the Capitol. My advice to the public officials finally giving Tesla the scrutiny it deserves: Brace yourselves. And carry on.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.