On Wednesday the National Highway Traffic Safety Administration announced a massive recall of Teslas equipped with Full Self-Driving Beta, the technology that enables vehicles to control some aspects of driving, such as turning and adjusting speed, in urban environments. The FSD package, which currently costs Tesla owners an additional $15,000 when they buy their cars, requires the driver to be watching the road at all times (although Tesla enthusiasts have figured out ways to trick the cars’ attention guardrails for years). The NHTSA recall affects over 360,000 Teslas with FSD, which is pretty much all of them.
Critics have long warned that FSD is dangerous, and the recall’s language suggests they were right. According to NHTSA, FSD “may allow the vehicle to act unsafe around intersections, such as driving straight through an intersection while in a turn-only lane.” That sounds bad, as do other FSD behaviors cited by the federal car-safety agency, including speeding, rolling through stop signs, and running yellow traffic lights “without due caution.”
The recall is voluntary, meaning that it was jointly agreed upon by Tesla and NHTSA. The “remedy” will be a free over-the-air software update for Tesla owners, who will be notified of its availability by April 15.
One can reasonably assume that FSD owners will receive some kind of patch by that date. But how confident can they or the public be that it actually fixes the serious problems NHTSA identified? Missy Cummings, a professor of engineering at George Mason University who recently left a position as a senior advisor at NHTSA, has her doubts.
“Even if you work 24/7 for the next 60 days, I’m not sure there are enough hours to adequately address all the issues NHTSA has raised,” she told me.
Assuming Tesla does claim to have resolved the FSD problems that NHTSA has flagged, Americans may well have to take the company’s word for it. Why is that? Because the U.S. does not require that automated car technology be tested and approved for safety before being offered to the public. That goes for initial systems as well as for over-the-air updates.
Tesla’s big recall should serve as a wake-up call. A recklessly designed “autonomous” system shouldn’t have to be installed on hundreds of thousands of vehicles before the feds intervene. We can prevent that from happening by requiring that these technologies receive pre-approval before they are sold to the public.
Notably, there is no similar FSD recall in the European Union, because Tesla hasn’t received the green light to offer it there. Until regulators grant that permission, Tesla can’t sell FSD to Europeans. During a speech last year in Berlin, Tesla CEO Elon Musk himself summarized the difference in transatlantic car regulations: “In the U.S. things are legal by default, and in Europe they’re illegal by default.”
In fact, the U.S. does use pre-approval to improve transportation safety—just not for cars. If a plane manufacturer is designing a new piece of software or hardware, the company must work closely with the Federal Aviation Administration to get the go-ahead prior to deployment. This system, sometimes called type approval or type certification, does not always work perfectly (see the 737 Max), but the impressive safety record of American aviation speaks for itself: The U.S. experiences under 0.1 deaths per billion passenger miles, less than 1/100th the risk of dying in a car crash.
But for autos, the U.S. has basically said to carmakers, “You’re good. We trust you.” Manufacturers place a sticker on each new vehicle stating that it complies with the Federal Motor Vehicle Safety Standards, and they’re all set. Carmakers do generally abide by FMVSS, but—and this is a giant flashing “but”—there is nothing within it pertaining to autonomous driving technology or so-called Advanced Driver Assistance Systems (ADAS) like Tesla Full Self-Driving. As a result, carmakers are free to design and install whatever technology they like, as long as their vehicles conform to the outdated FMVSS, which was drafted under the assumption that a driver is always handling the car.
Unless and until NHTSA identifies a pattern of failures, which the new recall suggests has happened with FSD, the agency cannot protect Americans from recklessly designed software or ADAS technology. Tesla’s defenders might argue that those who bought FSD accepted its risks, but that cannot be true for others on the road, including drivers, cyclists, pedestrians, and emergency responders.
Requiring pre-approval would fix this problem by forcing carmakers to demonstrate the safety of their new technologies before the public is exposed to them. As intuitive as that might sound, adopting such a standard in the U.S. would be no easy task. Congress would need to get behind it, granting NHTSA new authorities and expanding its workforce. Such a proposal is all but guaranteed to face ferocious opposition from carmakers fearing that its adoption would add time and expense to their efforts to bring new technologies to market.
But again, pre-approval seems to have worked well for aviation. Are cars really so different?
Cummings, who is a former Navy fighter pilot, thinks that they increasingly are not. “Because cars are on the road every day, we think of them as less complex than planes,” she said. “But cars with autonomy are extremely complex. The amount of code that goes into these systems is phenomenal.”
And so we face a choice. As we enter the brave new world of automated driving, should the U.S. err on the side of protecting road users—including those who never agreed to be guinea pigs as they walked, biked, or drove their car—or should we instead help automakers deploy their newest technologies on public roads as quickly as they can?
As the Tesla FSD recall shows, Congress owes the American people an answer.