Two men died near Houston, Texas, on Saturday while riding in a 2019 Tesla Model S that, according to local authorities, sped into a turn, left the road, and crashed into a tree. It took first responders four hours and more than 30,000 gallons of water to put out the resulting fire, which kept reigniting; when damaged, the lithium-ion batteries in electric cars can cause fires that are very difficult to extinguish because of how they store energy. Authorities reportedly attempted to ask Tesla for advice on how to put out the fire, but it’s unclear whether they ended up getting any help.
Besides the fire, there was something especially disturbing about the crash: No one was in the driver’s seat. One of the men was in the passenger seat and the other in the rear. There has been no official confirmation that they had engaged the car’s “autopilot” feature—something that is beloved by many Tesla owners and that critics of the company say is marketed in unsafe ways—though their wives reportedly told authorities that they had been discussing the feature before leaving for their drive.
Unlike the systems that companies like Google and Uber have been developing and testing for years, Tesla’s autopilot is not completely autonomous, though it does allow the cars to accelerate, steer, and brake by themselves. The company warns drivers that they need to be behind the wheel and paying attention in case the system makes a mistake and they have to intervene. There have nevertheless been a number of serious accidents in which the autopilot feature appeared to be involved. For instance, a man died in 2018 while using the autopilot in California; data from his phone indicates that a video game had been active on the device. The National Highway Traffic Safety Administration also confirmed to the New York Times in March that it is currently investigating the potential role of Tesla’s autopilot feature in 23 crashes. Because the autopilot feature is not supposed to totally take over for a driver, Teslas have a number of safety mechanisms meant to ensure that their drivers are present and alert while in the car. (Meanwhile, critics say that simply calling the feature “autopilot” is a dangerous misnomer, and that the company could add other safety measures, such as eye tracking.) The autopilot checks that someone is in the driver’s seat, that the seatbelt is buckled, and that a hand is on the steering wheel. So how could someone be riding in a Tesla without anyone behind the wheel?
While the exact circumstances of the Texas crash still aren’t public, some Tesla users have been known to film themselves riding in their cars without a driver, mainly in a foolish bid for social media clout. One particularly controversial video from last October shows a man filming the empty driver’s seat of his Tesla as it cruises down the highway. A TikTok star also posted a clip of himself in January sleeping in the back of his Tesla with no one driving. Do a search on YouTube or Google and you’ll find lots of other examples. These videos tend to receive harsh backlash from other Tesla owners, who point out that they set a dangerous example. While it’s not always clear whether deceptive editing is creating the illusion that no one is minding the wheel, the videos sometimes reveal exactly how these would-be stuntpeople are bypassing the safety mechanisms. One simple workaround is to buckle the seatbelt without fastening anyone behind it.
Another hack involves using a device that tricks the autopilot system into thinking that someone’s hand is on the wheel. Normally, if a Tesla doesn’t sense any pressure being exerted on the steering wheel, the car’s display will start flashing and the car will emit audible warnings. If the driver still doesn’t grab the wheel, the autopilot will disable itself. Multiple companies, however, sell magnetic weights that people can clip onto the wheel to simulate that pressure. The NHTSA has previously issued cease-and-desist orders to companies that make such products, though they’re still easy to find on Amazon.
There also seem to be ways to bypass the sensor that the car uses to ensure that someone is in the driver’s seat. Jalopnik reported that the driver’s seat appears to use a simple on/off sensor rather than a pressure sensor that continuously checks for weight on the seat, which could make it fairly easy to trick. A video on the YouTube channel Dirty Tesla, which tests the autopilot in a parking lot, provides evidence for that theory. In the video, you can see an owner engaging autopilot while he’s still in the seat. Once the autopilot is on, however, it seems to stop checking for the driver’s weight, as he’s able to get out of the seat without disabling it. That, combined with the seatbelt workaround, allows the Tesla to drive on autopilot even after he’s left the car.
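Taken together, the reporting above suggests a rough picture of how the presence checks fit together: seat occupancy is apparently verified only when autopilot is engaged, the seatbelt check only looks at whether the buckle is latched, and the wheel check senses pressure that a clip-on weight can fake. Here is a minimal sketch of that logic in Python; every class, name, and threshold is invented for illustration, and this is in no way Tesla’s actual code.

```python
# Illustrative sketch of the driver-presence checks described in the
# reporting above. All names and thresholds are hypothetical; this is
# not Tesla's implementation.

class AutopilotPresenceCheck:
    def __init__(self, car):
        self.car = car
        self.warning_seconds = 0

    def can_engage(self):
        # Per the Jalopnik report, seat occupancy appears to be a simple
        # on/off check made at engagement time, not continuously.
        return self.car.seat_occupied and self.car.seatbelt_latched

    def tick(self):
        # The seatbelt check only sees the latch, so a belt buckled
        # behind an empty seat still passes.
        if not self.car.seatbelt_latched:
            return "disengage"
        # The wheel check senses pressure/torque, which a clip-on
        # magnetic weight can simulate.
        if self.car.wheel_pressure_detected:
            self.warning_seconds = 0
            return "ok"
        # No hand detected: escalate from warnings to disengagement.
        self.warning_seconds += 1
        if self.warning_seconds > 10:   # invented threshold
            return "disengage"
        return "warn"  # flashing display and audible alerts
```

The sketch makes the loophole visible: because `seat_occupied` is never rechecked after engagement, leaving the seat with the belt still buckled and a weight on the wheel keeps every per-tick check passing.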
Tesla, which dissolved its press shop, has not commented on the incident. On Saturday, before the crash, CEO Elon Musk retweeted a company safety report and boasted, “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.” The NHTSA and the National Transportation Safety Board said on Monday that they are sending investigators to the Houston crash site, in part to determine whether autopilot had been engaged.
I’ve been writing about technology at Slate for almost four years now, most recently covering the pandemic’s impact on the labor that fuels the industry. Thanks to the support of Slate Plus members, I’ve been able to document how the coronavirus has galvanized the workforce at Big Tech companies to seek better protections and employment benefits by forming unions. Subscribing to Slate Plus ensures that I can continue to cover this critical moment in labor rights. —Aaron Mak, staff writer