Tesla’s semi-autonomous driving technology came under renewed scrutiny on Thursday. The Washington Post first reported on an incident from earlier in September in which a Tesla Model S crashed when the owner tried to activate the “Summon” self-parking feature. Within an hour, Jalopnik reported that an over-the-air update to customers’ Model 3 cars had been disabling the autopilot functions altogether.
North Carolina IT consultant Mangesh Gururaj told the Post that his wife was trying to back out of the garage in the family’s Tesla. They had relied on “Summon” and other autonomous driving features in the past without incident, but this time around their car abruptly veered into the garage’s side wall and lost its front end. Gururaj claims that the car would have kept moving if his wife had not hit the brakes. Tesla declined to provide Gururaj with information about the crash, even though it had retrieved the car logs to investigate. “You are responsible for the operation of your vehicle even during summon mode,” the company told him in an email.
Gururaj says that he will only drive his Model S manually from now on. A group of Model 3 owners discovered this week, though, that they currently don’t even have the option to engage the autonomous features. According to Jalopnik, Tesla had been pushing out an update on Tuesday and Wednesday that was meant to make the autopilot more adept at navigating lane changes, traffic lights, stop signs, and highway ramps. One Tesla owner told Jalopnik that after the update failed, they reached out to a company representative, who said it was a “known issue.” The owner discovered the next day that all the autopilot features were disabled. Several other owners on the Tesla Motors Club forum recounted similar experiences. Tesla has reportedly been telling customers that the issue will be fixed by Friday.
Tesla’s autonomous driving technology has been the center of controversy in the past. In March, a driver died after his Model X SUV crashed into a highway barrier in California while the autopilot was engaged. The SUV caught fire and was hit by two other cars. In 2016, the occupant of a self-driving Model S died after crashing into a tractor-trailer. The National Transportation Safety Board noted that the driver did not have his hands on the wheel, as he was supposed to, but also blamed Tesla’s technology. Tesla is by no means the only company to have run into problems with its self-driving car technology. In March, one of Uber’s autonomous vehicles fatally struck a woman in Arizona while the driver was distracted.
Though autopilot crashes and glitches tend to receive high-profile media attention, Tesla and other self-driving car companies have asserted that the technology will save lives in the long run, because it promises to be more reliable than human drivers. Indeed, there were an estimated 40,100 vehicle deaths last year, most of which did not attract the level of controversy that autonomous car crashes usually do.
However, some argue that Tesla has been overstating the competence of its self-driving technology, which can be dangerous because many of the autonomous features are still in the beta-testing phase. “People get lulled into a false sense of security,” Cathy Chase, president of the Advocates for Highway and Auto Safety, told the Washington Post. “The Tesla approach is risky at best and deadly at worst.”
Tesla’s website claims, “All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.” And CEO Elon Musk himself boasted that the Summon feature would be able to steer a car across the country to meet its owner by 2018.