Joshua Brown, 40, believed in the power of engineering. He was a former Navy SEAL, a technology consultant, and a Tesla fan. He had posted YouTube videos of himself driving a Tesla Model S on autopilot, taking his hands off the wheel to show how the car could avoid a collision on its own. He had nicknamed his car “Tessy.”
On May 7, Brown and Tessy were cruising down a Florida highway on autopilot when they both failed to notice a big-rig truck making a left turn across traffic in front of them. The truck’s driver told the Associated Press that Brown’s car “went so fast through my trailer I didn’t see him.” A quarter of a mile later, what was left of the car and Brown came to an abrupt stop against a telephone pole.
The driver told the AP that, after the crash, he could hear a distinct sound coming from the car, the roof of which had been sheared off by the trailer. The car’s entertainment system, he said, seemed to be blaring a Harry Potter movie.
The driver acknowledged he could only hear the sound of the movie, not see it. And Tesla was quick to point out that its in-dash touchscreen is disabled from playing video, for exactly the reasons you’d guess (although hackers have found ways around that). But a Florida Highway Patrol sergeant told Reuters on Friday that a portable DVD player was found in the car.
This matters, not so much because it suggests that Brown may have been behaving recklessly—that’s a question for the legal system to sort out—but because it seems to corroborate skeptics’ fears about the perils of an autopilot system.
Tesla insists that its car’s ability to accelerate, brake, and steer on its own does not affect the human driver’s responsibility to pay full attention to the road at all times. But it certainly seems to affect people’s will to do so. And it’s fair to ask: If the autopilot system isn’t intended to reduce the driver’s mental load, why is it there at all?
Tesla will tell you it’s a safety feature: a second pair of eyes on the road, a second foot at the brake pedal. But if it means fewer actual human eyes on the road, then it’s going to be a tough sell to regulators unless the technology is nearly flawless.
Tesla says its autopilot system navigated 130 million miles of road before its first fatal accident, which works out to a lower fatality rate per mile than that of conventional cars. Realistically, it's going to need to go a lot more than 130 million miles before the next death in order to satisfy regulators and the public that autopilot systems are trustworthy. The head of the National Highway Traffic Safety Administration, which is investigating the autopilot's performance in the collision, has called for such systems to demonstrate that they're at least twice as safe as human drivers.
In fact, some autopilot systems might already be safer than Tesla’s. Google’s self-driving cars have more sophisticated (and pricier) sensor arrays than the Model S, because they’re designed to be entirely autonomous. But they’re likely to take a PR hit too if Tesla’s system gets a bad rap.
We may never know exactly what Brown was doing, or how much attention he was paying to the road, when that truck turned in front of him. And a lawyer for Brown’s family, Paul Grieco, told me there are no plans to take legal action against Tesla until authorities have completed their investigation. In the meantime, Brown’s family does not seem intent on crusading against Tesla or self-driving cars. In a statement Friday, they said:
In honor of Josh’s life and passion for technological advancement, the Brown family is committed to cooperating in these efforts and hopes that information learned from this tragedy will trigger further innovation which enhances the safety of everyone on the roadways.
That’s a generous sentiment, one that seems to be in keeping with Brown’s own faith in the power of technology. In this tragic instance, however, his faith in one particular technology appears to have been a little greater than it deserved.