“Tesla With Autopilot Hits Cop Car—Driver Admits He Was Watching a Movie.” The headline from August was riveting—and easy for readers to dismiss as something that could never happen to them. While an unfortunate few can turn anything you hand them into an implement of disaster, most of us possess the common sense not to do anything so reckless while driving new automation-equipped cars.
At least, we think we do. But shrug off that headline at your own peril. The way we are designing our new car technology is upending some of our most trusted commonsense understandings—the ones we rely on to keep us safe. It can render actions that seem reasonable and safe anything but. Our new cars are quietly making it easier for anyone to become a headline.
Let’s start with the driver-assist features that operate the car’s controls while the driver plays the role of supervisor. If you decide to engage those features, the car manual clearly says to keep your hands on the wheel and your eyes on the road. A monthlong study conducted at the Insurance Institute for Highway Safety (IIHS) tells us that drivers do indeed follow the advice—at least at first. The drivers paid close attention to the road and watched their cars closely to get a sense of what the cars could and could not do. It wasn’t until toward the end of the month, after the drivers had grown familiar with their cars, that they began to allow their attention to drift, and then only in free-flowing traffic along limited-access highways.
It all sounds reasonable, and that’s where the problems begin.
Drivers know that it’s rare for things to go wrong along uncomplicated stretches of highway, but few realize that these rare events are exactly what our car technology struggles with the most. A pop-up pedestrian or an overturned tractor-trailer in the middle of the road can be surprisingly difficult for even the most sophisticated computer vision system to handle. Artificial intelligence experts understand this, but how do you explain to a billion drivers worldwide that what seems easy to them is hard for a computer, and what seems hard is easy? It is common sense turned on its head. Drivers who want to use driver-assist features have to grok this backward concept and work it into their everyday routines. And evolution and lifelong conditioning are not working in our favor here. When a human central nervous system encounters an open, tree-lined stretch of highway, it doesn’t see danger; it hears a string quartet. How are we going to rewire that reaction in billions of human brains? The sentence in the glovebox reference manual doesn’t seem to be doing the trick.
Next, consider the rearview cameras that are now mandatory in new cars. According to another IIHS study, drivers use them. But the study revealed something troubling: with rearview cameras now available, drivers make fewer over-the-shoulder glances.
After all, what’s the point of turning your head to try to see what is now clearly displayed in front of you? Here our commonsense understanding once again leads us astray. Back-over crashes seldom amount to the driver backing over a stationary object that can now be easily spotted using a rearview camera. A back-over crash is usually a flow, an unfolding situation that often starts with a kid running from the side of the car and ends with him behind it. Few drivers realize that when we turn our heads to see what’s behind us, we spot things even when we aren’t specifically looking for them. The head turn is crucial because it’s our chance to detect that running kid in the camera’s blind spot, before he pops up on the video screen and leaves us little time to react. But how do we get drivers to understand this nuance of visual perception—and that our rearview cameras are only a supplement to the traditional head turn? Once again, there’s a sentence in the manual—but the combination of human driver and camera technology is not giving us the safety gains we anticipated. Another IIHS study found that back-over crashes have declined a modest 17 percent.
You can see shattered commonsense understanding all over the roads. Consider the intelligent platoon-based traffic signals that are being developed to help alleviate our growing traffic congestion problem. Traffic engineers know that the way to keep traffic flowing is for drivers to calmly maintain their position within the platoon of cars that surrounds them. If you’re willing to do that, the smart traffic signals will spot you and wait for you and your entire platoon to pass through the light. There is no need to try to “make the light” because, if you remain calm, the light will try to make you. All you have to do is resist any temptation to step on the gas, because if you do, you’re going to ruin everything. But what will stop our primitive minds from glancing at our wristwatch, stomping on the gas pedal, and wresting defeat from the jaws of victory?
Instructions and warnings have always been our first line of defense against product misuse.
But psychologists know that when we humans are handed something new, we don’t retire to our chambers and read the instructions. We like to get started right away doing useful work. We tend to skip the instructions and use what we already know: our common sense, our everyday understanding of how the world works. For most legacy inventions, that everyday understanding has served us well. But today these same understandings can lead us into danger.
My colleagues and I have argued for the need for driver training: our new cars are different enough that a short version of “driver’s ed” might be required for those who drive them for the first time. But will this kind of training really work? We humans are highly resistant to ideas that contradict our commonsense understandings. No matter how many experiments tell us that we are bad at “multitasking” or at noticing unusual things that pop up, the vast majority of drivers dismiss the science and continue to rack up big distracted-driving crash numbers.
Car designers need to keep in mind the everyday understandings that drivers bring behind the wheel. Science shows that drivers might not notice a truck stopped in the middle of the road while they’re watching a movie behind the wheel. Drivers’ (erroneous) commonsense understanding tells them that they will. Design has to begin with this discrepancy and either work around it or help correct it.
If we can’t convince drivers to keep turning their heads before backing up, maybe rearview camera screens could be relocated to the back of the car interior to help motivate over-the-shoulder glances (that way, to see into your blind spot, you still have to look in the direction of your blind spot). If drivers are inclined to mindlessly stomp on gas pedals, maybe we could implement a chime that sounds inside the car when passage through an upcoming platoon-based signal has been assured. Drivers might slowly be conditioned to remain calm and let the system do its thing. We already have features that alert and punish drivers who look away from the road while assisted driving features are engaged, but maybe the semi-automated driving task could be redesigned to be a more shared and active experience for the driver.
Is car technology really making us worse drivers? It is, but it doesn’t have to be this way. Technology isn’t the problem—it’s the way we are designing and using it. We should stop lambasting cars and drivers over what amounts to a problem we simply haven’t thought about deeply enough yet. Technology promises to make much-needed improvements to the current dangerous and congested state of human driving. But humans and computers will need to meet in the middle. We need to design more intuitive technology and, as individuals, begin to question some of our most closely held beliefs and understandings, because as technology advances, change runs deep.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.