Armed police couldn’t stop the shooters in Buffalo and in Uvalde. But perhaps a very small drone equipped with a Taser could. Specifically, in a Thursday announcement, Axon CEO Rick Smith proposed “non-lethal drones capable of incapacitating an active shooter in less than 60 seconds” (or so the press release goes), which would be stationed inside schools. At the push of a panic button, a trained human pilot at a control center elsewhere in the country would launch a drone. With the help of a network of security cameras, the pilot would try to fire the drone’s onboard Taser probes into the shooter’s flesh, in the hope of keeping them down until police could arrive on the scene.
Smith’s curious proposal met near-instantaneous backlash. Axon’s A.I. ethics board voiced public opposition to the move, claiming in a statement tweeted out by Axon’s official account that the company had decided to announce the Taser-drones-in-schools proposal without consulting the board in advance. (A few weeks before, it had voted that Axon not go forward with a different, considerably narrower Taser-drone use case for police.) In an interview Thursday afternoon, Smith told me he wanted to spark public discussion about the idea. Perhaps he got more of it than he bargained for. By Sunday, Smith had announced that the company was “pausing work” on the project and would be “refocusing to further engage with key constituencies.” By Monday, nine members of Axon’s A.I. ethics board had resigned, writing in a statement that the school Taser-drone announcement had led them to conclude “that after several years of work, the company has fundamentally failed to embrace the values that we have tried to instill.”
So for now, Smith’s vision of a near-future of Taser drones nonlethally zapping school shooters has faded. But what might prompt a police tech CEO to float the idea of flying Tasers in the classroom in the first place?
Smith and Axon have a long and storied history in the world of police technology. Smith sold the first Taser to Florida police in 1998; it’s now a ubiquitous tool in police departments around the world. By 2009, his publicly traded company had also moved into the police body camera market and swiftly became the biggest supplier of the devices in the country. In 2017, that shift prompted the company to change its name from Taser International to the more ambiguous-sounding Axon. In 2018, Axon announced a collaboration with Chinese drone-making giant DJI, through which Axon would sell DJI’s drones to U.S. police departments, configured in such a way that they’d work well with Axon’s data services. The company inked a similar deal with U.S.-based drone-maker Skydio in 2021.
Perhaps it was only fitting that Smith would want to combine three of his core products into one Taser-drone package: a single object that would both make money and represent his dreams of a world where nobody ever gets killed. And while the school-shooting Taser drone announcement was new, Smith claims he had the general idea for it years ago, not long after the horrifying elementary school shooting at Sandy Hook. (Axon isn’t even the first company to throw around the idea of putting a Taser on a drone, as illustrated by a 2018 video from the startup Brinc in which a drone with a Taser-like device attacks a Latino man at the Mexico border.)
Smith is a loud proponent of using nonlethal technological solutions to counter threats to police, to schools, to governments. He outlines these views in detail in his 2019 self-published graphic novel, The End of Killing, which combines his philosophical take on the matter with two hypothetical scenarios in which Taser drones feature. According to the graphic novel, “killing is a technology problem,” and Axon-made devices like Tasers and Tasers that fit on drones are the solution.
In one graphic novel scenario, an improbably buff workplace shooter is (literally) shocked into submission by a miniature Taser drone that emerges from a smoke detector–like nest on the ceiling. In a second and even more fanciful scenario, set in Syria in 2045, an ISIS-like masked man with a scimitar is prevented from executing an innocent man by a fleet of small Taser-equipped drones, which U.S. intelligence services had tasked with watching the area. After a human approves the action, the drone tases the swordsman into submission. Then a “human transport drone” equipped with a large grabbing arm scoops him up and flies away with him. At the detention facility where he’s deposited, he’s fitted with a mind-scanning helmet that literally reads his memories, looking for incriminating information. “Thousands of people will be processed in the same manner,” Smith confidently asserts. “Some of them will be determined to have done nothing wrong; they will be released and given $10,000 along with a sincere apology.”
To me, Taser drones, mind-reading helmets, and robotic grabber arms that yoink criminals into the sky are solutions to violence that sound just about as horrifying and subject to massive abuse as the solutions we have today. From Smith’s point of view, Taser drones, robotic grabber-arms, and Clockwork Orange–esque mind-reading helmets beat the alternatives of waterboarding and shooting people. I’ve been covering techno-utopian solutions to complex and thorny human problems for a while now, so this gap in perspective both depresses me and feels awfully familiar. Smith’s sensibilities help explain why he feels that the benefits of rolling out Taser drones in schools outweigh the potential ethical and practical risks. Of which there are many.
While Axon has pitched the Taser drone as a solution for stopping school shootings, it seems apparent that such a system wouldn’t only be used for cases with a suspected mass shooter. If an expensive and high-tech security tool exists, it’s likely that it’ll get used even for situations that don’t rise to the level of a suspect school shooter prowling the hallways. “Once people invest in these things, they only double and triple down,” says technology ethics expert Chris Gilliard. “They never say, ‘Hey, this wasn’t a good idea.’ ”
And we know massive inequities exist when it comes to who gets punished for what in American schools. Per recent ACLU data, in schools with police, Black students are arrested at anywhere from three to eight times the rate of white students, while disabled students are arrested between 2.9 and 10 times more often than their nondisabled counterparts. Schools that are poorer and less white are also considerably more likely to have a police presence (and, one might suspect, Taser drones) in the first place.
Tasers are also already being used against children. Consider a 2017 case where a 7-year-old with special needs was tased and handcuffed by Dallas Independent School District Police for “banging his head against a wall in class.” If the Taser drones do get developed, it seems well-nigh inevitable that they’ll be deployed more often against poor kids, nonwhite kids, and kids with disabilities. But they won’t stop there. “Once built, one can imagine these drones will not be limited to special use cases of mass shootings but will be deployed in a whole range of situations (including perhaps protests against police brutality),” law and police technology expert Andrew Ferguson told me in an email.
When I asked Smith about these equity issues, he reiterated that the drones would be flown by professional pilots in 24-hour command centers (which sound quite similar to the ground control centers used by U.S. military drones) who would only spring into action when alerted by an app controlled by people in a school. Smith believes this would create “100 percent accountability” with clear records of who did what and why. Since the drone pilots wouldn’t physically be in danger, or even be in the room, they’d be able to make calmer decisions about the use of force than would armed police on the scene.
Perhaps. But in emotional and chaotic situations, from in-classroom brawls to mass-shooter alerts, it’s hard to imagine that there will be much time for calm debate in the seconds between when a pilot gets an alert and when they’re thrown into the process of trying to zap an alleged miscreant with Taser probes from a tiny flying helicopter. Nor do we know whether better documentation would actually reduce the risk of Taser drones being used in racist or unfair ways. The impact of body cameras on police accountability and racial bias remains decidedly mixed.
Other, more technical, practicalities of how the hypothetical Taser drone would work remain mostly unsettled. A classroom Taser drone connected to a network of school cameras would be an immensely attractive target for hackers. While Smith says the system would use “focused centralized gating functions” to protect itself, he argues that even a successful hack would be less deadly than an attack carried out with guns, amounting more to “mischief” than to tragedy. (It is worth noting here that people do die after being hit with Tasers—including at least 49 deaths in 2018 alone—and a disproportionate number of the dead were Black.)
Furthermore, while the shooters in the graphic novel all happen to have conveniently exposed patches of skin for the Taser probes to hook into, Smith admits that in the real world, “clothing penetration is historically our biggest nemesis,” with the probes connecting only about 70 to 80 percent of the time. The remote human pilot of the hypothetical Taser drone might use an A.I. tool to better target the probes, although Smith hastens to note that it wouldn’t require anything as sophisticated as facial recognition—and he said that in tests of the device, the Taser probes, when fired off a drone, had “very little recoil.”
Since drones can’t go through walls, and since it would be terrifically expensive to install a Taser drone in every lockable room, Smith imagines school buildings that might include “small portals, effectively a slot in the wall or on top of the door.” That made me picture a few tiny drones regularly patrolling the classrooms on a controlled circuit, passing through holes in the walls, buzzing over the heads of little kids learning what the color blue is. I also imagined the potential psychological impact of being reminded on a regular basis that your classroom contains, at all times, a flying robotic Taser that can shock you into submission.
“So it’s basically like that hunter-seeker from Dune,” I told Smith when I spoke to him on the phone on Thursday, recalling the minuscule yet terrifying mosquito-robot (controlled manually by a hidden pilot) that almost assassinated Paul Atreides in his bedroom in both the novel and the film. Smith laughed for a second, a bit awkwardly. “Our minds are running endlessly on dystopian sci-fi, but there’s nothing more dystopian than the real world,” he said. “I have 12-year-old twins, and watching what’s happening in Uvalde. … I can’t fathom what parents went through.”
Uvalde. That’s why Smith announced the school-shooting Taser drone with so much haste. His A.I. ethics board had voted down the idea of putting a Taser on a police drone just weeks before, but in the immediate aftermath of Uvalde, Smith says, he called the board and told it that the world had changed. “I held my wife in my arms, and we cried about ‘What if that was our kids?’ And we did some soul-searching, and I decided that I’m going to go public that I’m working on this,” he said. (In a Reddit Ask Me Anything on Friday, he brought up his children again, recalling when his then–7-year-old came home crying after an active shooter training in her classroom, where she was instructed to go hide in the back of the room among the backpacks.) Smith was feeling the same raw fear and frustration that so many Americans feel. And as is so often the case with people who work in and build tech, he wanted to apply the tools he happened to have at his fingertips to the situation.
After all, there is one regulatory tool out there, an obvious one, that America just won’t use.
But the Taser drone is “completely independent” of gun control, Smith told me. “I don’t think legislators should say ‘Axon will fix this.’ … I don’t want us to be an excuse for inaction.” Smith was picking up on the crux of the problem with technical solutions to mass shootings in America. They’re Band-Aids over massive social problems, attempts to throw more technical gadgets (all of which happen to cost a lot of money, so they’re profitable for the people who make them) and more personnel at a problem that feels politically intractable. “This is a distraction from the issue,” NYU law professor and former Axon A.I. ethics board member Barry Friedman said of the Taser drone project. “It plays into the narrative that we can fix every social problem with the police, and that’s a harmful narrative that keeps us from addressing the social problems that we have.”
“I wouldn’t be proposing this in the U.K., or Japan, or pretty much any other country,” said Smith, acknowledging the uniqueness of America’s mass shooting problem. “But ‘something has gotta change’ comes into consideration.” Nor does Smith feel confident that American lawmakers will solve the problem, he told me, citing how gun fatalities have only continued to rise since the Gun Control Act of 1968. (While this is technically true, that figure doesn’t take into account the massive growth in the U.S. population since then. The per capita rate of gun deaths in 2020 was still well below the peak we hit in 1974.)
While Smith was trying not to express his personal opinion on gun control to me directly, his perspective seemed to be: Gun control laws are unlikely to happen, and even if they do, they probably won’t help much. Which leads to a devastating conclusion. In 2022, a future where we stick Taser drones in every classroom feels much more attainable than one where the U.S. passes effective gun control laws.
Smith’s particular Taser drone scheme may be dead (for now), but it will not be the last such absurd-sounding technological proposal that U.S. businesses offer up for protecting schools from shooters. Parents scrape together cash to buy their kids bulletproof backpacks and hoodies. Big school-safety-tech vendors offer up a $400,000 hallway smoke gun (to confuse the shooter), while A.I. tech peddlers promote cameras equipped with automated “gun detection” systems and recognition systems for tracking student movements. Like Smith’s drone boondoggle, all are likely motivated by some combination of genuine sentiment and interest in making a profit. They’re efforts that, if implemented, will turn American schools—especially the schools that poorer kids tend to attend—into ever more grim and frightening places, where students will be subject to constant surveillance and the risk of arrest as they try to get an education.
Not one of these technical marvels will stop an angry person from legally purchasing a gun.