Today, Politico Magazine published a tell-all exposé of the Transportation Security Administration by a former TSA officer. The article ridicules airport security, depicting scanners as futile and humiliating. But if you read between the lines, you can learn useful lessons about how to craft and deploy technology that respects privacy and protects us from the worst in human nature. Surveillance technology isn’t evil. It just has to be well-designed. In fact, what we need is more of it, not less.
The author portrays TSA as a bureaucracy of fools, bullies, and lechers. Some of his aspersions are demonstrably unfair. He derides the agency for making him “confiscate nail clippers,” neglecting to mention that TSA has tried to shed its rules against sharp objects, even to the point of allowing knives on planes—and that it has to enforce these rules only because Congress insists. He also insinuates that the amount of radiation emitted by the scanners is dangerous. He says TSA told its employees “we would just have to take their word for it” that the levels were acceptable. Again, he omits the rest of the story: TSA has provided peer-reviewed evidence that the radiation levels are safe.
The one former officer who clearly comes across as reckless, cynical, and prurient is the author. He offers no evidence for his insinuations about the radiation. He tells puerile jokes, revels in tales of ogling, explains how to sneak a gun through the scanners, and mocks any TSA employee who “believes his or her job is a matter of national security.”
It’s tempting to dismiss the whole article as untrustworthy. But that would be a mistake. The fact that TSA employed such a person for five years—like the fact that the National Security Agency made its files accessible to Edward Snowden, whom it now depicts as a cunning schemer—underscores the importance of designing systems to protect us from such people. That’s the operating principle that has kept this country free for 238 years: Men aren’t angels, and our government has to be organized accordingly, to thwart the worst in human nature. In the case of airport security, this TSA memoir shows that our surveillance systems already incorporate some features that impede abuse—and that they need more.
The author jokes about slang terms used by fellow male officers to describe attractive women. He describes an “Image Operator” (I.O.) room where officers, staring at scanner images on screens, saw passengers’ most intimate contours:
Many of the images we gawked at were of overweight people, their every fold and dimple on full awful display. Piercings of every kind were visible. Women who’d had mastectomies were easy to discern—their chests showed up on our screens as dull, pixelated regions.
But what’s just as notable (though glossed over because it doesn’t serve the author’s argument) is how this recipe for abuse—prurient viewers, revealing images—was disrupted by the design of the technology. Notice what the author doesn’t describe: the passenger’s face. That’s because the system was engineered to blur this part of the body. The officer who sees you on the monitor never sees you in the flesh, and the officer who sees you in the flesh never sees you on the monitor. So the one who sees you naked never knows who you are, and the one who knows who you are never sees you naked.
If you don’t believe it, check out the scanner images posted with the article. The passenger’s head looks like a bowling pin dimly visible through fog. The author reports, “One of us in the I.O. room would occasionally identify a passenger as female, only to have the officers out on the checkpoint floor radio back that it was actually a man.” He tells this story to make the system look sordid. But what it really shows is how blind the image analysts were to the passenger’s identity.
The author also complains that “the I.O. room at O’Hare had a bank of monitors, each with a disabled keyboard.” He ridicules the disabled keyboard, saying it “perfectly summed up my relationship with the TSA,” and he jokes that to relieve his boredom, “I phantom-typed passages on the dumb keys: Shakespeare and Nabokov and Baudelaire.” But what’s dumb, if not dishonest, is the author’s failure to note that there’s a good reason to disable such keyboards: to fulfill TSA’s pledge that the viewing officer “cannot store, print, transmit or save the image.”
The author does, however, identify one loophole in the abuse-prevention system. The I.O. room, he reports,
was the one place in the airport free of surveillance cameras, since the TSA had assured the public that no nude images of passengers would be stored on any recording device, closed circuit cameras included. …
All the old, crass stereotypes about race and genitalia size thrived on our secure government radio channels. There were other types of bad behavior in the I.O. room—I personally witnessed quite a bit of fooling around, in every sense of the phrase. Officers who were dating often conspired to get assigned to the I.O. room at the same time, where they analyzed the nude images with one eye apiece, at best.
In other words, the place in the airport where abuse was most likely to happen was the one place that wasn’t under surveillance. TSA, in the name of privacy, had failed to surveil itself. The key to preventing abuse isn’t less monitoring, but more.
If you don’t believe it, read the author’s account of an earlier incident:
In 2009, one of my friends had run her male colleague through a carry-on X-ray machine. (It was a slow night.) When management happened upon video footage of the episode, they were both fired.
Good. The system worked. Now let’s put those cameras in the I.O. room. They don’t need to face the monitors. They need to face the officers. To protect the public, the police must be policed, and the watchers must be watched.