Future Tense

Why Homeland Security’s Pre-Crime Prevention Technology Is a Terrible Idea

Travelers in the security line at Los Angeles International Airport.

Photo by Kevork Djansezian/Getty Images

Maybe Minority Report’s pre-cogs can’t be made a reality.

The Department of Homeland Security has been working on a controversial program called Future Attribute Screening Technology, or FAST. According to a December 2011 privacy impact assessment (PDF), DHS’s goal for FAST is “to determine whether technology can enable the identification and interpretation of a screened subject’s physiological and behavioral cues or signatures without the need for operator-induced stimuli which, in turn, will allow for security personnel to remotely (and therefore, more safely) identify cues diagnostic of malintent (defined as the intent to cause harm).”


So FAST watches a person’s eye movements, body language, facial expressions, and so on, and determines whether he might be a security risk. More alarming, the software is also intended to detect heart rate, body temperature, and other biometric information without even touching your body. Some of this technology is already available to consumers: An iPad app lets you measure heart rate via webcam.


But over at the Atlantic, Alexander Furnas, a master’s candidate at the Oxford Internet Institute, argues that FAST will never work. In part, the problem is that the system is doomed by false positives. “[T]he results may be counter-productive as TSA and DHS staff are forced to divert their attention to weeding through the pile of falsely flagged people, instead of spending their time on more time-tested common-sense screening procedures,” he writes.

The project was field-tested last year and reportedly had a 70 percent success rate, though what “success” means here is a bit unclear. DHS has been largely quiet about it since then, aside from the aforementioned December 2011 report. But that assessment suggests DHS remains invested in FAST.
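Furnas’ false-positive worry is the classic base-rate problem, and a few lines of arithmetic make it concrete. All of the numbers below except the reported 70 percent figure are assumptions chosen purely for illustration; DHS has not published FAST’s actual error rates or screening volumes:

```python
# Illustrative base-rate arithmetic. Every number here except the
# 70 percent "success rate" is a hypothetical assumption; DHS has not
# disclosed FAST's real error rates.

travelers = 1_000_000        # assumed screening volume
attackers = 10               # assumed number of actual bad actors
true_positive_rate = 0.70    # the reported 70 percent "success rate"
false_positive_rate = 0.30   # assumed; not disclosed by DHS

caught = attackers * true_positive_rate
falsely_flagged = (travelers - attackers) * false_positive_rate

# Even with generous assumptions, innocents flagged vastly outnumber
# attackers caught, which is Furnas' point about diverted attention.
print(f"Attackers flagged: {caught:.0f}")
print(f"Innocents flagged: {falsely_flagged:.0f}")
print(f"Odds a flagged person is an attacker: 1 in {falsely_flagged / caught:.0f}")
```

Because genuine attackers are so rare relative to the screened population, even a much lower false-positive rate than the one assumed here would leave screeners sifting through thousands of wrongly flagged travelers for each real threat.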

Read more on the Atlantic.