Consider, for a moment, the following list: Republican. Abortion. Democrat. Future. Afghanistan. Health care. Same-sex marriage.
There is an enormous amount of information reflected in the way you just read that list. Did your eyes pause for a fraction of a second on certain words? Did your pupils dilate, ever so slightly, at any point while you were reading the list? Did your eyes blink at a different rate for some words? Did you backtrack to reread any words, and if so, which ones, when, and for how long?
Eye-tracking, which uses images from one or more cameras to capture changes in the movements and structure of our eyes, can measure all of these things with pinpoint accuracy. There are many benevolent applications for eye-tracking, most notably in providing disabled people with a way to interact with objects on a screen. But recent advances are taking the technology into the mainstream, with the biggest initial applications likely to be in user interfaces and gaming. Apple, for example, has filed a patent application for a three-dimensional, eye-tracking user interface, and the European company Senseye aims to have its eye-tracking software built into smartphones next year. As eye-tracking is deployed more widely in laptops, tablets, and smartphones in the coming years, it will open a new front in the fractious digital privacy debate.
Today, eye-tracking isn’t quite ready for mass-market adoption. The computations required tax even the advanced computer chips found in many current-generation consumer devices. In addition, not all of these devices have “front-facing” cameras that can capture images of a user’s eyes. But these obstacles are vanishing as mobile phones and other devices become increasingly powerful. While today’s laptops and tablets might have trouble performing eye-tracking computations, those of 2015 will be able to do so with ease. And they will almost all have front-facing cameras.
Once the technology for eye-tracking is in place, it will glean information conveying not only what we read online, but also how we read it. Did our eyes linger for a few seconds on an advertisement that, in the end, we decided not to click on? How do our eyes move as they take in the contents of a page? Are there certain words, phrases, or topics that we appear to prefer or avoid? In the future, will we be served online ads based not only on what we’ve shopped for, but also on the thoughts reflected in our eye movements?
This information will be collected, analyzed, and resold to hundreds of companies—advertisers, data analytics providers, and others—across the digital ecosystem in what the industry calls the “mobile marketing value chain.” In theory, these will be anonymous, “nonpersonal” data. But in practice, the anonymity will be easy to penetrate. For example, eye-tracking data collected from tablets and smartphones will be tied to a “unique device identifier” associated with one specific device. These data will also be correlated with accurate location-tracking information, often to the precision of a specific home or commercial building.
If we have learned anything from the steady drumbeat of revelations about data collected without our consent—think Carrier IQ in November, Android in December, and Google, Twitter, Apple, and Android last month—it is that these stories tend to follow a predictable pattern: After a few days of headlines, calls for congressional or FTC investigations, and damage-control statements from company representatives, attention shifts elsewhere. For each data collection leak that gets identified and plugged, there are probably dozens more waiting to be discovered. It is an environment in which asking forgiveness, not permission, has proven to be a highly successful business strategy.
The overwhelming majority of the time, no one will be interested in putting all of this information together. But if someone does want to identify us by name, study our eye movements, and try to gauge what we, as individuals, were thinking as we viewed digital content, all of the necessary data will be readily available.
We also have to recognize the law-enforcement and security applications for eye-tracking. Researchers in the United States and the United Kingdom have mapped the correlation between blink rates, pupil dilation, and deception. The Department of Homeland Security has been developing a “pre-crime” program aimed at identifying criminals before they act. The DHS program, known as Future Attribute Screening Technology, is designed to analyze images acquired at airport security checkpoints to measure eye movement, position, and gaze (as well as heart rate, respiration, and facial expression) to identify behavior deemed suspicious.
Of course, it’s tempting to think that there’s a very low-tech solution to unwanted eye-tracking performed by our personal electronic devices: put a piece of masking tape over the camera. For today’s devices, that would do the trick. But that may not be an option in the future. As evidenced by an Apple patent application, future display screens could include thousands of tiny imaging sensors built into the screen itself.
Today, when we read something online, our thoughts are still our own. We should enjoy it while it lasts.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.