Future Tense

Don’t Give Up on Your Digital Privacy Yet

A federal data privacy law is still important. Here’s why.

Adapted from Terms of Disservice: How Silicon Valley Is Destructive by Design, by Dipayan Ghosh, published by Brookings Institution Press.

Some have become deeply pessimistic about the internet industry’s data collection practices and believe that privacy is gone forever. Indeed, Silicon Valley’s dominant technology platforms have engaged in mass surveillance as a matter of business, a topic that will no doubt come up when the CEOs of Facebook, Google, Amazon, and Apple testify before the House Judiciary Committee later this month. There seems to be no end to their data harvesting, and these firms have no incentive to delete the personal information they have already accumulated. The sophisticated inferences they have developed about us are apparently here to stay, because they contribute to the high margins enjoyed across the sector.

Meanwhile, a seemingly endless number of breaches have occurred: Equifax, Yahoo, Myspace, Target, Facebook, Marriott, Wyndham, the U.S. Office of Personnel Management, the Internal Revenue Service, and others have experienced debilitating hacks and exposures of personal information, in most cases to unknown but doubtless nefarious parties. We have nothing close to recourse; we lack even a federal law that protects us from this kind of activity. And while the United States has 50 state data-breach-notification laws, the financial, telecommunications, and technology industries have fought as a unified front to ensure that a meaningful federal data-breach-notification law, one that would afford consumers rights against an invasive private sector, will not pass unless it is entirely on their terms.

Given the steady drumbeat of security breaches, some have argued that data leakage to the private sector is already so extensive, and companies’ failure to protect consumer data so thorough, that a federal privacy law returning access, control, and choice to the consumer would not reduce the dissemination of private data. Amazon’s former chief scientist Andreas Weigend, for instance, has opined: “I have realized that even if you were a privacy zealot, you don’t have a chance. Data are being created as we breathe, as we live, and it is too hard a battle to try to live without creating data. And that is a starting point: that you assume that we do live in a post-privacy economy.” Even the TV show South Park aired an episode in 2014 in which Cartman claims privacy is gone (albeit mostly in reference to drones, which threaten physical privacy).

But this argument ignores an important fact: Your behavioral data are temporally sensitive. The data describing your behaviors today differ from the data that described your behaviors a month ago, and progressively more so from data of a year ago, two years ago, and 10 years ago. As behavioral data get older, they become less accurate in describing your true self and, therefore, less valuable to the internet companies. Back when you were in college, you might have had entirely different tastes, and those tastes determined your spending habits in the market. It is the real-time, currently relevant knowledge of those habits that matters most to companies like Amazon, not information about your past behaviors. If you just watched the end credits of Curb Your Enthusiasm, maybe you would consider buying the entire Blu-ray disc set, purchasing Larry David’s white sneakers, or watching Seinfeld on Hulu or YouTube next. If you were spotted yesterday checking out the health food aisle at Whole Foods and gym shorts at Lululemon, then marketers at Nike, SoulCycle, Equinox, Spotify, and Brooklyn Boulders should automatically be reaching out to see if you might be interested in signing up for their related services.

It is far less interesting for these companies to know that you were spotted in that aisle a year ago; your interests might have changed, you might have taken on a more demanding job, or you might have skipped town altogether. As such, your purchasing behaviors in the near future do not reflect your spending of a year ago as accurately as they reflect your spending of yesterday. This temporal disparity is precisely why marketers give recently gathered information about you far greater mathematical weight than older data about your behaviors. It is also why a firm such as Nielsen, in its current evolution, is essentially the world’s most established data broker, designing technologies that attempt to infer real-time behaviors seamlessly through analysis of consumer actions in the marketplace. There is a whole field of marketing science devoted to real-time insight development, including sentiment analysis, which modern firms practice using advanced machine learning techniques.
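
To make that weighting concrete, here is a minimal Python sketch of one common approach, exponential time decay. The 30-day half-life and the browsing “signals” are illustrative assumptions, not any particular company’s model.

```python
import time

def recency_weight(event_ts: float, now: float, half_life_days: float = 30.0) -> float:
    """Exponential time decay: a signal loses half its weight every
    `half_life_days` days (the half-life here is a hypothetical choice)."""
    age_days = (now - event_ts) / 86_400  # seconds -> days
    return 0.5 ** (age_days / half_life_days)

def interest_score(events: list[tuple[float, float]], now: float) -> float:
    """Combine (timestamp, signal strength) pairs into a single
    recency-weighted interest score."""
    return sum(strength * recency_weight(ts, now) for ts, strength in events)

now = time.time()
DAY = 86_400
events = [(now - 1 * DAY, 1.0),    # browsed gym shorts yesterday
          (now - 365 * DAY, 1.0)]  # identical signal, but a year old
print(recency_weight(now - 1 * DAY, now))    # ~0.98: counts almost fully
print(recency_weight(now - 365 * DAY, now))  # ~0.0002: effectively ignored
print(interest_score(events, now))           # dominated by the recent signal
```

Under these assumed parameters, yesterday’s signal carries roughly 4,500 times the weight of the same signal observed a year ago, which is exactly the disparity the marketers’ weighting exploits.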

This analysis indicates one thing: If we could pass a meaningful privacy law today, it would have an immediate effect on your life. You could opt in to or out of these kinds of data collection programs and the inferences built atop them. And as such, you could have vastly more power in the face of the digital behemoths overnight.

In order for things to change, we have to accept that we live in a cold new world in which the industry’s modus operandi is to collect as much data as it can, largely because the marginal technological cost of collecting it is minimal; we now have television shows that poke fun at the ease of setting up a server in your apartment’s storage room and running a fledgling startup out of it. The asymmetry of knowledge about data collection, analysis, and monetization, meanwhile, has left consumers well behind the industry. We are, every one of us, economically exploited; the industry’s goal is to enter our minds and move our psychology to the point where our market actions are influenced in its commercial favor. The most striking feature of this terrible new circumstance is that this algorithmic machine has matured organically. It is not the result of a concerted business plan but rather an experimentally and empirically evolved animal, trained to identify opportunities for economic arbitrage in the novel industry of manipulated communication.

Technical solutions might yet be developed to overturn this new regime of mind control. A burgeoning academic field called privacy engineering is forging new advances in the technical protection of privacy and security, and some fascinating applications are beginning to emerge from it. Fully homomorphic encryption, federated learning, differential privacy, and other privacy-enhancing technologies may in combination one day forge a path toward a privacy-aware future in which we enjoy all the functionality of today’s privacy-invasive technologies while protecting our individual privacy.
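
As an illustration of how one of these techniques works, here is a minimal sketch of differential privacy’s Laplace mechanism in Python with NumPy. The aisle-visit scenario, the dataset, and the epsilon value are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng()

def dp_count(values: np.ndarray, epsilon: float) -> float:
    """Differentially private count via the Laplace mechanism.
    A counting query has sensitivity 1 (one person joining or leaving
    changes it by at most 1), so Laplace noise with scale 1/epsilon
    yields epsilon-differential privacy."""
    return float(values.sum() + rng.laplace(loc=0.0, scale=1.0 / epsilon))

# 1,000 hypothetical shoppers: did each one visit the health food aisle?
visits = rng.random(1_000) < 0.3
print(int(visits.sum()))      # exact count: each query leaks a little about individuals
print(dp_count(visits, 0.5))  # noisy count: useful in aggregate, but no single
                              # shopper's visit can be inferred from the answer
```

The aggregate stays accurate enough for analytics while the noise masks any one person’s contribution; federated learning and homomorphic encryption make analogous trade-offs at the level of model training and of computing directly on encrypted data.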

Indeed, these developments represent real mathematical progress; if advanced and eventually implemented, they could change the privacy situation through technology alone, without the need for regulatory reform. But industrial incentives cut against this hope. Research, including some I have conducted with collaborators, shows that absent regulation forcing the industry to adopt privacy-preserving solutions, its incentive is simply to continue down its current path.

The problem is that there is no market for privacy, because consumers in the United States fail to collectively demand it. Yet as the industry silently moves more aggressively into the world of ubiquitous data collection, processing, and analysis, commercial entities will, with increasing frequency, make decisions that pertain to our lives simply to turn a profit. If those decisions (over what media we should see, whether we should get a loan to buy a car, what kinds of job postings we should be shown) were made fairly and exactly according to our desires, I doubt we would care that some companies were making money off our data by attempting to infer what we wish to see.

But the fact is that we have become the willing fuel for a corporate machine designed only to cater to shareholders at the expense of citizens. We are experiencing the height of the era of commercialized decision-making. So in light of uninformed consumer markets and the deviousness of an internet industry that inherently exploits citizens’ lack of understanding, we will need to create the market for privacy artificially. There is only one way to accomplish this: informed regulation, made possible by citizens engaging to demand such governmental intervention.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
