Future Tense

Companies Don’t Really Want You to Read Their Terms of Service

As the uproar over Unroll.me shows, being opaque is part of their business model.

A man uses his smartphone as he crosses a street during a winter storm in New York on March 5, 2015. Jewel Samad/AFP/Getty Images

Every time we install a new app or use a new web service, we make a decision about what kind of data we’re willing to provide in return. And companies have important decisions to make, too: They have to determine how clear they will be when they inform you about the ways they’ll use your data to make money.

On April 23, a story in the New York Times about the founder of Uber broke another story in passing. According to the Times, Uber purchased data about its competitor, Lyft, that was obtained via the web service Unroll.me. Unroll.me solves a pretty common problem: being subscribed to a bunch of email lists that you’ve completely forgotten about. When you use Unroll.me, it crawls through your email looking for those pesky subscriptions, gives you a list of them, and lets you unsubscribe from each at the click of a button. It’s a useful service. It’s also free.

As the saying goes, if you aren’t paying for a product, you are the product. This trade-off most often takes the form of monetization through advertisement. Companies like Facebook, Google, and Amazon want to know as much about you as possible so that they can serve you increasingly targeted advertisements. This is their business model, and most people have a basic understanding now of how companies with free products make money.

However, Unroll.me sold user data for a different purpose. The client was Uber, which (particularly in light of a string of bad press) had reason to want to keep tabs on its major competitor. According to the New York Times, Unroll.me found Lyft receipts in its users’ email and then sold the anonymized information to Uber as a way to evaluate Lyft’s business health.

The backlash has been harsh. Critics called Unroll.me’s move “creepy,” “sneaky,” and “dishonest,” and many users vowed to delete their accounts. The CEO apologized, to little avail. But as Perri Chase, a co-founder of Unroll.me who is no longer with the company, pointed out in an impassioned defense, the company was perfectly within its rights to do what it did. The warning was right there in its privacy policy, she noted, which reads in part: “We may collect, use, transfer, sell and disclose non-personal information for any purpose.” Hadn’t users read the terms of service? Chase asked: “You opt in for an awesome free product that clearly states the following and you are offended and surprised? Really?”

The CEO’s apology comes down to we’re sorry you’re upset, noting that “while we try our best to be open about our business model, recent customer feedback tells me we weren’t explicit enough.” He points out that users agree they have read and understand their policies before they use the site—but concedes that most people probably don’t have the time to thoroughly review them.

He’s right. Research has shown that it would take the average internet user about 200 to 300 hours each year to read the privacy policies of every website they visit—and this isn’t even accounting for how often sites update them. This potential time spent reading represents a national opportunity cost of hundreds of billions of dollars.

Unroll.me’s terms of service and its privacy policy are pretty typical: They are long, collectively nearly 5,000 words, and difficult to understand. My analysis found they were written at the reading level of a high-school senior. (It could be worse—in some of my research on TOS, I found an average reading level of 14.8, or a college sophomore, across 30 different websites. But considering that the average reading level in the U.S. is eighth grade, it’s not great, either.)

I don’t believe Unroll.me did something intentionally shady. I think it followed the crowd. However, I also think that it misjudged where an appropriate line would be. Though companies like Facebook often require that users grant “transferable” licenses in content or data, it makes far more sense for them to keep that data for themselves—to aggregate it, categorize it, and sell ads that target certain kinds of people. So though most people now have a decent model of how companies use their data to make money through advertising, having data sold directly to third parties comes as more of a surprise. However, as this situation with Unroll.me demonstrates, it happens as well.

Meanwhile, companies are well aware—and very likely count on the fact—that their customers don’t know how their data is being used. If they splashed WE SELL YOUR DATA all over their webpage instead of hiding it in fine print in a privacy policy, they would lose users. Being opaque is part of their business model, and being highly ethical when it comes to disclosure is bad for business. After all, when the status quo is for users not to care about (or even think about) data privacy until something hits the news, it really does make sense for businesses to operate this way. The downside, of course, is the potential PR disaster. But for every company that gets “caught” like Unroll.me, how many are engaging in the exact same practices without their users being aware?

People claim to care about data privacy. The outcry in reaction to this situation reinforces that. But in general, we mostly don’t seem to care enough to read the terms of service, to try to find out how our data is being used, or to refuse to sign up for useful apps that in the fine print require blanket permissions. We’re happier just checking the box and hoping that companies will avoid exploiting the permissions we’ve given in any way that would upset us or embarrass them.

The solution is surely not to start spending 300 hours every year reading privacy policies. But it may well be to accept the idea that selling data is part of how the free app economy works—and then to push companies to disclose their data practices in a clear way, outside of the fine print. We need to dial down the hysteria about data use (Facebook manipulates your news feeds! Instagram can serve ads with your content!), which only encourages companies to try even harder not to get “caught.”

Instead, we need to ask them to explain what they are doing and why, and how it is essential for their business model. If we want the services we use to be thoughtful about how they use our data, we should be thoughtful about how they use it, too.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.