This piece is excerpted from Antisocial Media: How Facebook Disconnects Us and Undermines Democracy by Siva Vaidhyanathan, published by Oxford University Press.
Back in 2008 I was very careful with Facebook. I saw its hazards—and its benefits—as chiefly social. I established (and maintain to this day) a firm rule against interacting with current students on Facebook. I knew that by friending someone, I could peer into aspects of that person’s life to which I was not invited. And in those early days of social media, few people had developed norms and habits to guide their use of the service. I did not want to know whom my students were dating, what they wore to a Halloween party, or—most of all—what they thought of my courses. I was also pretty careful about what information I posted about myself.
I assumed, correctly at the time, that all Facebook could know about me was what I chose to reveal to and through Facebook. My age, relationship status, and sexual orientation fields remained blank during the first two years of my active Facebook participation. Then one day I thought it was time to come out as a straight married man over 40 years old. Once I clicked “married,” a strange thing happened. The advertising spaces on my Facebook page filled up with advertisements for services that invited me to contact women for the purpose of having an affair. Suspicious, I removed “married” from my profile. The ads disappeared. In the early days of Facebook, its data collection and advertising targeting were so clumsy that the site merely filtered for one or two attributes and pushed ads based on them. In addition, the companies that chose to advertise on Facebook in the early days were often unseemly.
That all began to change around 2010. Like many people, I had grown more comfortable with Facebook. I had succumbed to the constant prodding and suggestion that I add more friends. Though I still scrupulously avoided current students, my circles grew. More of my social and political activities moved to Facebook. To be without Facebook by 2010 was to miss out on what seemed to be essential conversations and events. Even my parents signed up for it. Facebook’s user base spread to all walks of life and started touching more countries and more languages, leaving would-be and once-dominant competitors such as Myspace with no sense of mission and no way to make money.
Beyond users’ lives and habits, something more important was going on inside the company. Mark Zuckerberg had lured Sheryl Sandberg away from Google in 2008 to be chief operating officer in charge of the business side of the company. By 2010 Sandberg had built an effective data collection and advertising system. The ads on my page began to reflect my professional interests and social connections. One regular ad was for a heavy and expensive leather briefcase like the kind professors in movies carry. It was not a perfect match for my interests (or for professors in general, few of whom would spring for a $250 leather case). But it was far better than ads urging me to cheat on my spouse just because I have one. To accomplish the mission of targeting advertisements deftly, Sandberg needed more and better data about what users did, thought, and wanted to buy. So she embarked on a series of expansions of Facebook’s capabilities to track and profile users. Not coincidentally, 2010 was the first year that Facebook posted a profit. It’s safe to say that if not for Sandberg and her formidable vision and management skills, Facebook might be a broke and trivial company today.
Facebook is the most pervasive surveillance system in the history of the world. More than 2 billion people and millions of organizations, companies, and political movements offer up detailed accounts of passions, preferences, predilections, and plans to one commercial service. In addition, Facebook tracks all of the connections and interactions among these people and groups, predicting future connections and guiding future interactions. It even compiles contact information on those who do not have a Facebook account.
Facebook exposes us to three major forms of surveillance. We might think of them as three perches or viewpoints. Commercial and political entities are able to exploit the targeting and predictive power of Facebook through its advertising system. Through what we reveal on our profiles, other Facebook users can watch and track us as we build or break relationships with others, move around, recommend and comment on various posts, and express our opinions and preferences. And governments use Facebook to spy on citizens or anyone they consider suspicious, either by establishing Facebook accounts that appear to be those of friends or allies or by breaking through Facebook security to gather data directly.
Facebook itself conducts commercial surveillance of its users on behalf of its advertising clients. Facebook has no incentive to offer any third-party access to the data that it uses to drive user-generated posts and direct advertisements. The commercial value of Facebook lies in its complete control of this priceless account of human behavior. But the interface that Facebook provides to both advertisers and those who run Facebook pages allows them to learn significant amounts about their audiences in general and track the level of response their posts and advertisements generate. To profile users for precise targeting, Facebook uses much of the data that users offer: biographical data, records of interactions with others, the text of their posts, location (through Facebook apps on mobile phones equipped with GPS features), and the “social graph”—a map of the relationships among items on Facebook (photos, videos, news stories, advertisements, groups, pages, and the profiles of its 2.2 billion users).
This combination of information allows Facebook to predict user interest and behavior based on what other people with similar attributes and similar connections want, think, or do. Beyond the data that Facebook gathers from its own core services (Facebook, Messenger, Instagram, WhatsApp, etc.), it allows other firms to connect to Facebook directly through a service called Open Graph. Open Graph is how the music service Spotify interacts with Facebook, using Facebook usernames and passwords to enroll and log in to the service. This makes Spotify “social,” in the sense that the music one user listens to via Spotify becomes available to her friends who are also using Spotify, and those friends’ music habits are available to others as well.
This creates a mesh of interests that can prompt discovery or recommendations among like-minded music fans. To Spotify, this service amplifies its ability to find new users and maintain established users. To Facebook, it means that more interactions—even outside of Facebook—become part of the larger social graph and thus useful for profiling and targeting. Facebook, through its Open Graph partnerships and the use of tracking cookies that it implants in users’ web browsers, is able to gather immense amounts of personal data from people who hardly ever log in to their Facebook accounts. Basically, there is no way to opt out fully from Facebook’s ability to track you.
This form of single-firm commercial surveillance seems almost harmless by itself. Facebook lacks a police force, so it can’t abuse its power in a way that injures people or denies them liberty or property. If it profiles someone inaccurately and targets advertisements improperly, the company simply will not generate revenue for that action. When all those data serve Facebook well, the company’s leaders argue, it provides a more enjoyable and relevant experience to users. No cat owner wants to see a barrage of ads for dog food. No vegetarian wants to see ads for hamburgers. And we generally prefer seeing posts from the people whom we like and who think like us.
There are problems with this sort of filtering. But none of those problems quite qualifies as an immediate risk or danger to users. However, Facebook gathers and deploys much of this information without our knowledge or consent. Facebook does not offer us a full view of how our activities are used. And Facebook does not offer us clear and easy ways to exempt ourselves from this pervasive surveillance. Users might generally understand that the company retains and uses the specific attributes that they post to their profile. But most users certainly do not have a full picture of the depth and breadth of Facebook’s activities. Users rarely are informed, for instance, that Facebook buys troves of credit-card purchasing and profile data from the large data marketing firms. A user must poke around or search Facebook’s help site to discover this fact. This mix of the information we offer to Facebook, Facebook’s ability to track us on the web and in the real world, and the commercial credit data it purchases empowers Facebook and disempowers us.
The chief danger from the Facebook commercial surveillance system lies in the concentration of power. No other company in the world—with the possible exception of Google—can even consider building a set of personalized dossiers as rich as Facebook’s. These data reinforce Facebook’s commercial dominance in the advertising business (again, mostly shared with Google, which has different ways of tracking and targeting content and advertising but generates many of the same risks and problems). The very fact that we cannot expect another digital media company to generate that much data from that many people and that many interactions means that—barring strong regulation—serious competitors to Facebook will be rare or nonexistent in the near future.
But there are other dangers that come with Facebook having and holding all of this information on us. They come from the two other surveillance positions: peers and states. Many common behaviors of Facebook friends sever our images or information from our control, regardless of how careful any individual is with privacy settings. Other Facebook users can act maliciously, especially when relationships degrade. And other Facebook users might be more promiscuous in their habits of tagging photographs of people who would rather not be identified beyond a tight circle of known friends.
Beyond this, Facebook profiles can be abused for the purposes of public shaming, harassing, or exposing personal information to outsiders. What we put on Facebook is often carefully selected and managed, a constant if exhausting exercise in self-promotion and self-presentation. That means that Facebook profiles are rarely if ever full and accurate portrayals of our lives and personalities. That’s one reason Facebook goes to great lengths to monitor and record our actual activities and movements. We might want everyone to think we are vegan, but we might slip up and eat at Burger King in a moment of weakness. We should not have to reveal such moments to our friends. But Facebook ensures that it knows us better than our friends and family members do.
Still, the fact that Facebook profiles are inaccurate or inauthentic portraits of complex human beings means that actions and reactions by others peering at them can generate unfair or harmful reactions. Jokes can be misread. Declarations of loosely held opinions could blow up into misreadings that cause social conflicts. Facebook was designed to limit our interactions and exposure to the circle of those we trust. It no longer functions that way.
Despite the promises Facebook makes to its users, there are many ways that it ensures users lack control over their information. Privacy journalist Kashmir Hill noticed in 2017 a curious phenomenon. Facebook was recommending that she “friend” people she hardly knew or did not even know of. She asked her readers if they had had similar experiences, especially any that led to awkward or possibly harmful encounters via Facebook. Social workers and therapists reported being connected with clients despite never exchanging private information with them. A sperm donor was urged to connect to the child of a couple to whom he had donated sperm, despite the parents not wanting the donor to have contact with that child. Hill discovered that Facebook’s “People You May Know” feature drew on the address books that users were urged to upload from their computers or phones. Those email addresses and mobile phone numbers served as identifiers to Facebook profiles. And because Facebook’s social graph traced connections among profiles, the People You May Know feature had the ability to connect people who were quite distant, estranged, hostile, or even violent toward each other.
Because no user could control what information lies in another’s address book, no user could opt out of the feature. Users are at the mercy of other people and their understanding of how Facebook uses personal information. “A one-night stand from 2008, a person you got a couch from on Craigslist in 2010, a landlord from 2013: If they ever put you in their phone, or you put them in yours, Facebook could log the connection if either party were to upload their contacts,” Hill wrote. “That accumulation of contact data from hundreds of people means that Facebook probably knows every address you’ve ever lived at, every email address you’ve ever used, every landline and cellphone number you’ve ever been associated with, all of your nicknames, any social network profiles associated with you, all your former instant message accounts, and anything else someone might have added about you to their phone book.” And there is nothing anyone can do about that. Users are tricked at the moment they register with Facebook to upload their contacts for the sake of convenience. Facebook never invites users to consider the consequences of that action.
State uses of Facebook are even more troubling. States do have the power and right to imprison and commit violence against citizens and those they consider threatening. State power leverages Facebook in two ways. First, and most common, we have seen authoritarian leaders in various countries monitor Facebook activity and track suspected dissidents and journalists. They use Facebook and WhatsApp to generate campaigns of harassment against perceived enemies and critics. States can use bogus profiles to infiltrate Facebook groups devoted to reforming or challenging the government, or even groups that offer support to gay and lesbian people. The 2013 revelations by Edward Snowden that the security and intelligence services in the United States and the United Kingdom had managed to tap into the data flows of Facebook, Google, Apple, Microsoft, Yahoo, and other companies showed just how vulnerable Facebook users are to state surveillance power. As long as Facebook retains such a rich source of intelligence, states will try to infiltrate the system.
From Antisocial Media: How Facebook Disconnects Us and Undermines Democracy by Siva Vaidhyanathan. Copyright © 2018 by Siva Vaidhyanathan and published by Oxford University Press. All rights reserved.