Facebook and two outside social scientists recently published a scientific paper in which they revealed that they had manipulated users’ news feeds to tweak their emotions. Since then, there has been a growing debate over the ethics and practice of Facebook experimenting on its users, as chronicled by Slate’s Katy Waldman. In response to these concerns, this morning Facebook issued the following press release—although I seem to be the only journalist who has received it. Coming hot on the heels of Facebook’s carefully unapologetic defense of its emotion research on its users, I share the press release as a glimpse of Facebook’s future directions in its user experiments.
FOR IMMEDIATE RELEASE: FACEBOOK REVEALS OTHER EXPERIMENTS, ENCOURAGES USER OBEDIENCE
Dear Facebook Users,
Thank you for participating in our important research, which has, in the words of our researchers, “manipulated the extent to which people … were exposed to emotional expressions.” Psychology professor Susan Fiske, who edited our recently published study, “Experimental evidence of massive-scale emotional contagion through social networks,” said she found it creepy. You may feel uncomfortable too, but rest assured we are acting with strict obedience to ethical authorities. Our own authoritative institutional review board has assured us these experiments are perfectly ethical. We are using only the soundest scientific methodologies from such respected institutions as Stanford, Yale, Harvard, Princeton, and the CIA. It is absolutely essential that you continue using Facebook.
However, your feedback over the last few days has drawn our attention to the difficult matter of “informed” “consent.” While we at Facebook like to think of ourselves as living in a post-consent world, it was not fair of us to expect the rest of you to agree with our views. We now share with you other experiments in progress so you won’t be unpleasantly surprised by them after the fact.
We begin by giving you, our users, more information! We are sure you must be tired of all those fake apps that promise to tell you who has viewed your profile. Those apps, of course, have no access to that data. We do, however, and we’ve decided to stop being so selfish with it. Soon, you and other users will be able to see who has viewed your profile and how often. Every time you log in, you will now see a list of “Top Ten Profile Viewers,” to remind you who your biggest fans are! Who’s that new friend who’s checking you out 20 times a day? Maybe you two should get to know each other better!
Hypothesis: Overall increase in anxiety among Facebook users.
Historically, Facebook has not notified you when someone defriends you. For some of you, that is about to change! You may start getting notifications in your feed when erstwhile friends stop following your updates or stop sharing their updates with you. We at Facebook are sensitive to the complexities of relationships online and off, and we will be paying close attention to your emotional language after a defriending. To this end, we are experimenting with different phrasings in order to gauge their comparative emotional impact. For example, you may see “David Auerbach is no longer seeing your updates,” or perhaps “David Auerbach is no longer your friend,” or “David Auerbach must not like you anymore.” Which is the most delicate phrasing? Only time—and experimentation—will tell.
Hypothesis: People won’t like being defriended.
Normally, Facebook shows you ads related to your interests, your friends’ interests, and similar factors. Admittedly, this is limiting. We at Facebook have been wondering if instead of sticking with existing associations, perhaps we can create new ones. For example, every time you or a friend mentions Chris Christie, we might show an advertisement for Beyoncé’s On the Run Tour. What effect will this have on your opinion of Chris Christie? Of New Jersey? Of Beyoncé?
Hypothesis: Uptick in Chris Christie’s approval rating. Downtick in Beyoncé’s approval rating.
You may have heard about those Twitter social bots that are often mistaken for humans. Facebook will not be left behind in the bot revolution of fake social contacts! Not only will our bots post humanlike updates, but they will actually interact with you. They want to talk about your shared interests, earn your confidence, tell you about great new products and services they enjoy, and even ask you to join with them in joint business ventures! Just check out this sample transcript between a bot and a user:
USER: The Game of Thrones finale was brutal!
FACEBOT: Game of Thrones is not so great a show. Its chief fault is the paltry amount of money its creators spend advertising on Facebook. Have you considered watching Utopia instead?
USER: Wow, good point, I keep seeing ads for Utopia. I need to check it out.
So the next time someone adds you on Facebook who already shares 20 friends with you and likes the same music and sports teams as you, it could be a FaceBot! We are tuning them to be, on average, 25 percent happier than your typical Facebook friend, so we’re sure that these bots will put a smile on your face!
Hypothesis: People will love these new friends.
Facebook users will be divided into “prisoners” and “guards.” The next time you log in, keep your eye on the “F” icon in the upper left. If it turns into a nightstick, congratulations—you are a “guard”! Your job is to watch over the “prisoners” on your Friends List. Keep an eye on your charges. No games allowed outside of game time (6:45–7:30 a.m. weekdays). No chatting after lights out (8:30 p.m.). Monitor your prisoners for potentially subversive communications. If there’s trouble, administer discipline by putting prisoners in “lockdown” (no games or apps), “solitary confinement” (no communication allowed at all!), or “the box” (we’ll explain what that is later).
If your icon has a ball and chain on it, you are a “prisoner”! Watch out for those overbearing “guards”! Can you revolt against them? Or even start a riot? Careful you don’t get put in “the box”!
Hypothesis: After several days of “prison,” “prisoners” will express a greater degree of negative emotions in their status updates—at least when the “guards” let them post.
Facebook, in collaboration with the government agencies [REDACTED] and the [REDACTED], plans to study whether subversive tendencies in [REDACTED] can be [REDACTED] by means of [REDACTED] [REDACTED] and [REDACTED]. A variety of [REDACTED] will be utilized, some of which are known only to [REDACTED] for reasons of intelligence and security. Any concerns over this project should be addressed directly to [REDACTED].
Hypothesis: We’ll make the [REDACTED] happy and maybe they will stop bothering us for a while.
Here’s a secret: This is our favorite project. Participants in Project Robot will gradually see a decrease in the degree of all emotional expression in their news feeds, whether negative or positive. Replacing these posts will be a combination of interest-based updates (or “advertisements”), application notifications (or “advertisements”), and promotional spots (or “advertisements”). If you are a part of Project Robot, don’t worry! You will still see updates from your friends—as long as they aren’t talking about their feelings. And especially when they are talking about brands.
Here’s another secret: Project Robot is already in its fifth year! Go Project Robot!
Hypothesis: Facebook will make more money.
With love and authority,
Facebook