If you logged on to Facebook Wednesday night or Thursday morning, you may have seen multiple notifications that your favorite groups changed their privacy settings.
I’m in more than 80 groups, which I recognize is entirely too many, but that allowed me to see a pattern: The ones changing their settings didn’t include the neighborhood news groups, outdoors communities, or semi-professional writing groups. They were primarily meme groups with thousands of people.
Within the groups, threads racked up hundreds of comments as members tried to piece together what happened, reporting that they, too, had seen multiple groups changing their privacy settings at once. Groups have several privacy options: Public groups allow anyone to join, closed groups allow anyone to request membership, and secret groups do not appear in searches and new members must be added by an existing member. In my notifications, about a dozen public and closed groups I’m in changed their settings to private, while a few “archived” the group to prevent new members and posts.
Some group administrators explained their changes in posts, saying that other meme groups have recently been banned by Facebook without warning or explanation, and that changing their group settings is meant to be a precaution against a similar fate. As Facebook plans to pivot toward encouraging and supporting groups, the mass panic among these popular groups’ admins and users illustrates the difficulties with maintaining private spaces on a platform governed by opaque banning procedures.
It’s a popular theory among internet denizens that meme groups in particular are being targeted for alleged bans. “I’ve only noticed it happening in shaming groups,” said one user, referring to groups in which people post items for other members to ridicule. (One notable example: wedding shaming groups, which include everything from awful bridesmaid dresses to horror stories about out-of-control guests.) “Anything shitposty, shaming, leftbook – all getting infiltrated and mass posts reported to where the group gets zucced,” another user wrote. If you don’t spend a lot of time perusing internet memes, shitposts are intentionally terrible, troll-y posts, and Zucc’ed, short for Zuckerberg, is a popular way of referring to being banned. The biggest groups apparently removed include “Crossovers nobody asked for,” which had more than 448,000 members, and “That relationship sounds exhausting,” which had about 54,000 members.
Rumors have been flying about what caused these bans. According to Facebook’s community standards, groups can’t support terrorist organizations, hate groups, murderers, or criminals; sell drugs; or attack individuals. It’s also against the company’s policies for users to post “objectionable content” like hate speech, violent images, or porn, but there are no explicit policies for what action Facebook might take if such content appears in a group. The consequences, Facebook’s standards say, “vary depending on the severity of the violation and a person’s history on the platform,” which suggests individual posters would be penalized, not entire groups. Nevertheless, groups have disappeared, and former members have no clue why.
The most common claim about these recent bans is that they can be traced to a group of users called the Indonesian Reporting Commission, who took issue with offensive content in meme groups, but there isn’t a clear origin point for this theory. There are also slight variations on the story. Some say the group’s creator is responsible for the bans—that he joined groups, spammed them with porn, and then reported them for violating Facebook’s rules. Others say this individual recruited more Facebook users to do the same thing. Mark, an admin for a conspiracy meme group I’m in, heard that the individual “wrote a bot” to report groups and get them banned. According to a post on Know Your Meme, that individual has allegedly posted a public apology, but that hasn’t stopped people from doxing and harassing him—a group of users even boasted online that they beat him up.
As is often the case with internet rumors, there’s very little hard evidence of anything right now. It’s unclear whether the Indonesian Reporting Commission, which does appear to have its own page (as well as many spoofs), is really connected to the groups’ deletion, since Facebook does not reveal the identity of the users who report a group—a generally wise policy, but one that doesn’t lend any transparency to this situation. It does seem like the deletions were the result of a deliberate effort to remove groups, though. A Facebook spokesperson says, “We removed several Groups from Facebook after detecting content that violated our policies. We since discovered that this content was posted to sabotage legitimate, non-violating Groups. We’re working to restore any Groups affected and to prevent this from happening again.”
The one thing that’s for sure is that several popular groups have indeed been deleted and that these rumors, whether they’re true or false, have spurred a mass hiding of groups. “I’m one of a group of admins for a just-for-fun conspiracy theory group, so going along with this panic was actually perfect on-brand for us,” says Mark. Though changing the group to “secret” was initially a little tongue-in-cheek, Mark says that given the evidence that groups really were being shut down, “it seemed worth it to err on the side of caution.”
Facebook groups have been around a long time, but they’ve taken on a different role over the past decade. At first, people joined groups that signaled something about their personality and background to others: “Manatees are selfish,” “I will go slightly out of my way to step on that crunchy-looking leaf,” “Let’s save Africa in our Uggs.” Now, groups have become a place where you can post about silly, inconsequential stuff without co-workers from three jobs ago seeing it. Groups offer a way to meet new people and see different viewpoints about shared interests, whether that’s politics, shitposting, the outdoors, or a love of public transit. (You may notice I’m not naming the groups I’m in. I feel weirdly protective of them and want them to remain private enclaves where like-minded folks can gather without an onslaught of new members or, even worse, trolls.)
What starts as a lighthearted meme group can shift into something more intimate. I’ve met some of my favorite people through a series of groups that broke off from a larger group dedicated to a webcomic popular in the mid-2000s. We’ve supported one another through divorces and raised money for overdue rent and medical expenses. I would be absolutely devastated if our little corner of the internet suddenly disappeared.
I’m not alone in that. Annie, another member, says she’s found a close-knit community through our Facebook groups. “They’ve sent gifts to my new babies and held them if I’ve had the good fortune to meet up in person,” she says.
With overall Facebook use declining, the company knows groups are where it’s at. Just a couple of weeks ago, it unveiled a new app design that privileges groups, a move that appears to be part of Zuckerberg’s bigger strategy to reconvince users that the platform is dedicated to their privacy. “Today we already see that private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication,” Zuckerberg wrote in a Facebook note in March.
If Facebook really is pivoting toward supporting groups, there are clearly some kinks to be worked out here. One major question is how to moderate these private groups. The entire appeal of these spaces is the freedom to be real and raw, and oversight from the platform destroys that illusion. But there have to be policies to prevent dangerous groups from festering, or harassment from running rampant. At the moment, groups mostly self-police with moderators, who sometimes end up devoting hours a day to approving new members and posts, and dealing with inter-member disputes. “Groups live or die by the uncompensated labor of their mods and admins,” says Mark. For many groups, after all the work mods put into keeping things going and users put into meme-ing, it can be a real slap in the face to know that the powers-that-be at Facebook could shut things down without so much as an explanation for why their space has disappeared.
Whether or not groups are really being Zucc’ed right now, the mass panic about the Zuccening of 2019 is a clear indication of how much groups mean to Facebook users, and how quickly (mis)information can spread among admins when group ban policies are so opaque.
In the meantime, the Zuccening is producing the usual mix of homophobic or racist shitposts and wholesome content, like “if we’re zucc’d in the night, tell the tag groups I love them.” As always, the meme ecosystem rolls on.
This piece was updated to include a statement from Facebook.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.