Future Tense

Facebook Is Making It Tougher for Ads on Its Platform to Discriminate Against Users

[Photo: The Facebook logo displayed on a sign. Advertisers can tailor ads to specific audiences on Facebook based on users’ personal information, including their demographics, location, and interests. Alexander Koerner/Getty Images]

Facebook is scaling back its ad-targeting options in an attempt to prevent advertisers from discriminating against users, including by religion or ethnicity. Facebook announced on Tuesday that it will remove more than 5,000 ways of narrowing the audience for an ad. Advertisers will no longer be able to hide their ads from Facebook users interested in topics such as “Passover,” “Native American culture,” “Buddhism,” “Evangelicalism,” and “Islamic culture.”

“While these options have been used in legitimate ways to reach people interested in a certain product or service, we think minimizing the risk of abuse is more important,” Facebook explained in its statement.

Facebook’s announcement comes on the heels of a complaint filed against the company by the U.S. Department of Housing and Urban Development. On Friday, HUD accused Facebook of facilitating illegal housing discrimination by creating advertising tools that enabled landlords and developers to filter out potential renters based on their race, gender, religion, or disability, in violation of the Fair Housing Act. The complaint could lead to a federal lawsuit against Facebook. A Facebook spokesperson told BuzzFeed News that the timing of its announcement was unrelated to the HUD complaint.

HUD’s complaint is not the first time Facebook has been accused of violating federal housing laws. An October 2016 ProPublica investigation found that Facebook gave advertisers the option to exclude users based on their race, using a special category known as “Ethnic Affinities.” ProPublica was able to purchase an ad that targeted users who were interested in buying a house and excluded anyone with an “affinity” for African-American, Asian-American, or Hispanic people. When investigators showed their findings to civil rights lawyer John Relman, he said that the social network’s race-based targeting options may constitute a “blatant” violation of the Fair Housing Act. The 1968 law made it illegal to “make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.”

Advertisers on Facebook can tailor ads to specific audiences using a wide variety of customization options, including targeting based on users’ demographics, location, interests, and past purchasing behavior. A September 2016 ProPublica investigation identified nearly 50,000 categories into which Facebook sorts its users.

While some of these advertising categories—such as users whose property size is less than 0.26 acres—seem rather innocuous (albeit invasive), the precision with which Facebook has allowed advertisers to refine their target audience can have serious implications. In a report for BuzzFeed News, Alex Kantrowitz outlined the potential discriminatory applications of the targeting options slated for removal: “A business that didn’t want Jewish people patronizing it could exclude those interested in ‘Passover’ from seeing its ads. A landlord that didn’t want to rent to Native Americans could exclude those interested in ‘Native American culture.’”

In addition to removing thousands of ad-targeting options, Facebook announced that it would require advertisers running housing, employment, or credit ads to complete a nondiscrimination certification before they can continue advertising on the site. The educational training is intended to “underscore the difference between acceptable ad targeting and ad discrimination.”

This latest move is another attempt by Facebook to regulate how its vast troves of user data are used. In the aftermath of the Cambridge Analytica scandal, Slate’s Will Oremus argued that the misuse of user data was not a one-off error, but rather a predictable outcome, given that Facebook is optimized to collect users’ personal information and repackage it for advertising. He wrote:

If you think of that data, and the ads, as a relatively small price to pay for the privilege of seamless connection to everyone you know and care about, then Facebook looks like the wildly successful, path-breaking company that made it all possible. But if you start to think of the bargain as Faustian—with hidden long-term costs that overshadow the obvious benefits—then that would make Facebook the devil.

As long as users’ personal data remains a tantalizing way to generate profit, advertisers may find creative ways to exploit Facebook’s ad-targeting tools. This latest move will make that tougher—but it probably won’t make it impossible.