After years of lawsuits from civil rights organizations and investigations by journalists, Facebook announced on Tuesday that it would no longer let advertisers who post ads for housing, jobs, and credit services target users based on their race, age, or gender. Additionally, housing ads will no longer be allowed to target people in a certain zip code—another way to limit an audience by race. The change comes as a result of a settlement resolving five discrimination lawsuits filed over the past three years by civil rights groups like the National Fair Housing Alliance and the American Civil Liberties Union, as well as the Communications Workers of America and multiple individuals, who alleged that Facebook allowed advertisers to prevent certain groups, like women over 40, families with children, and black Americans, from seeing offers for housing and employment.
“There is a long history of discrimination in the areas of housing, employment and credit, and this harmful behavior should not happen through Facebook ads,” Facebook Chief Operating Officer Sheryl Sandberg wrote in a statement. Sandberg added that Facebook is building a separate portal for advertisers posting ads for housing, job offers, and credit products. The portal won’t offer the same targeting categories available to advertisers of other products and services, though Facebook will still be able to sell ads shown to people in certain demographic groups and within certain geographic areas. The changes are slated to go into effect by the end of the year.
These changes are a huge win for civil rights groups—but they also come after years of fighting with Facebook over whether its advertising system runs afoul of federal and state laws that prohibit discrimination in certain ad categories. Federal housing law, for example, prohibits discrimination based on race, religion, national origin, gender, disability, or family status, but it’s typically enforced against the person or agency that posted the ad, and not the company that published it. But that hasn’t stopped multiple cases from moving forward against Facebook, including a complaint still pending from the Department of Housing and Urban Development, which alleges that Facebook’s ad system paved the way for illegal housing discrimination by providing tools to help real estate outfits discriminate based on race, gender, location, and religion.
Removing certain kinds of targeting from Facebook may well help protect people from some discriminatory practices in the sensitive areas of housing, employment, and financial products. But the very nature of Facebook’s targeting platform means it won’t preclude all kinds of discrimination for home-, job-, and credit-seekers. Targeted advertising is Facebook’s bread and butter, making up 99 percent of the company’s revenue; last year, that amounted to nearly $56 billion. Facebook’s advertising business is so successful in part because ads can be microtargeted to people based on any number of interests, not just their race, age, gender, and location. And there are all kinds of targeting categories that can signal that a person is black or white or male or female or young or old.
Look at the advertising categories used by Russian trolls from the Internet Research Agency in ads placed on Facebook before the 2016 election. Many of those ads were targeted to people interested in “Understanding racial segregation in the united states” and “Martin Luther King, Jr.” and “Black is beautiful” and the “African American Civil Rights Movement (1954-68).” While there are certainly many, many people who are not black who would be interested in these topics, the goal of this targeting was evidently to reach a black audience. Likewise, if an advertiser for a job posting wanted to exclude older people, they might target an interest in newer pop artists who are unlikely to be on the radar of people over 50. In other words, Facebook is a targeted advertising business that is successful because the ads it sells can be targeted in extremely fine-tuned and granular ways—there’s no need to specify that you want to reach people in a white neighborhood or women under 40 to get your desired audience.
Housing and credit companies have been using targeted online ads to reach or exclude certain communities for years—and not just on Facebook. Take what happened a decade ago, when mortgage brokers were peddling dangerous loans during the subprime mortgage crisis. During that time, financial companies were combining online behavioral data with location and demographic data to deduce a person’s race. That, in turn, helped mortgage companies target minorities in their online marketing of risky financial products, which were sometimes referred to as “ghetto loans,” according to research from Seeta Peña Gangadharan, a professor of media and communications at the London School of Economics. Subprime lenders made the list of the top 10 spenders in online advertising in the U.S. in 2007 and 2008, according to Nielsen/NetRatings data.
Beyond the targeting, it’s also not clear how Facebook plans to hold advertisers accountable under the new system. “People will have to either self-identify as a real estate broker, a landlord or an employer, or Facebook is going to have to identify them,” Dennis Yu, chief executive of BlitzMetrics, a digital marketing company, told the Washington Post. Compliance, however, might be aided by the watchdogs that brought the legal challenges. Facebook is paying out less than $5 million to settle the various lawsuits, some of which is going to the National Fair Housing Alliance, which will use those funds in part to help train advertisers and Facebook employees to stay in compliance with civil rights housing laws. Still, Facebook’s advertising system allows ad buyers to reach more than 2 billion people, which is only possible because of automation. Policing the platform will by necessity require some automation as well. And automated systems can be gamed—as Facebook knows quite well. Now the question will be whether any automated advertising system that allows fine-tuned targeting is capable of protecting users’ civil rights.