Facebook has disabled its topic recommendations after the artificial intelligence-powered feature mislabeled a video of Black men as “primates.” Facebook users who watched a June 27, 2020 Daily Mail video of Black men in altercations with white civilians and police officers were asked whether they wanted to “keep seeing videos about Primates.”
After the issue was brought to Facebook’s attention, the company apologized. “This was clearly an unacceptable error and we disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again,” Facebook spokesperson Dani Lever said. “As we have said, while we have made improvements to our AI we know it’s not perfect and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Facebook didn’t catch the problem itself. Instead, a former content design manager at the company received a screenshot from a friend and posted it to a forum for current and former employees, the New York Times reports. She also posted it on Twitter.
This is the latest in a series of controversies involving artificial intelligence systems that have displayed gender or racial bias. In 2015, for example, Google Photos labeled pictures of Black people as “gorillas.” Google said it was “genuinely sorry.” Then in 2016, Microsoft shut down its chatbot Tay after it began using racial slurs. Last year, Facebook said it was analyzing whether its AI-trained algorithms were racially biased. Facebook has also faced controversy over racial issues internally. A few years ago, CEO Mark Zuckerberg called on employees at the company’s Menlo Park, California headquarters to stop scratching out “Black Lives Matter” and writing “All Lives Matter” in a public space at the company.