On this week’s If Then, Will Oremus and April Glaser discuss California’s landmark decision to eliminate cash bail for criminal defendants, along with the controversial algorithmic “risk assessment” system that will partially replace it. They also hash out a fresh debate over who gets to fact-check the news in your Facebook feed, after an outcry in media circles on Tuesday when Facebook flagged a story in the liberal outlet ThinkProgress as “false”—all because the conservative Weekly Standard had taken issue with its headline.
The hosts are then joined by professor Safiya Umoja Noble, author of Algorithms of Oppression: How Search Engines Reinforce Racism. Lately, media coverage—and congressional hearings—have focused on potential anti-conservative bias at the big tech companies, but Noble’s work suggests we may have a very different problem.
17:50 - Interview with Safiya Umoja Noble
36:36 - Don’t Close My Tabs
Stories discussed on the show:
- The Guardian: Imprisoned By Algorithms: The Dark Side of California Ending Cash Bail
- Slate: When Fact-Checking Becomes Censorship
- ThinkProgress: Brett Kavanaugh Said He Would Kill Roe v. Wade Last Week And Almost No One Noticed
- Algorithms of Oppression by professor Safiya Umoja Noble
Don’t Close My Tabs:
- Anatomy of an AI System by Kate Crawford and Vladan Joler
- New Yorker: Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?
Podcast production by Max Jacobs.
If Then plugs:
You can get updates about what’s coming up next by following us on Twitter @ifthenpod. You can follow Will @WillOremus and April @Aprilaser. If you have a question or comment, you can email us at firstname.lastname@example.org.
If Then is presented by Slate and Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.