In a March 21 Slatest, Mark Joseph Stern misstated that the April 2019 Wisconsin Supreme Court election could give Democratic justices a majority. That opportunity will not arise until the 2020 election.
Due to an editing error, a March 20 Future Tense Newsletter incorrectly stated that the National Institute of Standards and Technology has been using nonconsensually obtained images to train its Facial Recognition Verification Testing program. NIST does not develop or train facial recognition systems; it provides independent government evaluations of prototype face recognition technologies. The newsletter also summarized an earlier Future Tense article that misstated that boarding photos from airports had been used as a data set for testing facial recognition algorithms. Both the article and the newsletter have been corrected to say that those images came from a Department of Homeland Security scenario and involved volunteers.
In a March 20 Slatest, Molly Olmstead misgendered former Columbus, Ohio, Police Chief Kim Jacobs. Jacobs is a woman.
In a March 19 Future Tense, Shannon Palus misspelled Winn-Dixie.
In a March 19 Slatest, Elliot Hannon misstated that England’s National Gallery refused a $1.3 million donation from the Sackler family trust. The gallery and the Sackler Trust said in a statement that the decision “not to proceed at this time” was a joint one.
Due to a production error, a March 19 What Next show page originally had an incorrect audio link.
In a March 18 Science, Shannon Palus misstated that StarTalk would be put back on the radio. It will be put back on television.
In a March 17 Future Tense, Os Keyes, Nikki Stevens, and Jacqueline Wernimont misstated that the National Institute of Standards and Technology has a “regulatory” role on facial recognition under a newly released executive order. NIST does not create regulations; it is a testing body, and under the new executive order it has been given 180 days to develop a plan for standards. The executive order also refers more broadly to artificial intelligence, not specifically to facial recognition, although artificial intelligence does include facial recognition software and tools. The article also misstated that boarding photos from airports had been used as a data set for testing facial recognition algorithms. Those images came from a Department of Homeland Security scenario and involved volunteers.
A March 15 What Next transcript misstated that the Obama administration stopped using private prisons to detain immigrants during its later years. The administration directed the Federal Bureau of Prisons to phase out the use of private facilities, but it did not direct Immigration and Customs Enforcement to do so.
In a March 13 World, Michelangelo Freyrie misidentified the aircraft in footage of an attack as a B-21 bomber. It was a B-52 bomber.
Due to a production error, a March 13 The Angle misidentified the Boeing 737 Max 8 as a 757 Max.
In a March 11 Future Tense, Jeff Wise misstated that in order to make room for the bigger engine in the 737 Max, Boeing engineers moved the point where the airplane attaches to the wing. They moved the point where the engine attaches to the wing.
Slate strives to correct all errors of fact. If you’ve seen an error in our pages, let us know at firstname.lastname@example.org. General comments should be posted in our Comments sections associated with each article.