The war against truth has entered its most diabolical phase: false flag fact-checking. For several years, propaganda outfits have simply masqueraded as fact-checkers, labeling untrue content as genuine or genuine content as untrue. Now, Russia’s invasion of Ukraine has spawned a new variant of false fact-checking, in which the propagandist creates their own obviously fake content just to debunk it. These bad-faith fact checks are even harder to dispel than those that simply lie about truth or falsity, and they are equally capable of instilling a post-truth worldview.
Until recently, false fact-checkers largely existed to lend credence to misinformation. False fact-checking groups have affirmed, for example, that the Armenian genocide never occurred or that Russia had no part in the poisoning of Kremlin enemy Sergei Skripal and his daughter. False fact-checkers have also sought to undermine factual reporting: during the Saudi effort to smear Hatice Cengiz, the fiancée of assassinated and butchered journalist Jamal Khashoggi, as an undercover foreign spy, false fact-checkers claimed that pictures of the couple were all photoshopped.
By contrast, false flag fact-checkers can use real photographs or videos and draw true conclusions—making them trickier to debunk—but then lie about the deployment of those images. For instance, a propagandist doctored a genuine image of a burning tank by adding a Russian-identifying Z symbol on the wreckage, then purported to debunk the constructed image by pointing to the genuine photo, and then used that “fact check” as evidence that all images of Russian losses in Ukraine are likely misinformation. The trick here is that the constructed image was never actually used.
The debunking is “correct” in that the image is in fact constructed or miscaptioned. However, the “original” misinformation story is a fabrication of the bad-faith fact-checker itself—that is, the fact-checker created the misinformation (which was never widely distributed) just to debunk it, thereby creating a misinforming fact check for distribution. This misinforming fact check is then laundered by the Russian state media, which reports the fact check as evidence that Russians cannot believe social media posts about the war.
The false flag approach from Russian propagandists aims to spread two complementary narratives: that the Western press broadcasts obvious lies about Ukraine, and that social media videos documenting Russian actions in Ukraine are fabrications. The benefits of this approach are twofold. First, it allows the wrongdoer to discredit legitimate Western news sources by association. Second, it allows them to claim that the social media videos on which the media rely are themselves false. These actions are designed not to convince users of a particular narrative, but to flood information channels with enough contradictory trash to poison the source.
Russian Twitter channels are pressing the narrative that Western media spread obvious lies about Ukraine, thereby sowing doubt and providing justification for a Russian government ban on those news sources. This largely consists of creating phony news posts or broadcasts containing obviously false content. The simplest variant has been to push verifiably false statements from imitation CNN accounts, such as CNNUKR and CNNAfghan. The more complex variant involves fabricating false news broadcasts and posting the clips alongside a “fact check.” For example, propagandists took genuine video of a climate change protest in which protesters donned body bags, then mixed in genuine audio from a real news report on Ukraine, to create a false report in which Western anchors seem unaware of the moving “dead” crisis actors in the background.
Russian Twitter users are also generating fake media and attributing those images to Ukrainians. The propagandists do so by altering footage, claiming that pro-Ukrainian users are disseminating it, and then debunking the altered images and videos. For example, ASB News, a recently suspended propaganda account that pushed misinformation in support of the Russian invasion (such as the specious biolab claim), has featured several fact checks showing that the Russian-identifying Z was added to older footage.
However, these fact checks do not point to any genuine dissemination or use of the altered image. Indeed, a reverse image search shows the altered image has appeared only in the context of the false fact check. An examination of the metadata of the video files in these supposed fact checks also suggests that the pro-Ukrainian fakes were created by the Russian fact-checkers themselves.
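The kind of metadata examination described above can be approximated with a simple scan for editing-tool signatures left behind in a video file. The marker strings below are illustrative assumptions, not an exhaustive forensic list, and a match is only a clue rather than proof of forgery; still, an allegedly “original” clip stamped with a video editor’s tag merits suspicion. A minimal sketch:

```python
# Hypothetical sketch: scan a video container's raw bytes for signatures
# that common encoding/editing tools embed in file metadata. A hit does
# not prove the clip is fake, only that it passed through that tool.
KNOWN_TOOL_MARKERS = {
    b"Lavf": "ffmpeg/libavformat",           # ffmpeg's muxer writes this tag
    b"Adobe Premiere": "Adobe Premiere Pro", # assumed marker for illustration
    b"CapCut": "CapCut mobile editor",       # assumed marker for illustration
}

def find_tool_markers(path: str) -> list[str]:
    """Return the names of tools whose signatures appear in the file."""
    with open(path, "rb") as f:
        data = f.read()
    return [name for marker, name in KNOWN_TOOL_MARKERS.items() if marker in data]
```

A researcher might run this over the “original” and “fake” clips from a suspect fact check: if both carry the same encoder fingerprint, that supports the conclusion that one source produced both files.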
This novel method is worrisome. It may undermine the ability of Russians to comprehend the actions of their government in Ukraine. Recent polls suggest that roughly 75 percent of Russians who claim to support the war trust official Russian sources, though reliable polling is, of course, difficult to achieve in Russia. More broadly, the method is likely to spread. American propagandists have shown a disturbing willingness to adopt the tactics of their foreign counterparts. Russian propaganda campaigns from 2014 onward about Ukraine’s brutalization of separatists featured miscaptioned photos (genuine photos paired with misidentifying text). Russian propagandists have, for instance, repeatedly used photos taken in Israel and claimed they show civilians in Donbas in eastern Ukraine.
The same technique was widely used in 2018 propaganda demonizing a migrant caravan. For example, Ginni Thomas and other right-wing supporters tweeted genuine images of bleeding Mexican officers, falsely linking the images (which were unrelated to migrants and were taken years earlier) to caravan members.
The Gaza-for-Ukraine swap was actually reversed by Katrina Pierson in an attack on Rep. Ilhan Omar. In 2019, Pierson tweeted a 2015 video of rocket fire in Ukraine or Belarus, paired with the text “650 Rockets being fired into Israel from Gaza,” and asked if Omar would condemn it.
Pierson subsequently claimed that the video was used merely “to underscore what hundreds of rockets would look like,” though fact-checkers pointed out that the “ordered and rhythmic” artillery barrage was unlike those in Israel/Gaza and thus “the video fails as a representation.”
The misinformation ecosystem is replete with borrowed methods and subject matter, and Russian propagandists also take note of domestic conspiracies in a vicious feedback loop. One particularly pertinent example is the “Ukrainian biolabs” claim, which Russian propagandists had seeded for years but which was not part of the initial Russian propaganda push (that push focused on the slur that President Volodymyr Zelensky is a “Nazi”). Instead, the claim appears to have taken on new life in QAnon accounts in late February. It then jumped to right-wing influencers and anti-vaxxers (already primed by trafficking in COVID-era biolab misinformation), then to Fox News in commentaries by Tucker Carlson and Sean Hannity, before being readopted by Russian propagandists and subsequently repeated by China. Now it has been retooled by the right as an attack on Hunter Biden.
It is hardly a stretch to imagine that the new false flag technique will be deployed for the 2022 midterm elections (perhaps targeting audit-hungry individuals or newly created election police). False screenshots of CNN war coverage are our present; false broadcasts of election coverage are likely our future.
One of the best tools for defeating some variants of this method is the reverse image search, which can reveal that the “fact-checked” image was never actually used. For example, a reverse image search of the doctored captured-tank image shows that it never appeared outside of pro-Russian forums and propagandist channels. Alternatively, a reverse image search can reveal the source of the original media: a search of the moving body bag video shows that it is footage of an earlier climate protest.
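The matching step behind a reverse image search can be illustrated with a perceptual hash: the image is reduced to a short bit fingerprint, so near-duplicates (recompressed or lightly edited copies) land within a small Hamming distance of each other, while unrelated images do not. The sketch below is a toy average-hash over a plain grid of brightness values, an assumption made for brevity; real services index billions of fingerprints computed from actual image files.

```python
# Toy "average hash" to illustrate how reverse image search matches
# near-duplicate images. Pixels are plain ints (0-255) in a 2-D list.

def average_hash(pixels: list[list[int]]) -> int:
    """One bit per pixel: 1 if the pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")
```

A lightly edited copy of a photo produces a fingerprint at a Hamming distance of zero or near zero from the original, which is how a search engine can recognize the doctored tank image as a trivial variant of the genuine one.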
However, social media providers must take steps to facilitate reverse image searching by users; research shows that mere disclaimers are likely not enough. Platforms could embed the search function directly in their products so that users can more readily access the tool. Similarly, they could streamline a “report misinformation” tool with fields specific to miscaptioned or misattributed images. Ukraine is, unfortunately, going to serve as a proving ground for the latest innovations in misinformation. We must be prepared for the next wave of lies.