Almost two weeks into the war in Ukraine, a large amount of misinformation continues to go viral across all forms of media. In response, we’ve produced a roundup of claims we’ve seen on social media about the war. It’s important to identify emerging narratives so that we can quash them. Some are rehashed versions of narratives we wrote about last week, but others are new and have now been taken up by other conspiracy communities, including COVID skeptics.
“Russia is destroying biolabs.”
There are biological research labs in Ukraine that partner with the U.S. Department of Defense, but they do not conduct weapons research; they do disease monitoring. Russia is not bombing those labs. This narrative has been seeded by Russian propaganda outlets, including the foreign ministry, over the past two weeks. The Russian state media outlet TASS published documents that it claimed proved NATO and the U.S. were developing biological weapons at labs in Ukraine.
As is often the case, narratives have become slightly jumbled here, as biolab conspiracy theories have long been part of COVID-19 misinformation. Fringe figures in politics and media have amplified this narrative, using a clip of Senator Marco Rubio asking Under Secretary of State Victoria Nuland at a U.S. Senate hearing whether Ukraine had biological weapons. Members of conspiracy communities have claimed since the start of the pandemic that COVID was an engineered pathogen, or perhaps even a purposely created biological weapon, and biological labs in Ukraine have now been folded into that narrative.
Screenshot from a COVID conspiracy group on Telegram
This disinformation narrative dates back to at least 2018, when Russia deployed it against the Republic of Georgia.
Read the full fact check here.
“The war is fake.”
“Soldiers are using fake weapons,” “these people aren’t actually dead,” “this woman is actually a crisis actor!” These claims, all variations on the narrative that the war is fake, can be found on almost every platform. With repurposed footage and audio from other conflicts readily available online, narratives like these can become deeply confusing. The BBC has released a useful list of some of the most viral claims, and Logically will continue to debunk such claims as they appear.
Much like the biolabs narrative, this one has been seeded by Russian state media. The New York Times has reported that some Russians living in Ukraine have been met with disbelief upon sending footage of the conflict back to their families in Russia. However, the narrative goes both ways – one of the central tenets of QAnon conspiracy theories is that politicians have been replaced by actors and that many world events are staged, so these communities seem to have generated the narrative organically.
“Putin is battling the New World Order.”
This narrative has some overlap with the “war is fake” narrative. The “Great Reset” or “New World Order” conspiracy theory, well-trodden by U.K. and U.S. COVID conspiracists, alleges that a shadowy group of elites is conducting this war for the benefit of the few, distracting the world from the elites’ own misdeeds. Others have advanced the idea that Putin is actually the enemy of these shadowy elites, and that this is why he launched the attack: to take down the supposed New World Order. In this version of the conspiracy, the Deep State-Satanic-Pedophiles metanarrative guides interpretations of world events. It’s important to note that the New World Order conspiracy relies on antisemitic tropes to make its arguments. We can see this in a post from a sovereign citizen conspiracy group on Telegram with about 10,000 members, which ties together many of the narratives we’ve discussed here:
The discussion of “biolabs” and “human trafficking” in the same sentence indicates that the metaconspiracy still frames these conspiracists’ understanding of world events.
“Russia’s war is not as bad as anything the U.S. or NATO has done.”
As we reported last week, this is one of the disinformation lines that Russia is using to draw attention away from its own war. Russia often employs multiple contradictory narratives, fine-tuning its disinformation for completely different communities.
One such tactic saw TASS posting alleged evidence that NATO was planning an attack on Russia from the Donbas region of Ukraine.
The idea was that, in this version of events, Russia’s invasion was a justified preemptive strike. However, there was no evidence of any planned NATO attack before Russia invaded Ukraine, nor had there been any attacks on Russia from Donbas. The evidence itself did not seem credible, as it was derived from an old laptop with NATO stickers. But as a disinformation tactic, the point is not always to be completely convincing; it is to generate so much chaos that no objective truth seems to exist.
Another version of this narrative has a “whataboutist” flavor: Russia’s war might be bad, but actually, other governments are worse. Vice’s David Gilbert reported last week that one viral map advancing this narrative was created by a Russian propaganda outlet based in Berlin. Notably, the map does not count Crimea as part of Ukraine, and it is inaccurate, greatly downplaying Russia’s missile strikes in Ukraine.
Read the full fact check here.
“Ukraine is full of Nazis.”
Some of our fact checks concern images that appear innocuous but actually contain covert far-right or neo-Nazi symbols. Sharing such images carries a risk, both because it can bolster Russian propaganda claims that Ukraine is full of Nazis, and because far-right groups often use this as a trolling tactic to increase their reach and visibility, aiding their recruiting efforts. An apparently innocuous image is shared, such as a soldier with a cat, except that the soldier’s uniform bears a symbol associated with Nazi or far-right movements. People familiar with the symbol react with what looks like disproportionate fear or disgust, which only increases engagement on the original post and makes uninformed onlookers curious. As a tactic, it discredits those with knowledge of extremism, gives the far-right fringe an in-joke to laugh about, and potentially opens the door to recruitment for the more curious.