
Fake Twitter Account Drums up Neo-Nazi Support


Over the weekend, a group of white supremacists wearing masks rented a fleet of U-Hauls and trucked themselves into D.C. to stage a rally. Footage of the rally shared on Twitter quickly went viral. The anti-fascist and disinformation research communities started sharing images, videos, and takes on the fascist march. Although many of the accounts sharing the footage belonged to journalists, one account shared some original footage that was retweeted by thousands. The account was only a month old and, up until then, had merely retweeted others' posts.

Tweet from @SherylLewellen: Happening now: About 500 men with riot shields are marching in Washington DC

The next day, the profile changed its photo, banner, and name to pro-Patriot Front slogans. Suddenly, everyone who had retweeted the footage, or quote tweeted it with their take, looked like they were endorsing Nazis. 

Tweet from @SherylLewellen with Reclaim America as the account name: Happening now: About 500 men with riot shields are marching in Washington DC

“This account was created with a fake name and an AI-generated profile in order to fool journalists into promoting Patriot Front’s rally in Washington DC,” read the first tweet after the change. A thread of Patriot Front recruitment videos was added to the original tweet with the footage.

Tweet from @SherylLewellen: This account was created with a fake name and AI generated profile in order to fool journalists into promoting Patriot Front's demonstration in Washington DC.

Twitter suspended the account soon after, but the damage was already done. A white supremacist account had successfully laundered its footage into mainstream awareness. As QAnon Anonymous host Travis View observed, “It seems as if someone associated with Patriot Front very cleverly tricked well-meaning people into amplifying their propaganda on Twitter.”

Patriot Front used to be called Vanguard America – it had to change its name after white supremacists killed Heather Heyer at the Unite the Right march in Charlottesville. ProPublica estimated its membership in the low hundreds back in 2019. Although its Telegram channel has just under 1.5k subscribers, the low attendance at the march suggests that the 2019 estimate isn’t far off. Travis View noted that the fake account inflated the number of people at the march, perhaps to make it look stronger.

“Extremist fake profiles and sock puppet accounts in online groups and message boards pushing their rhetoric is nothing new,” said Joe Ondrak, Logically’s Head of Investigations. “This is the first time I've seen an extremist group do it in this way – that is, to leverage the nascent Twitter disinfo-research community's need to be abreast of emergent events as a way of promoting said event and group.”

Ondrak notes that fascism and spectacle go hand in hand. “White supremacist groups like this always lean into a level of pageantry to get the press looking.” Marches like the infamous Charlottesville tiki-torch parade are, of course, horrifying, but the point is to horrify the public enough to get free media coverage and thus free advertising. “Extremist groups have become extremely attuned and aware of their place within the attention and media economy,” he added. 


Digital media folklorist Whitney Phillips wrote about how this dynamic played out in the 2016 election in her report titled “Oxygen of Amplification: Better Practices for Reporting on Extremists, Antagonists, and Manipulators Online.” Extremist groups need to “launder” their propaganda in mainstream outlets to make their views seem more normal — and to maximize recruitment opportunities with access to a larger mainstream audience. Journalists who want to cover juicy subjects and get their reports out first, or journalists who want to show “both sides” of extremist views, accidentally give attention to a group that depends on more attention for existence. 

“Just by showing up for work and doing their jobs as assigned, journalists covering the far-right fringe – which subsumed everything from professional conspiracy theorists to pro-Trump social media shitposters to actual Nazis – played directly into these groups’ public relations interests,” Phillips wrote. 

Plenty in the disinformation research community noted the fake account and avoided boosting it. A data scientist who goes by Conspirador Norteño on Twitter noticed the AI profile picture and tweeted about it long before the account made the switch. 

“The GAN pics have been a focus of mine for the last couple of years, and so I tend to recognize them pretty quickly when they pop up in my feed,” he told me. “The specific signals on this one were the facial feature placement and the surreal blob of nonsense in the left portion of the image.”

According to his research, if you take the “average” of dozens of images of real faces, you’ll end up with an incomprehensible blur. But if you take the “average” of dozens of AI-generated faces, you’ll end up with something legible: AI-generated images always place the main facial features in the same spots, so eyes, nose, and mouth have similar enough positions across images that a blurred composite remains recognizably a face.
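The alignment effect Norteño describes can be illustrated with a toy experiment (a sketch of the principle only, not his actual detection pipeline): simulate each "face" as a noisy image containing one bright feature, fixed in place for the GAN-style set and randomly placed for the real-photo set, then compare the averaged composites.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_face(size=64, feat_y=None, feat_x=None):
    """Toy 'face': an 8x8 bright patch (a stand-in for an eye) on a
    noisy background. GAN faces put features at fixed coordinates;
    real photos place them wherever the camera happened to catch them."""
    img = rng.normal(0.5, 0.1, (size, size))
    y = feat_y if feat_y is not None else rng.integers(8, size - 16)
    x = feat_x if feat_x is not None else rng.integers(8, size - 16)
    img[y:y + 8, x:x + 8] = 1.0
    return img

# "GAN-aligned" set: the feature sits at the same spot in every image.
aligned = np.mean([synthetic_face(feat_y=20, feat_x=20) for _ in range(50)], axis=0)

# "Real-photo" set: the feature position varies image to image.
unaligned = np.mean([synthetic_face() for _ in range(50)], axis=0)

# Averaging preserves the aligned feature (high contrast against the
# background) but smears the unaligned one into a blur (low contrast).
aligned_contrast = aligned.max() - aligned.mean()
unaligned_contrast = unaligned.max() - unaligned.mean()
print(f"aligned: {aligned_contrast:.3f}, unaligned: {unaligned_contrast:.3f}")
```

In the aligned composite the bright patch survives the averaging at full contrast, while in the unaligned composite it dissolves into near-uniform noise, which is the same reason a composite of GAN faces stays recognizably a face while a composite of real photos does not.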

The other detail, as Norteño noted, was that the footage was tweeted from the Twitter web app rather than a phone. That suggests the person likely wasn’t standing at the protest taking live footage; they were instead posting from a desktop.

But what was unique here for him was that the account outed itself as an op so fast. “I've seen accounts that out themselves and abruptly switch identities like that before, but it usually happens via a steady effort to build trust over a longer timescale rather than less than 24 hours after the account first got noticed.”

Norteño cautions his readers. “A journalist with absolutely zero links on their profile or in their tweets to their work is weird, for example, as is the lack of other real tweets (non-retweets) prior to the video. In general, people – especially those with large numbers of followers – slowing down and checking multiple sources before amplifying dramatic content would help combat this sort of thing.”

Image credit: Bryan Olin Dozier via Reuters Connect
