Report Claims Russian Disinformation Campaigns Specifically Target Right-leaning Social Media Sites
Russian disinformation operators are still quite active on social media sites, according to recent research conducted by the Stanford Internet Observatory and the social media analytics firm Graphika.
Their report, released this month, claims Russian disinformation operatives are heavily targeting right-leaning social sites because those sites have limited content moderation. Titled Bad Reputation: Suspected Russian Actors Leverage Alternative Tech Platforms in Continued Effort to Covertly Influence Right-Wing U.S. Audiences, the paper explores how Russian social media teams have been able to exploit a lack of enforcement on alternative social media platforms to promote politically divisive disinformation.
Authors of the Bad Reputation paper state that, “based on the technical, behavioral, and content indicators” (which the paper outlines and explains), they believe the current disinformation activity is associated with the same Russia-based group that was heavily active in 2020.
The disinformation posts reportedly are generated by a news outlet called the Newsroom for American and European Based Citizens (NAEBC). Researchers found at least 36 relatively new accounts created by that group, plus multiple older accounts that have been operating since 2020. NAEBC has previously been linked to Russia’s Internet Research Agency, which has been accused multiple times of supporting online propaganda efforts.
Operatives at the NAEBC group reportedly posted messages that could be seen by right-leaning viewers on social media sites including Gab, Gettr, Parler and Truth Social. Some of these sites boast that they have minimal content moderation. But the downside of limited moderation is that carefully crafted disinformation and divisive content can easily appear on the sites. The Bad Reputation paper said participants on those social media sites read the false news posts, commented on them and often passed them along to others. Such actions can sometimes help false claims gain an air of legitimacy.
When such posts are successful, the false news claims may be read and shared by dozens or even hundreds of other social media accounts. Often they are echoed by large social media bot networks, which Russia also has been accused of operating.
In many cases, purveyors of disinformation can achieve a larger “win” if their posts end up being quoted by legitimate (or legitimate-looking) news organizations. In that case the Russian disinformation efforts have successfully gained traction and become more widely accepted and distributed, even if the claims made in a post lack factual support.
Lack of content moderation can be an attractive marketing claim for some sites. But what starts as a bragging point can quickly become something easily exploited by bad actors. On many social media sites, some level of content moderation and filtering ends up creeping back in. If it doesn’t, the site can be further exploited by bad actors. And, as the report from Graphika and the Stanford Internet Observatory shows, participants on social media sites don’t always know when they are interacting with false content. That’s exactly what purveyors of disinformation hope for, and they will exploit a lack of editorial moderation wherever they find it.