Successful distribution of modern propaganda is a multi-step process
Purveyors of disinformation and propaganda have many avenues for getting a false message in front of a mass audience. The process outlined here is a common one, but by no means the only one.
This article is part of our In Depth Series – Disinformation For Hire
In this scenario, the aim is to plant a false claim somewhere online, then encourage others to quote the claim as if it were a legitimate fact. Often this starts with hastily constructed websites that mimic legitimate news sites. Once the false claim is in place, there is usually an effort to publicize it. By augmenting that effort with bot networks that repeat the claim and drive traffic to the so-called “news” page, propagandists can start making fake news look reliable.
The flow chart in Figure 1 shows how propaganda networks first gain a toehold in public consciousness by setting up many dubious news sites and flooding them with “news” stories that are actually false claims, then work to get other news sites to recognize and quote these factually deficient stories. The graphic reads from left to right, from Step 1 to Step 4.
Step 1 – Humans or AI tools generate news articles that carry a specific political slant or an outright false claim. The writing quality may be poor because the content is produced quickly; the goal is to push propaganda or opinion, not to win awards. Pop-up fake news sites are often called “pink slime journalism,” named after the dubious meat by-product sometimes added to ground beef.
Figure 1
Step 2 – Fake social media accounts and bots are created by the thousands and automated to generate upvotes, retweets and shares, making stories look more important (and more widely linked) than they really are.
Step 3 – This step can be tricky. Organizations that generate false news work hard to get quoted by other news sites or politicians, trying to insert the propaganda narrative into other media conversations, statistics, news articles and political speeches. Each additional mention by others adds to the perception of legitimacy.
Step 4 – The desired end result: people start believing and then quoting the propaganda, passing it along as legitimate information. Step 4 is challenging for propaganda manufacturers to achieve, but the ones with substantial money and resources can make thousands of attempts, anticipating that a few of their efforts will make the jump to “accepted” information. The sites set up to auto-generate false news may remain automated, or, if they start to build an audience, they can be handed to a human operator to improve the appearance of long-term legitimacy.
The end game is to use automation to overwhelm social media sites and search engines so that a false news story appears as common as legitimate news, then to amplify that false message to the point where consumers accept it as a legitimate alternative voice.
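To see why sheer volume works, consider a toy engagement-weighted ranking function. This is a minimal sketch: the scoring formula and the numbers are illustrative assumptions, not any platform’s actual algorithm. It only shows how bot-inflated likes and shares can lift a fabricated story above legitimate coverage in a feed that rewards engagement.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    likes: int
    shares: int
    age_hours: float

def rank_score(story: Story) -> float:
    # Toy score: shares weigh more than likes, and older stories decay.
    # Real ranking systems are far more complex; this only captures the
    # incentive that raw engagement pushes a story up the feed.
    return (story.likes + 3 * story.shares) / (1 + story.age_hours)

stories = [
    Story("Legitimate report", likes=400, shares=120, age_hours=6),
    Story("Fabricated claim (bot-boosted)", likes=9000, shares=2500, age_hours=6),
]

for story in sorted(stories, key=rank_score, reverse=True):
    print(f"{rank_score(story):8.1f}  {story.title}")
```

With the inflated counts, the fabricated claim outranks the legitimate report even though far fewer real people engaged with it.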
Propagandists also have another catalyst that helps drive home their message: confirmation bias. If a claim seems to verify suspicions or prejudices readers already hold, those readers are more likely to believe it.
Besides the propaganda articles themselves, the majority of automated manipulation comes in the form of bots designed to boost the trajectory of online news conversations. These bots, which can be run by just a few people, are programmed to do things like inflate the number of likes, shares and supportive comments for the disinformation. Other bots are designed to downvote or criticize responses that dispute the disinformation, or to post comments agreeing with it. Such efforts help game the system and dilute the opposition’s containment efforts by making it look like thousands of people support an idea even if virtually no one does.
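A minimal simulation makes that asymmetry concrete. Every number below is a hypothetical assumption chosen for illustration; the point is only that a few operators scripting fake accounts can out-produce a genuine audience many times over.

```python
import random

random.seed(42)

# Hypothetical scale: a few operators script thousands of sock-puppet
# accounts, while the claim has almost no genuine supporters.
OPERATORS = 3
FAKE_ACCOUNTS_PER_OPERATOR = 1000   # scripted accounts per operator
ACTIONS_PER_FAKE_ACCOUNT = 5        # daily likes/shares/comments each
GENUINE_SUPPORTERS = 40             # real users who actually agree
ACTIONS_PER_GENUINE_USER = 1

def daily_actions(accounts: int, per_account: int, jitter: float = 0.3) -> int:
    """Sum one day's interactions for a group of accounts, with random
    variation so the activity doesn't look perfectly uniform."""
    return sum(
        max(0, round(per_account * random.uniform(1 - jitter, 1 + jitter)))
        for _ in range(accounts)
    )

fake_accounts = OPERATORS * FAKE_ACCOUNTS_PER_OPERATOR
bot_total = daily_actions(fake_accounts, ACTIONS_PER_FAKE_ACCOUNT)
real_total = daily_actions(GENUINE_SUPPORTERS, ACTIONS_PER_GENUINE_USER)

print(f"Accounts controlled by {OPERATORS} operators: {fake_accounts:,}")
print(f"Bot-driven interactions per day:  {bot_total:,}")
print(f"Genuine interactions per day:     {real_total:,}")
print(f"Apparent support inflated roughly {bot_total / max(real_total, 1):.0f}x")
```

Three operators controlling 3,000 scripted accounts generate on the order of 15,000 interactions a day, while the 40 genuine supporters produce a few dozen. To a platform’s counters, and to casual readers, the manufactured groundswell is indistinguishable from mass support.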
It’s having an impact.
- The Institute for Public Relations, in its 2019 Disinformation in Society Report, found that approximately 64% of Americans believe disinformation and misinformation are a major problem in the US.
- In September 2020, the State of Connecticut, through Secretary of the State Denise Merrill, hired a consultant to thwart social media disinformation campaigns aimed at the state. Other city and state governments have done the same.
In short, misinformation thrives when trust in conventional authorities erodes. The misinformation campaigns themselves help to undermine that trust.
Next: Part 3 – Disinformation Creators are Available for Hire, and They’re Fairly Cheap