Part 3 – Disinformation Creators are Available for Hire, and They’re Fairly Cheap

In today’s global environment of disinformation, there are multiple organizations for hire that are willing to act as partners, conduits, and promoters of false information. Some do this work out of specific political allegiances, some are consultants for hire, and others participate with no agenda at all, creating and hosting controversial news simply to boost page views and advertising revenue.

In our previous article, titled How False Information Gets Promoted to Quasi-legitimate News, we mentioned the rise of so-called pink-slime journalism, which originally consisted of hired content creators writing dozens of online articles to support specific orientations or political slants. These pink-slime “news” sites still exist, but creators of misinformation also fall into other categories.

This article is part of our In Depth Series – Disinformation For Hire 

Key Players – Four Main Camps

  • State-run operations. Examples include –
    • The social media giant Meta Platforms removed hundreds of accounts from Facebook and Instagram that were connected to the Royal Malaysia Police because they were being used to manipulate public discourse and spread poorly documented claims about the Malaysian police and the government.
    • The King’s Brigade in Saudi Arabia, which tends to be supportive of the Saudi royal family.
    • The government of Alexander Lukashenko in Belarus, which has been observed spreading disinformation to support its rule.
  • Commercial/freelance contractors. Examples include –
    • The marketing company Rally Forge, which was banned from Facebook over concerns that its social media work was pushing a false narrative on behalf of Turning Point USA, a pro-Trump youth group.
    • Canada-based Estraterra, which advertises “strategic political communications,” was similarly banned for campaigns targeting news and social media in several South American countries.
  • Third party entities that are sponsored or subsidized by governments. Examples include –
    • CyberBerkut, a pro-Russian hacktivist group with ties to a disbanded police force.
    • The Internet Research Agency, which runs large-scale online influence operations that favor Russian political interests and some business interests.
  • Major media outlets in some countries are, themselves, key propaganda conduits. Examples –
    • Rossiya in Russia covers 98.5% of the country’s territory and is state-owned.
    • North Korea has a dozen large newspapers and even more magazines. All are published in Pyongyang, and all are subject to a high level of government control and/or censorship.

Today, there are freelance media sites that advertise services where trusted and approved Wikipedia page editors can be hired to skillfully write custom pages that can stand up to the scrutiny of the encyclopedia’s editors but still mask an information bias and slanted messaging.

Bot developers of all types (not specifically for social media posts) can be hired through job sites such as Upwork, Arc, and Guru.com. Likewise, basic Web developers can be hired to spin up sites that look like well-established news sites for under $50 apiece. In some cases, the developers simply build empty shell platforms that can be used for a range of marketing purposes. They sell the platforms to anyone who wants a news site, and the creators may not know their work is being used for illicit social media manipulation. There also are commercially available social media message bots, originally created so marketers could interact with customers, but these same bots can be tweaked to deliver political messages.

Figures 2 and 3 show two examples of click farms based in Asia.

A group called Insta clicks most likely operated out of Kazakhstan or Armenia, creating fake profiles and engagement on Instagram.

Many click farms are established to sell traffic to advertisements or to provide upvotes on e-commerce sites. These businesses found a new market in driving traffic to fake news sites and in upvoting or sharing political disinformation on social media. Services may start with manual clicks, but the clicks can be automated and ramped up to the tens of thousands for large-scale purchases.

Figure 2

Police inspect an alleged click farm reported in Malaysia. (Police photo.)

Figure 3

A still frame from a video that claims to show a click farm in China. (Ownership not claimed. Released broadly on social media.)

 

Examples of Disinformation as a Service

Because their activity may be illegal, individuals who generate false information or host false news sites (or botnets) often work under pseudonyms, and they may change names frequently. Business is generated by word of mouth for select customers. We were able to find a few examples.

In early 2023, a network of bot farms with over 30,000 systems was discovered in Ternopil, Ukraine. The accounts were used to share pro-Russian narratives and disinformation related to Ukrainian politics. It was one of multiple similar operations discovered in the past three years.

In 2020, Peng Kuan Chin of Bravo-Idea (an Asian communications company) gave media interviews boasting about the thousands of AI-generated news sites he operates. His tools are able to create news stories from snippets of other stories. The stories can be rearranged and regenerated for use on other sites, and search engines may not realize it is the same content. Today, advanced AI solutions such as ChatGPT can be used the same way, though ChatGPT was not specifically created for that purpose.

Israel’s Archimedes Group promised it could “change reality according to our client’s wishes.” The company eventually was banned from Facebook for creating hundreds of fake social media accounts that were used to influence elections in multiple countries.

Facebook also banned Philippines-based Twinmark Media Enterprises, and its associated entities, for “coordinated inauthentic behavior.” Twinmark was accused of using Google Ads to drive traffic to specific news pages.

In the past two years we have found nine places where bot traffic or pop-up news sites could be purchased and launched within 24 hours. But we will not name them, for two reasons:

  • We don’t want to publicize and drive potential customers to such sites.
  • Several of the sites disappeared within a matter of weeks. If we listed them here, the links could be dead quickly. Such sites thrive by being nimble: they make some quick sales, then move on to other locations and new business names before they can be held accountable.

Outside of fake news, there are other services for hire that specialize in the manipulation of public perception. For example, there’s a company in Los Angeles called “Crowds on Demand” that has been in operation since 2012. They will send sign-carrying admirers or protesters to any event you want.

A Fake News Hub

Sometimes the false information is just about money, not political gain. In 2017, the town of Veles, Macedonia was mentioned by Barack Obama and featured in Wired magazine for hosting hundreds of fake news Web sites. These were set up specifically to serve “news” pages to people who followed the links their operators posted on social media. The systems hosted within the town did not start out pushing any specific political agenda.

But in the process of posting articles stolen from English-language news sites, the operators found that controversial articles from extremist sites generated more click-throughs and engagement. So they started focusing on those, posting articles from multiple parts of the political spectrum. They only wanted the Web traffic so they could serve ads.

Accountability is Tough When You Can’t Follow the Money

In the U.S., news sites that are “owned or controlled by a political party, committee or candidate” and are acting as a media outlet must comply with all federal election laws and regulations. But some pop-up news sites have learned to sidestep this. The challenge today is that money can flow through overseas entities, and payments to propaganda generators can be handled through cryptocurrency and various third parties. This has muddied the waters, making it easier for political interests to boost partisan news.

The result is an erosion of trust in government, driven in part by social media influencers. Just as clothing or electronics brands can rise and fall on the work of hired online influencers, political interest groups have learned the same ploys. Government agencies often see this happening, but they lack the marketing savvy to neutralize these challenges.

“Propaganda engineering” managers also use data analytics. Being able to measure reactions to propaganda campaigns allows them to further manipulate their messaging.

The issue is important enough that the U.S. Cybersecurity & Infrastructure Security Agency has developed a task force to study the issue of fake news and propaganda, with a particular eye on foreign influence.

Next: Part 4 – Key Mitigation Step – Using AI to Recognize and Flag Disinformation