How Russia’s Troll Farm Is Changing Tactics Before the Fall Election

Ahead of November’s election, American intelligence officials and others are on high alert for mischief from Russia’s Internet Research Agency.

Remember it?

The Kremlin-backed group was identified by American authorities as having interfered in the 2016 election. At the time, Russians working for the group stole the identities of American citizens and spread incendiary messages on Facebook and other social media platforms, stoking discord over race, religion and other issues in an effort to influence voters.

To avoid detection, the group has since evolved its tactics. Here are five ways its methods have shifted.

When Congress released examples of Facebook ads that the Russian troll farm had bought several years ago, many of the ads contained misspellings and grammatical errors. Some captions in the ads omitted or misused “a” or “the” because Russian does not use articles.

Now Russian operators are trying to avoid detection by copying and pasting chunks of text from other sources directly into their posts. When Facebook took down 50 accounts linked to the Internet Research Agency in October, many of the posts featured text copied from Wikipedia, as well as from The Atlantic and other outlets, said Ben Nimmo, a researcher at Graphika who investigates disinformation.
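
For a sense of how investigators can flag that kind of copy-and-paste reuse, here is a minimal sketch that measures how many of a post’s overlapping word sequences also appear in a suspected source. The sample texts, shingle length and threshold are illustrative assumptions, not any platform’s actual method.

```python
# A minimal sketch, not any platform's actual method: flag a post
# whose overlapping word sequences ("shingles") also appear in a
# suspected source text. Sample texts and thresholds are invented.

def shingles(text: str, n: int = 8) -> set:
    """Return the set of n-word sequences in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def copied_fraction(post: str, source: str, n: int = 8) -> float:
    """Fraction of the post's shingles that also appear in the source."""
    post_shingles = shingles(post, n)
    if not post_shingles:
        return 0.0
    return len(post_shingles & shingles(source, n)) / len(post_shingles)

if __name__ == "__main__":
    source = ("The Atlantic is an American magazine and multi-platform "
              "publisher, founded in 1857 in Boston.")
    post = ("Wake up, America! The Atlantic is an American magazine and "
            "multi-platform publisher, founded in 1857 in Boston. Share!")
    if copied_fraction(post, source) > 0.3:  # illustrative threshold
        print("post appears to be copied from the source text")
```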


[Image: a “Before” example of the group’s posts. Credit: Josh Russell]

Computer programs are getting better at processing vast amounts of text — a task called natural language processing — which means they are better at ferreting out telltale signs of social media manipulation, such as semantic errors and common hashtags.
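
As a rough illustration of the hashtag signal, the sketch below counts how many accounts in a set use the same hashtags. The accounts and posts are invented for the example; real detection systems combine many such signals.

```python
# A made-up illustration of one weak coordination signal: the same
# hashtags recurring across different accounts. Accounts and posts
# here are invented; real systems combine many such signals.
import re
from collections import Counter

posts_by_account = {
    "account_a": ["Stand up! #wakeup #truth", "They lie to you #truth"],
    "account_b": ["Share this #truth #wakeup", "#truth every single day"],
    "account_c": ["Morning run today", "Nice weather out #sunny"],
}

tag_counts = Counter()
for account, posts in posts_by_account.items():
    # Count each hashtag at most once per account.
    tags = {tag.lower() for post in posts for tag in re.findall(r"#\w+", post)}
    tag_counts.update(tags)

for tag, n_accounts in tag_counts.most_common():
    if n_accounts >= 2:  # illustrative cutoff
        print(f"{tag} used by {n_accounts} of {len(posts_by_account)} accounts")
```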

As a result, the troll farm is now using less text and fewer hashtags in its posts. In October, when Facebook removed the accounts with ties to the Russian group, researchers pointed out that the group’s posts carried minimal text, often just block letters overlaid on images.

Instead of writing its own text, the troll farm now also posts screenshots of tweets created by real Americans. Computer programs typically do not scan images for text.
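
That blind spot is not absolute, though: text embedded in an image can, in principle, be recovered with optical character recognition and then screened like any other post. Below is a minimal sketch, assuming the open-source Tesseract engine and the pytesseract and Pillow packages are installed; the file name is a placeholder.

```python
# A minimal sketch, assuming the open-source Tesseract engine plus the
# pytesseract and Pillow packages are installed. "screenshot.png" is a
# placeholder file name.
from PIL import Image
import pytesseract

# Extract any text embedded in the image, e.g. a screenshotted tweet.
text = pytesseract.image_to_string(Image.open("screenshot.png"))

# The recovered text can then go through the same text-based checks
# applied to ordinary posts.
print(text)
```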


From 2014 to 2017, the troll farm ran Facebook accounts with overt pro-American, pro-black and pro-Southern culture themes. The names of the accounts mimicked brands.

Their reach was vast. One Facebook page the group operated, Blacktivist, which focused on black activism, collected over 360,000 followers by September 2017, more than the verified Black Lives Matter Facebook account, which at the time had just over 301,000.

Now, themed accounts with politically divisive content and lots of followers are considered suspicious. So the Russians appear to be working harder at hiding, using accounts that have fewer followers.

When Facebook took down some Instagram accounts linked to the Russian troll farm last year, more than half had fewer than 5,000 followers. One account that was removed, @progressive.voice, had just over 2,000 followers. The account with the most had about 20,000.


One common trait of the troll farm’s past posts was that the images were stamped with watermarks — a logo, text or pattern superimposed onto another image — which the group used to build followers for its Facebook pages.

More recently, the group has used the same images but removed or blurred out the logo, and sometimes it changed the captions by using different typefaces. That helped disguise the group’s involvement in the posts, said Samantha Bradshaw, a researcher at the Oxford Internet Institute.

“Now that many of the known Russian pages have been identified, using watermarks is a double-edged sword, since it can also help content moderators track and shut down larger networks of disinformation,” she said.
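
Her point about tracking can be made concrete: given a copy of a known watermark logo, template matching can score how strongly it appears in any image, and removing or blurring the logo drives that score down. The sketch below uses OpenCV; the file names and the threshold are illustrative assumptions.

```python
# A minimal sketch using OpenCV template matching. File names and the
# 0.8 threshold are illustrative assumptions.
import cv2

image = cv2.imread("post_image.png", cv2.IMREAD_GRAYSCALE)
logo = cv2.imread("known_watermark.png", cv2.IMREAD_GRAYSCALE)

# Slide the known logo across the image and score each position;
# scores near 1.0 mean a close match.
scores = cv2.matchTemplate(image, logo, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_loc = cv2.minMaxLoc(scores)

if best_score > 0.8:
    print(f"watermark found near {best_loc} (score {best_score:.2f})")
else:
    # A removed or blurred logo drags the score down, which is exactly
    # why dropping the watermark makes the network harder to trace.
    print("no confident watermark match")
```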


The troll group previously created accounts directly on Facebook to influence Americans. Now it appears to be hiring local people to open social media accounts, a practice known as “franchising” that adds a layer of camouflage.

The method came to light last year, when Facebook removed a disinformation operation, linked to people affiliated with the troll farm, that sought to sway opinion in African countries. In that campaign, the Russians appeared to hire individuals or local media organizations to post propaganda and false content on the social network on the troll farm’s behalf. In March, Facebook revealed another campaign that appeared to use the same franchising method.

Alex Stamos, a researcher at the Stanford Internet Observatory, said these campaigns had implications for the 2020 presidential election, and that the Russians would probably work with Americans to get them to post politically inflammatory content on Facebook.
