With the 2020 US Presidential Election looming, Facebook’s efforts to remove coordinated influence activity are becoming even more important, and no doubt many will be watching the platform’s monthly Coordinated Inauthentic Behavior reports to spot any signs of potential concern ahead of the November poll.
This week, Facebook has released its August update of malicious accounts that it has detected and removed. And a familiar name has once again cropped up on the list.
As per Facebook:
“Since 2017, we have removed over 100 networks worldwide for engaging in coordinated inauthentic behavior, including ahead of major democratic elections. The first network we took down was linked to the Russian Internet Research Agency (IRA), and so was the 100th we took down in August. In total, our team has found and removed about a dozen deceptive campaigns connected to individuals associated with the IRA.”
The Kremlin-linked Russian Internet Research Agency, as you may recall, was the key player identified in efforts to manipulate US voters ahead of the 2016 election, which may well have impacted the final result. We don’t know, of course, what level of influence its efforts actually had, but the aim of the IRA ahead of the 2016 poll appeared to be focused on sowing division among American voters in the hopes of pushing them towards voting based on those pain points.
Facebook has since implemented a range of measures to better detect and remove such efforts before they can take hold, which should limit similar interference in 2020:
“Over the last three years, we have detected these efforts earlier and earlier in their operation, often stopping them before they were able to build their audience. With each takedown, threat actors lose their infrastructure across many platforms, forcing them to adjust their techniques, and further reducing their ability to reconstitute and gain traction.”
So the question of Russian influence should be less significant this time around. And indeed, based on Facebook's CIB reports over the last six months, there's no definitive upward trend in detections and removals across its platforms.
As you can see here, month-by-month, we haven't seen a significant ramp-up in Facebook's removal actions leading into the election. At one stage it looked like Instagram was getting more focus, but that's tailed off in this latest report, while Groups were the only category to see an increase in August.
The detected IRA cluster, for context, was also relatively small, encompassing 13 Facebook accounts and two Pages in total, though Facebook notes that the network was only in the initial stages of establishing its presence. It was focused on building an audience "on the left of the political spectrum".
Which is good, right? It means that Facebook is seeing fewer manipulation attempts, which ideally suggests that it's getting better at removing them, and better protecting the democratic process in all regions. Or it could mean that those building such campaigns are getting better at hiding them. It's impossible to know for sure, but at this stage, based on these findings, it looks like Facebook's detection efforts are paying off.
Which could ensure that the results of the 2020 Election are not mired in questions around foreign interference.
Of course, there are still questions about the advertising tactics employed by politicians, with Facebook refusing to submit political ads to fact-checking. There are also concerns around manipulative ads. Just this week, the Trump campaign had several videos removed from Twitter because they were edited or manipulated to misrepresent its political opponents.
There are still domestic misinformation concerns to address, but it seems, at least at this stage, that Facebook’s efforts to stop foreign interference are working.
It’s early in the real campaign push, so we’ll have to wait and see. But these appear to be encouraging signs.
Other clusters detected in August included a group of US-based accounts focused on activities in Venezuela, Mexico and Bolivia, and a Pakistan-based group focused on users in Pakistan and India.
You can read Facebook’s latest CIB update here.