YouTube Bans Over 2,500 Propaganda Accounts

The accounts in question are related to wider coordinated influence operations investigated by Graphika

YouTube has banned over 2,500 accounts related to “coordinated influence operations” in the second quarter of this year alone (up from 277 in the first quarter).

In a Threat Analysis Group (TAG) bulletin for Q2, Google detailed that, over the course of three months, it terminated 21 accounts (including one AdSense account and one advertising account) tied to two separate investigations into coordinated influence operations in Iran, and 90 accounts related to several ongoing investigations into Russian operations. Most of the bans – 2,596 to be exact – were associated with coordinated influence operations pushing Chinese propaganda.

A number of the investigations were supported by Graphika, which published a report specifically on the Chinese accounts, titled Return of the (Spamouflage) Dragon: Pro-Chinese Spam Network Tries Again. The report claims that these accounts are part of a resurgence of an earlier propaganda campaign that was taken down at the end of September 2019.

“Some assets evaded enforcement in September 2019, either because the platforms did not detect them or perhaps because they found that the assets did not violate specific policies,” states the report. “Other assets were newly created in early 2020, and still others appear to have been obtained and repurposed for pro-Chinese messaging needs”.

They also noted that when the accounts were removed in 2019, they largely focused on “praising the Chinese authorities and attacking the Hong Kong protesters, as well as attacking exiled Chinese billionaire and regime critic Guo Wengui (郭文贵, also known as “Miles Kwok”)”. However, as international pressure grew over the Chinese government’s handling of Covid-19, “the spamouflage dragon” came out of hiding, activating dormant accounts and creating new ones with a new focus: praising China’s response to the pandemic.

The accounts are spread across multiple platforms – YouTube, Facebook and Twitter – and all feature similar content (mostly spam of scenery, models, basketball and TikTok videos, with propaganda interspersed throughout), which points to coordination. However, as not all of the accounts post about the same issues, there does appear to be a degree of autonomy, so it is not yet clear whether the operation is run by one large organization or is the result of “a cooperative endeavor” between smaller groups.

Don’t believe everything you see on the internet, kids!