China’s Fake Twitter Accounts Are Tweeting Into the Void

On Dec. 2, Twitter announced the removal of two Chinese state-linked influence operations: 2,048 accounts that boosted Chinese Communist Party (CCP) narratives about the Xinjiang region and Uyghur population there, and 112 accounts attributed to Changyu Culture, a private company acting on behalf of the Xinjiang regional government.

Our team at the Stanford Internet Observatory analyzed these two networks. We found that both amplified pro-CCP narratives about the treatment of Uyghurs in Xinjiang, often posting content from Chinese state media or sharing first-person testimonial videos in which Uyghurs describe how good their lives are in the region.

As with past Twitter takedowns of pro-CCP networks, accounts in the first network were thinly veiled: Rather than presenting the account holders as plausible real people, they often featured default or stock profile images, only occasionally contained a bio, and showed little history of posting content that predated the topic of the operation.

They also had few or no followers and received minimal or no engagement. The first dataset included 31,269 tweets, over 97 percent of which had zero engagements (the sum of likes, replies, retweets, and quote tweets). Many other pro-CCP campaigns, including from 2019 and 2020, were similarly lacking: In the 2020 takedown, the average engagement per tweet was just 0.81 (less than one like, comment, or share) and the average engagement per account was 23.07.
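The engagement arithmetic behind these figures is straightforward. As a minimal sketch (the record layout and field names here are hypothetical, not Twitter's actual dataset schema), the per-tweet engagement sum and the share of zero-engagement tweets can be computed like this:

```python
# Sketch of the engagement metrics described above: engagement is the sum
# of likes, replies, retweets, and quote tweets. Field names are assumptions.
tweets = [
    {"likes": 0, "replies": 0, "retweets": 0, "quote_tweets": 0},
    {"likes": 1, "replies": 0, "retweets": 0, "quote_tweets": 0},
    {"likes": 0, "replies": 2, "retweets": 1, "quote_tweets": 0},
]

def engagement(tweet):
    """Total engagement for one tweet: likes + replies + retweets + quote tweets."""
    return (tweet["likes"] + tweet["replies"]
            + tweet["retweets"] + tweet["quote_tweets"])

# Share of tweets with zero engagement (over 97 percent in the first dataset).
zero_share = sum(1 for t in tweets if engagement(t) == 0) / len(tweets)

# Average engagement per tweet (0.81 in the 2020 takedown dataset).
avg_per_tweet = sum(engagement(t) for t in tweets) / len(tweets)
```

On a real takedown dataset, the same two lines of aggregation would reproduce the zero-engagement share and per-tweet average cited above.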

Indeed, one of the most notable things about these networks—and pro-CCP operations on Western social media writ large—is how tactically repetitive and persistent they are, and yet how little engagement they receive. Even in the few weeks after Twitter removed the specific accounts we examined, we observed hundreds of accounts with similar profiles and posting patterns. Other researchers have noted similar patterns in thousands more accounts distinct from the networks we analyzed.

Why replicate the same strategies, year after year, when they don’t appear to be achieving any real degree of virality? What explains why pro-CCP operations are so frequent yet have such low engagement?

Here are three possible explanations:

1. Social platform takedowns may be limiting pro-CCP campaigns’ growth

Each time an operation is removed, operators must start over and build up new followings. If propagandists expect their accounts to be removed quickly, they may decide persona development is not worth their time and that flooding the zone (or a particular hashtag) with a high volume of contributions, attempting simple distraction, is a better strategy. The lack of followers or engagement, then, could be a sign that platform defenses are effective.

However, a more pessimistic reading is possible. One of the assumed logics of publicly announcing the removal of influence operations from platforms—making almost monthly announcements that media pick up on—is that these announcements have a deterrent effect. Members of Facebook’s security team, for instance, have written that “there is reputational cost in being publicly labeled and taken down for foreign or domestic [influence operations].” The frequent nature of pro-CCP influence campaigns—despite their low engagement—could be seen as evidence that takedowns are in fact failing to deter pro-CCP actors. They simply keep re-spawning.

As it stands, we don’t have a lot of research to show that political actors do, in fact, face reputational damage sufficient to change their behavior. The fact that the same operators—such as Russia’s Internet Research Agency (IRA) and CCP operators—continue to wage campaigns despite whack-a-mole attempts to take them down may suggest that the deterrent effect is not working, or that the perpetrator believes the cost is worth the benefit.

The fact that pro-CCP networks continue to reemerge after they are repeatedly removed could be a sign that platform takedowns alone are insufficient to crowd out all influence campaigns, or that some level of influence campaigns will inevitably persist on a platform where account creation is relatively effortless.

2. CCP metrics and organizational behavior may incentivize low-engagement operations

Organizations need indicators for how to evaluate employees and their own success. Research in the international security field has shown that when departments or agencies in the same government use different indicators to assess their results, they can come to vastly different conclusions about whether they are succeeding or failing. These indicators also set incentives: If employees know the criteria on which they’re evaluated, they may shape their behavior accordingly.

Measuring the impact of influence operations is notoriously challenging. It’s often difficult to trace people’s real-world attitudinal and behavioral changes from viewing social media content and to distinguish the effect of influence operations from people’s preexisting beliefs and other factors. It could be that influence operation employees are evaluated on the number of posts or accounts they run, not the number of engagements those posts or accounts receive.

For example, research on China’s domestic social media propaganda has shown that the government uses the so-called “50c party” (internet commenters paid to post at the direction of the government while pretending to be ordinary social media users) to fabricate hundreds of millions of social media posts, not to argue with critics of the party but rather to “distract the public and change the subject.” The high-quantity, low-engagement campaigns on Twitter might be understood as a foreign variant of this domestic strategy.

This is decidedly different from researcher observations that accounts run by Russia’s IRA changed their names and focus over their operational period, appearing to explore topics and personalities in an effort to find which ones would attract a high-engagement audience. The IRA account “Army of Jesus,” which would go on to become one of the most-followed accounts of the IRA’s 2015-18 operation, began its social media life as a meme page devoted first to Kermit the Frog, and then to The Simpsons.

If engagement were not the metric of focus, we would expect propagandists to invest in quantity over quality—focusing on mass-producing accounts and tweets with the intended message rather than building up followers or even efficiently promoting accounts within the network. Pro-CCP influence campaigns may deploy lots of accounts and re-spawn after those accounts are removed because the operators of the campaigns are paid based on posts, not persistence.

A variation on this theme may be that the CCP hires a collection of distinct operators to run these highly similar campaigns, each of which has to start from scratch, or create accounts for a specific operation (or in response to a specific vendor request). If operators are paid per campaign and build their inauthentic networks in response to those contract requests, we would expect new networks without substantial followings to appear as new contracts are signed. Moreover, if the CCP is contracting foreign propaganda campaigns to a variety of vendors, detecting one operation is unlikely to get rid of others.

We recently learned that the CCP is, indeed, outsourcing. On Dec. 2, for the first time, Twitter attributed an influence operation to an independent organization in China: Changyu Culture, a private production firm Twitter said is “backed by the local Xinjiang regional government.” Likewise, Facebook recently attributed an influence operation to a private firm in China, Sichuan Silence Information Technology Co., Ltd. (an information security company).

These recent attributions to outsourced organizations in China may be the tip of the iceberg—not the first networks the CCP has outsourced, just the first Twitter and Facebook have caught and publicly announced. After all, researchers in the disinformation field have found a rise in state actors outsourcing their disinformation campaigns to public relations or marketing firms.

But outsourcing need not produce low-engagement operations. Several outsourced operations in the Middle East and North Africa have shown high follower engagement; more than 13.7 million people followed the Facebook pages of an operation attributed to marketing firms in Egypt and the United Arab Emirates.

We’ve also seen other states’ foreign propaganda strategies evolve, such as Russian operations going from in-house (run by Russian military intelligence), to outsourced domestically (to the IRA), to run through third-party countries, to hiring unwitting citizens in the target country. As applied to the CCP, it may be that old contracts were based on the number of posts or accounts run, but future ones will be based on audience engagement, authentic follower count, or longevity on the platform—more typical metrics social media marketing experts use that indicate growth of reach or influence.

3. The CCP may not care much for, or be very good at, Twitter astroturfing (yet)

Occam’s razor might suggest a simpler reading: The CCP just may not be very good at covert Twitter persuasion campaigns yet. Using fake accounts to astroturf—or to create the illusion of widespread popularity of a particular viewpoint—is just one propaganda tactic. The CCP has a broad array of international, outward-facing propaganda capabilities spanning the broadcast, print, and digital spheres, developed over decades. China Daily, for example, places paid inserts in other newspapers, and China Global Television Network operates regionalized bureaus in multiple languages.

On Twitter, the CCP may believe that investing in “Wolf Warrior” diplomat accounts is more effective than covert persuasion campaigns. Or it may blend overt and covert lines, using fake accounts to amplify the Wolf Warriors to give the impression of outsized public support. Research from the Associated Press and the Oxford Internet Institute has shown that accounts that amplify Chinese diplomats are often later suspended by Twitter for platform manipulation (a sign of inauthenticity). The CCP opting to prioritize other channels—such as leveraging YouTube influencers—could help explain the relatively modest reach of these covert Twitter personas.

If this is the case, it may be worrying. The CCP’s vast array of resources and relatively cheap workforce mean that incremental changes in strategy could produce significant shifts in the overall disinformation landscape. If pro-CCP influence campaigns become more sophisticated, social media platforms may have a much bigger challenge on their hands.

When making judgments about covert activity in real time, we must be cautious. We may be susceptible to the streetlight effect, sometimes called the drunkard’s fallacy: the tendency to search for data where it is most readily available. Judging adversarial efforts’ competence solely from platform takedowns could introduce systematic bias.

It is plausible that the Xinjiang-related operations announced in recent Twitter takedowns were caught because of how sloppy they were. One Chinese government-linked network Twitter took down on Dec. 2 used identical text (or “copypasta”) followed by random strings of four capital letters or snippets of code. The New York Times-ProPublica investigation of a subset of these accounts prior to their takedown suggested this may have been evidence the tweets were posted sloppily by computer software.
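A fingerprint this crude is easy to detect programmatically, which is part of why such networks get caught. As an illustrative sketch (the exact pattern and sample tweets here are assumptions, not the investigators' actual method), stripping the trailing four-capital-letter token and counting duplicates surfaces the copypasta:

```python
import re
from collections import Counter

# Hypothetical detector for the pattern described above: identical tweet
# text followed by a random string of four capital letters.
SUFFIX = re.compile(r"\s[A-Z]{4}$")

def strip_suffix(text):
    """Remove a trailing four-capital-letter token, if present."""
    return SUFFIX.sub("", text)

# Sample tweets (invented for illustration).
tweets = [
    "Xinjiang is a wonderful place QZRT",
    "Xinjiang is a wonderful place KLMP",
    "Unrelated tweet about sports",
]

# After normalization, repeated text reveals the copypasta template.
counts = Counter(strip_suffix(t) for t in tweets)
copypasta = [text for text, n in counts.items() if n > 1]
```

The randomized suffix is presumably meant to evade exact-duplicate filters, but normalizing it away makes the underlying template trivially countable.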

If the networks that have been discovered were found because of how sloppy they are, then more sophisticated Chinese influence campaigns may exist as well. The respawn dynamic open-source investigators have observed may be specific to the low end of the quality spectrum. A healthy dose of modesty is required when making any broader claims or assessments about Chinese influence operations after studying these takedown sets alone.

Still, these takedowns provide a worthwhile reminder that effort does not equal impact. The CCP may be running thousands of fake accounts to promote the state’s Xinjiang messaging to English-speaking audiences on Twitter, but, barring additional evidence, it’s unclear if those English-speaking audiences are buying the messaging at all.

Josh A. Goldstein is a Center for International Security and Cooperation postdoctoral fellow at the Stanford Internet Observatory. Twitter: @JoshAGoldstein

Renée DiResta is the technical research manager at the Stanford Internet Observatory. Twitter: @noUpside
