Authoritarians Like Twitter, Too

Repressive Regimes Can and Do Use Social Media to Solidify Their Grip on Power
Kevin Munger

The Freeman
Jul. 21, 2015

In May 2014, CNN aired footage of a Ukrainian helicopter being shot down by pro-Russian militants. Taken with a cell phone camera and posted on social media, the video offered compelling evidence of the scale and technological sophistication of the Ukrainian conflict.

The video was also fake — it was actually over a year old, and from Syria. CNN retracted the footage and apologized, but the “incident” was still widely discussed on Russian and Ukrainian social media.

In the wake of the Arab Spring, enthusiasm for the power of social media ran high. Nothing else had shown the same power to mobilize protestors living under repressive regimes. With information democratized, the logic ran, dissidents could outflank the centralized media control and propaganda machines so crucial to authoritarian states.

But this logic is flawed, as the faked helicopter video demonstrates. Although social media may have given tech-savvy dissidents a temporary advantage over repressive governments that were unable to keep up, Twitter and its regional analogues are now a fully mature technology.

Just as they did with radio and television, repressive regimes can and do use social media to solidify their grip on power. As a result, the net effects of social media on the possibility of democratic revolution are at best ambiguous. They may actually be negative.

This point has been underappreciated in the enthusiasm for what social media seems to make possible. Our optimism leads us to overlook what is at stake for those in power, and their capacity to evolve new strategies using new tools. We want to believe in magic bullets, hoping that the right technological advancement will empower people to rise up successfully. But it is at least as likely that the millions or billions of tweets sent by dissidents make them vulnerable, because those tweets are extremely visible, while the strategic responses of government actors often go unnoticed. It is an ironic inversion of Frédéric Bastiat’s “That Which Is Seen and That Which Is Not Seen.” Bastiat warned that we overvalue government action because its visible benefits mask hidden costs borne by individual citizens; here, citizens’ highly visible activity on social media lets government action hide in their midst.

Some egregious and sophisticated uses of social media by repressive regimes have recently come to light. In a fascinating story in the New York Times Magazine, Adrian Chen explains the operations of a shady Russian “troll farm” that engages in large-scale, multiplatform acts of misinformation. At one point, they made up an explosion in a chemical plant in Louisiana, started a hashtag (#ColumbianChemicals), and relied on ordinary people to pass the story along, knowing they were unlikely to verify the details. This kind of operation, carried out on “foreign soil,” shows how seriously this Russian agency takes social media. The chemical plant explosion may simply have been an experiment, a proof of concept for what such attacks might accomplish in the future.

Their bread-and-butter social-media strategy is to pay people to pose online as regime supporters. People have acted as “sock puppets” — adopting fake personas on the Internet — since computer networks were first connected, but never at this scale, or with this degree of coordination.

Chen discusses this widespread practice in the Russian context. The existence of Chinese “50 centers” (bloggers and Weibo users reportedly paid 50 Chinese cents, about half a yuan, per pro-government post) has been known for nearly a decade. The presence of these people in online communities, voicing pro-regime sentiment, may have a profound dampening effect on protest movements.

Political scientists model the process of protest and revolution as a “coordination problem.” There are two parts to the problem: individual knowledge and common knowledge.

It makes no sense to act alone. Even if I’m completely convinced that the government is evil and needs to be overthrown, it still doesn’t make sense for me to go into the street by myself — I’ll just end up in prison, and the government will be stronger than ever.

But the main purpose of pro-regime sock puppetry need not be to persuade dissidents that they are wrong. All that is necessary is to confuse dissidents about what other people think. If dissidents think they are isolated, and that most other people support the regime — or even if they are merely uncertain about other people’s feelings — they will remain compliant. They have no way of getting accurate information about public opinion. Dissidents likely know that the people they talk to regularly are not a representative sample, and polls are either manipulated or suppressed. A horde of “50 centers” may be enough to cloak widespread resentment in a cloud of regime-supported “approval.”

And, as Duke University economics and political science professor Timur Kuran and others have argued, solving the individual-knowledge problem is not enough; dissidents must also solve the common-knowledge problem. It is not enough for me to be convinced that everyone hates the government; unless everyone (or some threshold percentage of people) knows that everyone knows that everyone hates the government, a revolution cannot succeed.

That’s why these sock puppets and “trolls for hire” can be so powerful: they make it a lot harder to get a clear impression of what everyone else thinks, and thus whether a revolution will be successful. Because shared knowledge is so crucial to a revolution, uncertainty can be a killer.
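The logic of this dampening effect can be made concrete with a toy model. The sketch below (in Python, not from the article, with made-up parameters) follows the spirit of the Kuran/Granovetter-style threshold models political scientists use: each citizen acts only if the share of dissent they can see online clears a personal threshold, and paid pro-regime posts dilute that visible share.

    # A minimal sketch (illustrative only) of a threshold model of protest:
    # citizens join only if the dissent they can see online clears their
    # personal threshold, and sock-puppet posts dilute the visible signal.
    # All parameter values are assumptions, not empirical estimates.

    import random

    def simulate(n_citizens=10_000, oppose_rate=0.7, fake_posts_per_citizen=0.0, seed=0):
        rng = random.Random(seed)

        # Each citizen privately opposes the regime or not, and has a threshold:
        # the share of visible dissent required before they are willing to act.
        opposes = [rng.random() < oppose_rate for _ in range(n_citizens)]
        thresholds = [rng.random() for _ in range(n_citizens)]

        # What citizens can observe: the mix of posts online. Genuine posts
        # reflect true opinion; sock-puppet posts are uniformly pro-regime.
        genuine_dissent = sum(opposes)
        genuine_support = n_citizens - genuine_dissent
        fake_support = int(fake_posts_per_citizen * n_citizens)
        visible_dissent_share = genuine_dissent / (
            genuine_dissent + genuine_support + fake_support
        )

        # A citizen protests if they oppose the regime AND the dissent they
        # can see online clears their personal threshold.
        protesters = sum(
            1 for i in range(n_citizens)
            if opposes[i] and visible_dissent_share >= thresholds[i]
        )
        return visible_dissent_share, protesters / n_citizens

    for fake in (0.0, 0.5, 2.0):
        share, turnout = simulate(fake_posts_per_citizen=fake)
        print(f"fake posts per citizen: {fake:>3}  "
              f"visible dissent: {share:.0%}  turnout: {turnout:.0%}")

In this toy setup, roughly 70 percent of citizens privately oppose the regime, yet adding two fake pro-regime posts per citizen cuts the visible dissent share to about a quarter, and turnout falls with it. The model ignores repression, social networks, and learning over time; the point is only the mechanism by which manufactured “approval” can suppress coordination.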

The competition between dissidents and regimes to take advantage of new technology is constantly evolving, and no one can know what the next equilibrium will be. Hopefully, one effect of greater public awareness of repressive regimes’ online strategies will be an increased skepticism of unsubstantiated claims on social media — and an increased demand for depth in how we understand the world.
Kevin Munger is a third-year PhD student in the department of politics at New York University.












