In your recent book Censored: Distraction and Diversion Inside China’s Great Firewall, you examine China’s approach to online censorship. What are the main tools the Chinese government uses for online censorship, and how do citizens bypass these methods?
There are a lot of different tools that the government uses to censor the internet. It blocks foreign websites with the Great Firewall. It removes individual social media posts from websites that are allowed to operate in China. It reorders search results so that some results don’t appear on a search engine. The government also instills fear, particularly in targeted individuals, such as high-profile social media users and journalists, to encourage self-censorship. In the book, I also include “flooding” as a form of censorship, where the government adds large amounts of information online to distract attention away from ongoing events. All of these methods of censorship are porous and incomplete; you can get around them. For the Firewall, you can get on a VPN and go across the wall. Not all social media posts on a topic are removed, so you could potentially find them if you spend enough time searching for them. Because of this, a lot of people have said that censorship can’t work if you can get around it. But I show in the book that for most time periods and most people, there’s not a lot of reason to spend time seeking out this political information. Most people are busy and have a lot going on in their lives, so they don’t have the time to seek out political information. It’s true around the world that people are usually not that interested in politics. Because of this, even a small cost of access imposed by censorship can have a big effect on what information people receive.
Do we know the chain of command in deciding what to censor? For example, which agencies are involved in determining what to censor on a routine basis? Is this process highly automated?
There are many institutions involved with censorship; among the most important are the Propaganda Department and the Cyberspace Administration of China. However, censors themselves are typically hired by social media companies, and they take direction from the government on what content to censor. If social media companies do not comply, they can be subject to fines or other penalties.
When I started studying this, it was not a highly automated process. Keywords were often used to flag posts, but ultimately posts were manually removed by censors. Increasingly, however, there are more automated methods of censorship; for instance, if you send a particular image on WeChat, that image might never arrive if it matches a hash in a database of content that should be censored.
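The hash-matching mechanism described here can be illustrated with a minimal sketch. This is not WeChat’s actual implementation: the blocklist contents and the use of exact SHA-256 digests are assumptions for illustration (real systems likely use perceptual hashes that tolerate re-encoding and cropping).

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known censored images.
BLOCKED_HASHES = {
    hashlib.sha256(b"example censored image bytes").hexdigest(),
}

def should_deliver(image_bytes: bytes) -> bool:
    """Return False if the image's hash appears in the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest not in BLOCKED_HASHES

# A matching image is silently dropped; anything else goes through.
print(should_deliver(b"example censored image bytes"))  # False
print(should_deliver(b"harmless photo bytes"))          # True
```

The key property for this kind of censorship is that the sender gets no error: the message simply never arrives, which preserves deniability.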
How do “fear, friction, and flooding” all play a role in censorship and its deniability? Are there examples of each in China?
In the book, I introduce three mechanisms of censorship. Fear is when you have an awareness of what is censored and what repercussions might come to you if you share or access that information. The problem with fear, from the perspective of the government, is that people have to be aware of what’s happening for it to work. You have to know what you’re not supposed to say, and what will happen if you say it, in order to be deterred. This is especially problematic in the age of the internet, where almost anyone can be an information producer. So, everyone has to know what they are not supposed to know. Fear can also produce backfire effects: people don’t like being censored; they don’t like being told what not to share.
Also if fear is too effective, people won’t say things that are negative about the government, which can have its own repercussions. If the government wants to know peoples’ grievances, it won’t be able to address them if people aren’t willing to speak up.
Fear has a lot of downsides, so what we see around the world is a shift toward friction and flooding as forms of censorship. Friction increases the cost of information, making particular information more difficult to access. You don’t need to be aware that something is censored. It just needs to be more of a struggle to get to it, so you give up. This might be slowing down the internet or webpages, reordering search results, or removing content. The other form of censorship I consider in the book is flooding, a coordinated effort to spread information in order to distract from ongoing events. We see this with online armies that coordinate their efforts in order to crowd out other information online.
Due to the massive amount of information on the internet, you characterize censorship as, not a ban, but a “tax on information.” Why has the Chinese government chosen to slow down access rather than stop it, a less severe approach to controlling information?
There are lots of costs to censorship from the government’s perspective. We have seen instances where much of the flow of information is curtailed, such as in North Korea. It’s difficult to get information into or out of the country, but then again, North Korea doesn’t have a lot of economic ties with the outside world. When you have ties to the outside world, through business, trade, academic exchange, and travel, it’s very difficult to keep information completely outside of the country. Because of this, there is a costly economic trade-off with censorship. Governments also prioritize information flow because it helps them function, especially in a decentralized country like China, where local governments have a lot of power. For the central government, information needs to flow up to the center so that it can receive accurate reports, identify corruption, and improve governance. Instead of suppressing all information, a tax on information in the form of friction or flooding allows people to access information that they really want, but still discourages most people from accessing censored information.
What is the “dictator’s dilemma”?
There are two forms of the dictator’s dilemma. One is that too much repression can backfire. The second is that too much repression can decrease information that the government has access to about the population. Friction and flooding, again, are solutions to these dilemmas because there’s still informational flow from the localities to the center and because they reduce awareness of censorship.
How have Chinese citizens reacted to both observable and porous censorship?
Observable censorship is what people are aware of, such as fear. The strategy of the government is to target fear toward opinion leaders, journalists, and people who can have a large impact. However, for many people observable censorship will backfire, causing more anger toward the government and the censorship apparatus. That’s why so many of the government’s censorship methods are porous. Porous censorship blocks people who don’t have a lot of time to search for information, often without their realizing they are being blocked. Porous censorship can also be explained away, creating plausible deniability: is the internet slow because of technical issues, or because the government is slowing it down?
However, porous censorship has some downsides from the perspective of the government. The strategy of porous censorship tends to fail during a crisis or when censoring entertainment. In the case of a crisis, people tend to be more willing to dig deeper for information if it affects their own safety. Censorship of entertainment can also cause circumvention because people are often willing to spend time seeking out entertainment. For instance, we show that when Instagram was blocked in China, there was a spike in VPN downloads because people wanted to access their social media; those VPNs then became a gateway to Twitter, YouTube, Wikipedia, and other sources of information.
One success of China’s internet censorship is the prevention of anti-regime collective action. Does censorship also succeed in making the public less resentful toward the government by restricting their access to unfavorable information about the government’s performance?
When I first started my dissertation research, I wanted to know the effect of censorship on collective action. What I ended up writing about in the book is the effect of censorship on access to information. I’m not able to say much about the effect of censorship on action or opinion; it’s very difficult to assess. There are two studies that I think do this well. One looks at changes to the school curriculum and finds an effect on ideology. The other gave people a VPN and found an impact on political ideology later on.
Your book was written and published in 2018, about 1.5 years before the emergence of COVID-19. How do your findings apply to the Chinese treatment of the coronavirus pandemic both within their borders and internationally?
This is difficult to address because the situation is still so new. We don’t have much data on it yet. One thing we do see is an increase in censorship circumvention with COVID-19, which is consistent with the weakness of porous censorship in a crisis. It’s very difficult to disentangle the effects of propaganda and censorship on opinion from the effects of the government’s performance in containing COVID-19 in China. We’ve done some online surveys in China, and we’ve actually seen an increase in trust in the government. It’s difficult to say exactly why, but my guess is that it’s because China has controlled the virus effectively, particularly in comparison to the rest of the world.
Book image from Margaret E. Roberts