Survival Comes First

Mar 18, 2025

Xiao Wang
Wang Xiao (王晓) is a freelance contributor to the Chinese diaspora outlet Mang Mang (莽莽), an independent magazine based in Europe that was born out of the wave of resistance in China and globally in 2022 that became known as the White Paper Movement (白紙運動). Founded by a group of young Chinese expatriates, Mang Mang focuses on the Chinese community abroad — writing about activism, resistance, connection, history, and identity.
A Gen Z content moderator working for a major Chinese internet platform reveals the moral contradictions of deleting sensitive political content while struggling to make ends meet. “We’re just tools,” he says in this candid interview from independent media outlet Mang Mang. Translation of this interview was done by CMP in cooperation with ChinaFile and China Digital Times.

Chen Lijia

Born in 1997, Chen Lijia (a pseudonym) became an internet content auditor for a well-known Chinese internet company in 2020. The company’s user base has surpassed one billion, and all content generated by these one billion users must pass through the hands of the company’s massive content moderation team — of which Chen Lijia is but one member.

Generation Z has now become the primary force among China’s growing ranks of online content moderators, who number in the tens of thousands. Their physical stamina means they generally fare better with the intense demands of the job and can stay up late to respond quickly to issues with sensitive content. They now handle the bulk of the content moderation work for major internet platforms.

Also known as the “Internet Generation,” Generation Z includes those born after 1995, who from birth have known only a world with the internet, which arrived in China in 1994. They are not just familiar with the internet, mobile technology, and smart applications, but have also grown up alongside the Great Firewall, which since 1996 has cut them off from the global internet.

Data shows that ByteDance, now firmly established as China’s leading internet company, employs more than 100,000 people, of which more than 20 percent work as content moderators. At Bilibili, a popular video-sharing platform that targets younger audiences, this proportion exceeds 27 percent.

Headquartered in Beijing, ByteDance Ltd. employs more than 100,000 people, of which nearly one-fifth are content moderators. SOURCE: ByteDance.

China, with more than one billion internet users, represents 19 percent of global internet users. However, even as the number of content moderators has been on the rise, the number of Chinese web pages has fallen by 70 percent in the last ten years, and the number of Chinese websites has declined by 30 percent in the last five years. While hundreds of millions of Chinese internet users divide their time among dominant platforms like Baidu, Weibo, WeChat, and Douyin, strict content censorship has effectively confined them within an information firewall of “China-exclusive” (中国特供) internet content.

In creating this situation, online content moderators, who serve as the ultimate enforcers of state policy, have played an indispensable role.

Born in 1997, Chen Lijia in 2020 became a content moderator at a prominent mainland Chinese internet company that operates the country’s largest search engine. The company’s founder has publicly stated that its user base exceeds one billion. And all search queries and content published by these one billion users must pass through the massive moderation team that includes Chen. Among the various types of content moderation, politically sensitive material receives the highest priority.

Chen Lijia is candid about his reasons for becoming a content moderator. He simply wants to earn a living. But Chen and his colleagues have spent their entire lives within the confines of China’s Great Firewall. So how can they effectively judge which information should be censored?

“The company holds regular training sessions to tell us what kind of information we need to remove,” Chen says. “But they never explain why. We’re just treated as tools.”

It is precisely because he understands his position as a “tool” that Chen sees his work as unrelated to justice or morality — but merely about survival. Facing a deteriorating economy and an increasingly competitive society, this is how many Gen-Z content moderators genuinely feel as they routinely delete posts and suspend accounts containing politically incorrect content.

What Do Content Moderators Do?

Q: What kind of work do content moderators at your company mainly do?

A: Mainly image and text moderation (图文审核), with the content dealing mostly with three areas: political security, violence, and pornography. Beyond these, we also need to moderate content here and there related to gambling, drugs, and illegal advertisements.

Q: What kind of content is considered illegal advertising?

A: We mainly remove advertisements from companies that don’t have partnerships with us. Advertising is one of the main sources of revenue for our company, and if we don’t control these unauthorized ads, the company can’t make money. That’s why content moderators receive a whitelist from the company, and any advertisements not on that list need to be removed.

Q: Among the content that needs moderation, which category has the highest priority?

A: Political security, definitely.

Q: What are the standards for reviewing political content? For example, what type of content passes review and what doesn’t?

A: The standards are actually similar to those used in domestic news reporting. Any content that can’t be reported in the news media is content that can’t pass our review. The most obvious examples are content related to high-ranking political officials and their families, or content about the Communist Party’s negative historical events. We absolutely must remove all of that stuff. Every year around June 4, people share information about the June Fourth Incident, and we have to remove all of it. As June 4 approaches, we even have to work overtime to manage everything.

Q: So during special time periods like June 4 and National Day on October 1, you moderators need to be extra vigilant, right?

A: Absolutely! Because during these time periods, there are always people who for whatever reason want to refresh public memory. As these dates approach, they like to post different things. From our perspective, June 4 has essentially become an unofficial folk holiday (民间节日). During these sensitive periods, the review rules and strategies temporarily change. Normally we use a “post first, review later” approach, but during sensitive periods, it switches to “review first, then post.”

Q: Besides changing from “post first, review later” to “review first, then post,” are there any other adjustments to strategies or rules?

A: No one has ever explicitly explained these things to us. Content moderators are just the implementation level. In my years at the company, they (the full-time employees) have never allowed me access to their moderation policies. We simply have to do whatever they tell us to do.

Q: Are there changes at the implementation level during those times?

A: Things just become stricter and more intense. Everyone has to work overtime then, and we also have to work in rotating shifts, including overnight shifts.

Q: Does the company explicitly tell you what content must be removed?

A: They don’t state it explicitly. They just give us vague examples. They tell us to be vigilant and ensure certain content doesn’t get through. But they never clearly explain to us exactly what these things are about.

Q: What sort of things did they test for when you interviewed for the political content moderator position?

A: They would introduce certain sensitive political information, and then ask me how I’d handle it. I’m pretty familiar with such things, and I even knew more about them than the examiner. On the June Fourth Incident, for example, after a little training, everyone knew what it was. But when there were terms like “Tiananmen bullet holes” (天安门子弹孔) — aside from me, no one else was aware of them. This is what sets apart those capable of doing this kind of work. It’s crucial to be familiar with the vocabularies that derive from such incidents — because many people use things like symbols and homophones to disguise what they’re saying. For example, the term “public square defender” (广场卫士) is an expression that comes from June Fourth. I’m able to discern it with just a glance. Any good content moderator must possess this degree of political sensitivity.

Q: Is the “People’s Liberation Army” something that can be mentioned?

A: Around sensitive dates, it’s totally forbidden. Even comments praising the government and the PLA for restoring order are off limits. The point of this censorship is not to drive public opinion in any particular direction. It’s beside the point whether something is positive or negative. The point is for censors to obliterate the event so that the public completely forgets about it.

Q: What other types of content do they bring up during training?

A: The Tiananmen bullet hole thing — they didn’t need to train me on that. I’ve known about that for quite some time. Even my bosses weren’t aware of it until I spread the word around. I’m really diligent about my job.

Q: So those who can get over the Great Firewall and have a better understanding of politics can perform political content moderation better than others.

A: Of course. So I could advance as a quality inspector faster than others.

Q: How many training sessions have you taken part in with your current employer?

A: Maybe seven or so. The one that made the deepest impression was on labor camps in Xinjiang. The sessions were totally shallow, and they really taught you nothing. The company’s point was just to make you aware — that was it. As for any of the specifics, we didn’t need to remember those. I think the company really preferred that we forget everything after the training.

Q: They don’t tell you why certain things are politically sensitive? Do you ever look into it yourself out of curiosity?

A: The company doesn’t need us to know anything in too much detail. In fact, AI will already have blocked out a lot of content through keyword filtering — and that content might have been much more detailed. But even we content moderators can’t see that stuff. I’m not someone who would look into it, because I’ve understood it for a long time. For things I don’t understand, I’ll scale the wall and check it out on the foreign internet. Because I understand more, sometimes during the manual review process I discover certain keywords, and then I report them to leadership. Afterward, these keywords get added to the computer program, and then the AI conducts automatic review and blocking.

The one that made the deepest impression was on labor camps in Xinjiang . . . . I think the company really preferred that we forget everything after the training.

Q: What keywords have you reported up the chain of command?

A: For example, “egg hole” (蛋孔) — that’s “egg” as in a chicken egg — and “dan fried rice” (旦炒饭) — that’s the “dan” as in “Satan” — and so on. Also combinations of the characters for “new” and “frontier” [as in the name of “Xinjiang”] with various homophones. There are many of them, but they’re all quite trivial. [NOTE: These are coded references mocking the death of Mao Zedong’s eldest son, Mao Anying, during the Korean War. According to internet lore, Mao’s son was killed by American bombing after revealing his position by cooking egg fried rice on the battlefield.]

Q: How did they conduct the training about the labor concentration camps in Xinjiang?

A: They just had us watch a documentary. After we finished watching, that was it — nothing else was said. The company’s goal wasn’t to teach us the truth but to make us delete content. Different intentions lead to different approaches. They just wanted reviewers like us, mere cogs in the machine, to recognize this issue as sensitive and simply delete relevant keywords whenever we encountered them. They didn’t care about our thoughts or feelings on the matter.

Q: So you guys just need to know that “Xinjiang” and “concentration camps” are sensitive terms, and then as long as you delete them immediately upon seeing them, that’s enough.

A: No. The term “concentration camp” isn’t sensitive on its own. Only when it’s combined with “Xinjiang” does it become sensitive. That’s exactly why human review is necessary. Machines can only moderate obvious content, while humans can detect subtleties.

For example, my bosses and I once had a dispute about content asking, “Why didn’t Chiang Kai-shek assassinate Mao Zedong when he was in Chongqing?” I thought this sentence was normal, but our leaders demanded its deletion. I resisted but eventually deleted it without understanding why. Later I realized that Mao Zedong is considered a great person — so how could anyone suggest assassinating him? From the state’s perspective, that statement is clearly politically incorrect. So to be effective human reviewers rather than machines, we need to better understand such implied meanings.

Q: What kind of content have you all been trained on?

A: We’ve had training about evil cults, like Eastern Lightning. But it was just a PowerPoint presentation. No photos were permitted. As soon as it finished, the company quickly took everything away. All of the company’s training is secretive like that. My impression is that they need us to know about these things to do our job properly, but at the same time, they’re afraid of us knowing too much. They’d prefer we forget everything after executing their instructions. It’s really messed up.

Q: In terms of evil cults, what content had you already been aware of beforehand?

A: I really didn’t know about this before. And it was precisely because I wasn’t aware and didn’t understand that I didn’t feel any guilt about deleting posts. Even when deleting content about June Fourth, I felt no guilt. These things just seemed so remote to me.

Guilt and Morality

Q: Is there anything that gives rise to a feeling of guilt for you?

A: Yes. For example, content about the Covid epidemic, and about the floods in Zhengzhou. There was also this piece called “Ten Days in Chang’an” (长安十日), written by someone called Jiang Xue (江雪). Deleting those things made me feel guilty. But these were very obvious things [that demanded deletion]. If I didn’t delete them, someone else would have. They were too blatant. I couldn’t have let them pass even if I wanted to. [NOTE: Jiang Xue is a well-known non-fiction writer and former journalist from Xi’an.]

Food packets being delivered to residents in the city of Xi’an under quarantine in 2020. SOURCE: China Digital Times.

Q: How does your sense of guilt manifest?

A: It makes me want to quit. But people around me console me by saying that if I hadn’t censored it, someone else would have. My friends also ask what I would do instead. Everyone knows survival comes first. Morality only exists after survival is taken care of.

Even so, 2021 was an extremely difficult year for me. The Xi’an incident and the Zhengzhou floods happened that year. That’s when my guilt peaked. I really wanted to leave the company and never work as a political censor again, but I couldn’t find any good opportunities. So political censorship has become something quite contradictory for me. On the one hand, I’m skilled at it and it puts food on the table. On the other hand, it’s . . . really painful. My guilt becomes especially intense when I’ve personally experienced the events I’m censoring.

Q: Does guilt leave you politically depressed (政治性抑郁)?

A: I feel more like my conscience is condemning me. I really don’t like the term “politically depressed.” I’ve rarely heard it used before. I prefer to use the word “conscience” to express how I feel. This word better reveals my state of mind than “politically depressed.” [NOTE: “Political depression” refers to a form of depression triggered by political events or circumstances. It differs from typical clinical depression by manifesting when individuals feel powerless over societal or governmental situations.]

Q: After the condemnation of your conscience, what has your response been in real life?

“I feel more like my conscience is condemning me.”

A: I’ve started to really love chatting with my friends.

Q: Are you hoping for some external support from your friends’ words that might allow you to keep doing this job? Or is it that they’ll give a kind of “legitimacy” or approval to your work?

A: I can’t say there’s none of that, but I don’t think my intention is so clear.

Q: So what do you think your intention is in seeking out friends to talk to?

A: I just want to vent and find like-minded people. Of course I also want to ask them if there’s another path I can take. After all, this road [of being a content moderator] isn’t sustainable in the long run.

Q: In your eyes, what would make sense in the long run for you?

A: I have always wanted to use this job as a springboard that could help me jump to a civil service position as a staff aide to an official. I chose to join a public opinion monitoring company because I wanted to build my skills in resolving public opinion risks — essentially crisis and threat management abilities — and then move on from there to a better platform. But such opportunities are rare and hard to come by. What I’ve always really wanted to do is to work at an organization like the State Council Development Research Center, where I could be an aide to a higher-ranking official, advising them on how to resolve crises. That’s the job I most want to do.

Q: But for this kind of job you need an excellent academic background and comprehensive skill sets.

A: The fact that I don’t have an excellent academic background is exactly why I wanted to enter a public opinion monitoring company, do political content moderation, and even excel at it. I thought that once I’ve developed certain capabilities, it might be easier for people to take notice of me. That’s been my personal aspiration.

Q: So now that you’ve achieved a level of technical skill in political content moderation, do you think your personal capabilities have improved?

A: There’s been no qualitative improvement, only some quantitative, or self-perceived improvement.

Q: What improvements have you noticed in yourself?

A: I think because I’ve seen so many bits of information, I’m now able to use alternative methods to resolve issues. But my perspective has always been that of an ordinary person. I’ve never considered problems from the viewpoint of a high-level official. That’s the biggest challenge for an aide. I think it’s also my biggest obstacle to becoming an aide, because the perspective is a completely different one.

Q: But have you never thought that, in doing content moderation, you also aren’t at the vantage point of a normal person or even of yourself? Aren’t you always taking the vantage point of someone in control?

A: That’s right, but this vantage is just one aspect of what it means to put out fires — meaning that by putting out fires you can calm things down. But my position isn’t completely that of someone in control because beyond implementation, I really have no idea how their strategy is designed, or how those making the strategy communicate with the government news office.

These factors have severely limited my growth. I can’t even attend those meetings. I can only guess how they communicate with leadership. I used to believe that after doing a good job with content moderation, I would get opportunities to advance and increase my value personally. Eventually, though, I found I couldn’t even become a formal employee.

Q: Do you feel your situation is tragic? And if you could step back and view it objectively, how would you evaluate the profession of content moderator?

A: It really is tragic. Looking at content moderators as a group, I can only say these are people working to eat, to survive. It’s simply about putting food on the table, nothing more than that. As for questions about “righteousness” — nobody really examines that deeply. When you can’t even feed yourself, what righteousness is there to speak of? You could call it the helplessness or sorrow of people at the bottom, because for people at the bottom, there aren’t many choices.

So I’ve never accepted moral criticism of this job. Shouldn’t moral criticism be directed at the middle class? Why throw it at those at the bottom? What I want to ask now is: who designed the position of content moderator? We’re just doing labor. This definitely isn’t a problem with the job itself, but with the people who created it. So the criticism should be directed at the source of the problem, not at those of us who are just following orders. After all, there’s an endless supply of cogs. If not us, there would be others.

I think there’s a good saying — that the rights and position you have determine what responsibilities you should bear. You can’t transfer too much social responsibility to ordinary people. Ordinary people already struggle to survive. They can’t shoulder such heavy expectations.

Q: So do you believe that there is some way to address the source of this problem?

A: No. And there are so many people in society who need to eat. If you really don’t let people do this job anymore, how are they supposed to survive? People might say, “Oh, it’s just a job.” But if this job doesn’t exist, do you have another for me? Do you really? So not resolving the issue also has its benefits. Ultimately, you can only address moral problems after solving the problem of having enough to eat. If you can’t resolve even basic subsistence issues, we’re nothing more than beasts.

So I really dislike certain experts and scholars. They fundamentally don’t understand lower-class life. I’m genuinely from the lower class — though not the absolute bottom since I do have an office job. But I’m not middle class either. I’m somewhere in between. Not quite middle, not quite bottom. But according to my standard of living, I’m definitely lower class. Our existence is really difficult. I need to eat. That’s my primary goal. How can I have enough moral sense to consider all these other complicated things?

Q: What are your plans for the future?

A: Right now, I have the feeling that I’ll never get ahead. No matter how hard I work, nothing will come of it. I’m just a cog in the machine. I’ll never stand out in such an enormous system, and I have no influence on that system. I also don’t have any sense of personal accomplishment. My idol has always been the Founder of the Republic [Sun Yat-sen], but I fear that throughout my entire life I’ll never take even a single step toward becoming like him.

Q: What is it you admire about him?

A: He was bold and daring. He was the one who put an end to more than two millennia of China’s feudal imperial system. Actually, I really don’t like the CCP. My grandfather’s younger brother joined the Kuomintang in 1949. Our family ancestors were wealthy landowners, but because of my grandfather’s brother, our family was treated miserably after 1949. That’s why my grandfather would quietly curse the CCP at home, and it’s also the source of the hatred I’ve inherited.

“No matter how hard I work, nothing will come of it. I’m just a cog in the machine.”

Q: Why did you wish to take the civil service exam?

A: Only a general, I think, can give the order to raise gun barrels an inch higher. If I want to become a general, taking the civil service exam is the only path available. Actually, in high school I secretly vowed never to join the military, the Party, or the government. Back then I even hoped to go to Taiwan and join the National Army, but that was impossible to achieve. As I grew up, I realized how absurd those ideas were, which is why I started thinking about the civil service exam. I feel becoming a civil servant is the only way I can have any influence or change anything.

Q: Will you take the civil service exam again in the future?

A: The civil service exam is too competitive now, and I’m a poor test-taker. Given my abilities, the best I can hope for is to apply for positions in small towns and villages, but these places can’t even pay salaries now. So what would be the point of passing the exam for those positions? So I really wish I could find someone who is able to tell me what I should do next.

This is a translation in cooperation with ChinaFile and China Digital Times of an interview published by the exile outlet Mang Mang. The Chinese original can be found here.
