Digital disruption has been the name of the game in China’s media space for several years now. The same technology wave that has inundated everything from transportation to brick-and-mortar retail has hit traditional media hard, leaving wreckage in its wake.
Late last year, the Beijing Times newspaper closed its doors after more than 15 years of operation, and on January 1, Shanghai’s Oriental Morning Post, long one of the country’s leading commercial newspapers, also closed up shop. More such extinctions are to be expected, diminishing choice at the Chinese newsstand.
But another important element of digital disruption is the radical re-envisioning of choice itself. Why browse the newsstand or fumble through the inky metro tabloids when algorithms can do the work for you? Show an inclination to check out real estate content through a news app like Toutiao (今日头条), or “Today’s Headlines,” and the app will pre-select for you, modelling your behavior to generate customized news feeds.
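Under the hood, this kind of personalization can be as simple as weighting new content toward the topics a reader has already clicked on. The snippet below is a minimal, purely illustrative Python sketch with hypothetical data structures and function names; it is not a description of Toutiao’s actual recommendation system, only of the general click-weighting idea.

```python
# Illustrative only: a toy sketch of click-based feed personalization.
# All names and data structures here are hypothetical.
from collections import Counter

def rank_feed(click_history, candidate_articles, top_n=5):
    """Rank candidate articles by how often the user has clicked each topic."""
    topic_weights = Counter(article["topic"] for article in click_history)
    return sorted(
        candidate_articles,
        key=lambda a: topic_weights.get(a["topic"], 0),
        reverse=True,
    )[:top_n]

clicks = [{"topic": "real_estate"}, {"topic": "real_estate"}, {"topic": "sports"}]
candidates = [
    {"title": "New housing rules", "topic": "real_estate"},
    {"title": "Match report", "topic": "sports"},
    {"title": "Policy briefing", "topic": "politics"},
]
print(rank_feed(clicks, candidates))  # the real-estate story ranks first
```

Even a crude ranker like this tends to reinforce whatever a reader has already shown interest in, which is precisely the dynamic the People’s Daily commentary objects to.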
Toutiao and other apps powered by artificial intelligence technologies have been a space of fevered activity in China since around 2012. For Chinese authorities, however, this type of disruption apparently has its limits.
On July 6, the Party’s official People’s Daily ran a commentary piece called “News Must Not Be Hijacked By Algorithms.” The piece was read by many as a direct attack on Toutiao, the industry leader in the field of AI news.
The piece attacked news apps “promoting low quality content” to the detriment of “truth, comprehensiveness, objectivity and independence.” Recalling that the notion of “guidance of public opinion,” or ensuring social and political stability through media control, is the true value underlying the Party’s approach to information in China, one must wonder whether one of the perceived dangers in the rise of AI news apps is the way they might allow readers to filter out the sorts of messages authorities would like them to absorb.
Perhaps China’s propaganda authorities are coming to realize that the selection methods now being advanced through these apps might be undermining the Party’s own efforts to “guide” the information and ideas to which the public is exposed.
A translation of the People’s Daily piece follows:

News Must Not Be Hijacked By Algorithms
By Lu Hong (吕洪) / July 6, 2017
Recently, news apps drunk on technologies and algorithms have become more and more “basic and coarse.” Just open a single article and they will rapidly flood your screen with related content, without even extending you the right of refusal. Some people have even elevated these news distribution algorithms to the plane of artificial intelligence, suggesting they are a major development trend representing the future of text, the future of content, and even the future of media.
What is artificial intelligence? Artificial intelligence is the simulation of the information processes of human thought. But these algorithmic technologies based on social networks and click rates, and especially their crude mechanisms of machine selection and hard-pushing of content, cause users great annoyance.
Algorithms can have some positive impacts on news production. On the one hand, they can induce content producers to pay greater attention to content of interest to audiences, approaching content production from the reader’s perspective, so that they no longer simply talk to themselves. On the other hand, they reduce the time readers must invest and raise the efficiency of reading, helping to enrich information and knowledge.
However, a number of apps, keen on news delivery and indulging in algorithms, have had a very negative impact on the future of news. Riding the so-called “free ride” of the algorithm, some news apps that were once informationally rich have grown increasingly bland, some content producers that were once impartial have become rather biased, and the once broad reach of some news media has steadily narrowed.
In recent years, there have been many advancements in artificial intelligence in various sectors, but these have remained somewhat detached from ordinary people. Why is this? Because artificial intelligence to date has been unable to achieve breakthroughs in non-linear thinking, a mode unique to human beings. We click into a piece of horrific social news out of instinctual human curiosity — but instinct is not intelligence. When different viewpoints interact, or even face off, this potentially leads to greater thought or knowledge. To be introspective about our instincts, and to surpass them, is a development of human nature. A number of news apps, relying only on parsing people’s clicking habits and promoting low quality content, can only cause users annoyance.
But are we to be held captive by algorithms, traffic and hits, only showing those parts of the world that readers want to see — or do we uphold the truth, comprehensiveness, objectivity and independence, using quality content to shape our style? This is the question media people must think deeply about.
Traditional media must not act like so many Don Quixotes, refusing to see algorithms and technology and shunning the trend of progress. But neither can they rely on these entirely, becoming slaves to algorithms and technology. Traditional media must actively strategise and must actively participate in the process of [media] transition and convergence. But in the process of media convergence, they must maintain their own styles and standards, safeguarding their own values and independent spirits, injecting the soul of traditional media into the online space, and allowing algorithms and technology to serve news of true value.


David Bandurski

CMP Director
