Beijing and Moscow, beware: AI trained on media from democratic countries, as most of the internet is, will interpret the world in a different way than their propaganda dictates. Type a question about the conflict in Ukraine into ChatGPT or Google Bard, and the large language models will use the banned word “war.” Instead of ignoring the Tiananmen protests, they describe the full scale of the massacre in terrifying detail.
Admittedly, authoritarians will attempt to leverage AI facial recognition tools to reinforce control. More cameras mean more opportunities to build an Orwellian dystopia. China uses AI-powered biometric and social media monitoring to exert control over its Uyghur Muslim minority. It exports surveillance technologies around the globe, enabling India to use facial recognition to locate protesters, Pakistan to keep an eye on online traffic, and Serbia to monitor its streets.
But authoritarian countries will struggle to rein in the expanding powers of generative AI. China recently announced new rules to keep AI bound by ‘core socialist values,’ rules that will limit the country’s technological advances in AI. Generative AI similarly frightens Russia.
Here’s why. Authoritarians rely on narrative homogeneity as a tool for control, viewing any deviation as a potential threat. Generative AI undermines that homogeneity: authoritarians must now contend with a plethora of news, media, and cultural productions that challenge their accepted version of reality.
Consider a little digital history prior to AI. The internet’s evolution ushered in a radical shift in narrative formation, primarily through the transformation of information retrieval. Traditional media, such as newspapers and television, relied on journalists and editors to select and craft narratives. The internet offered a new paradigm of self-directed information discovery.
It also created chaos. Until Google arrived, sifting through the boundless sea of data was a monumental task. The search engine surfaced results based on links between sites, so results reflected the biases of webpage publishers, who tended to link to reputable sources.
For authoritarian regimes, this development posed little threat. Search engines delivered deterministic and easy-to-block results. With strong pre-existing narratives in place, search engines proved more likely to reinforce existing views than to provoke a shift in perspectives. China welcomed Google as long as it adhered to a stringent censorship regime, while Russian interventions in Google results in the 2000s focused on suppressing results that supported Chechen rebels.
But search engines’ reign as the only form of digital information retrieval proved short-lived. Social media platforms, powered by recommendation engines, delivered an endless stream of new information, consumed passively rather than actively sought, as with a search. Where Google delivered information based on what content creators thought best bolstered their arguments or was most worth visiting next, recommendation engines relied on what content consumers were likeliest to share, comment on, or like.
In a sense, this was an act of democratization: the population of consumers is far larger than the number of creators. Consuming anti-regime content, especially when delivered by an impersonal algorithm rather than actively sought, is far safer than creating it. As social media rose, so did upheaval in North Africa, the Middle East, and the former Soviet sphere of influence.
But these uprisings did not result in democracy. Social media prizes high-activation emotions. Recommendation engines had no ability to distinguish between outrage at dictators for human rights violations and outrage at dictators for failing to uphold the dictates of religious fundamentalists. As long as a narrative spread, social media spread it.
The authoritarians cracked down. Russia began to demand censorship of Facebook and Twitter, while China banned both and hired an army to monitor its homegrown equivalents. The state-owned China Internet Investment Fund took a controlling stake in Sina Weibo, the country’s most popular social media app, and a major stake in ByteDance, the parent of Douyin, China’s TikTok equivalent. After the 2011 protests in Moscow, the Russian government helped oust the original founder of VK, the country’s leading domestic social media service; a loyal oligarch was installed in 2022. This grip over social media boosted the party line in China and increased support for Russia’s invasion of Ukraine.
While tipping the scales of recommendation engines to favor certain narratives has proven possible for authoritarians, generative AI represents a different beast. With its ability to produce fully synthesized content, AI has the potential to expose users to narratives that are new and, from the authoritarians’ perspective, dangerous.
AI could encourage users to circumvent the censors. It is on the path to becoming a crucial assistant for everyday tasks, whereas social media remains largely a distraction in our daily lives. When that trusted advisor lets slip the truth about the “special military operation” or the massacre at Tiananmen Square, the audience will listen.
Ben Dubow is a Nonresident Fellow at CEPA and the founder of Omelas, which specializes in data and analysis on how states manipulate the web.
Bandwidth is CEPA’s online journal dedicated to advancing transatlantic cooperation on tech policy. All opinions are those of the author and do not necessarily represent the position or views of the institutions they represent or the Center for European Policy Analysis.