Commentary
by Emily Harding and Julia Dickson
Published December 11, 2024
This series—led by the Futures Lab and featuring scholars across CSIS—explores emerging challenges and opportunities likely to shape peace negotiations to end the war in Ukraine. All contributions in the series can be found by visiting Strategic Headwinds: Understanding the Forces Shaping Ukraine’s Path to Peace.
“You have killed two generations of your people to launder our tax dollars. Soon you will have judgement day.”
“But you don’t represent Ukraine. Only the Kiev Junta that tries to rule the rest by violence.”
“Time for Russia to finish Ukraine . . . Time to arrest Zelenskyy for exploiting the war to enrich himself. Time for Ukraine to become Russian soil.”
—Posts exhibiting extremist views on Ukraine have proliferated on X
Negotiations for peace in Ukraine will face a headwind in the form of ones and zeros. In the run-up to talks, real critics and bots are likely to unleash a torrent of anger on social media, screaming from both ends of the spectrum. Politicians can easily interpret the screaming as reflective of public opinion, whether those opinions are real, represent a tiny minority, or are generated by Russian bots.
Moscow, Beijing, and others will have a strong incentive to shape public perceptions of a deal, with the goal of raising pressure on Ukrainian president Volodymyr Zelenskyy and shaping the deal itself. In response, just as the United States and its allies worked to counter Russian disinformation before the 2022 invasion of Ukraine, those allied governments must now work to counter Russian attempts to shape the information space to benefit Moscow’s position in a negotiation. Further, Zelenskyy must reinvigorate his own stellar public relations efforts to counterbalance uninformed or nefarious critics.
The Public Square or a Tragedy of the Commons?
Disinformation works best when it hides in plain sight, cloaked in a veneer of “truthiness,” leading audiences to accept what they see online as fact rather than evaluating content for accuracy. One false idea can easily spread to millions of people on social media, quickly affecting global decisions and perceptions. The Covid-19 pandemic served as a natural test bed for disinformation and proved how deadly it can be. Erroneous ideas about how to cure or prevent Covid-19 included that patients should ingest bleach, use cocaine, or drink water every 15 minutes. According to the World Health Organization, in the first three months of 2020, approximately 6,000 people were hospitalized and 800 people died due to such misinformation. Similarly, false information affected people’s willingness to receive the Covid-19 vaccine, also resulting in preventable deaths. False rumors included that the vaccine causes fertility problems, alters people’s DNA, or contains a tracking microchip. In 2022, Brown University estimated that nearly one-third of the United States’ 1 million pandemic deaths could have been prevented if people had been vaccinated. While it is impossible to know how many people refused vaccination because they believed the disinformation, the conspiracy theories likely cost lives.
Similarly, a bot farm’s thousands of accounts can scream a message until it seems like mainstream opinion. In a recent example, a bot farm and thousands of coordinated TikTok profiles elevated a little-known candidate’s online popularity and helped him win an election. In late November 2024, Călin Georgescu, a relatively unknown, ultranationalist, and pro-Russia candidate, won the first round of Romania’s presidential election, shocking the country and raising questions about the role of social media platforms. The Romanian Intelligence Service (SRI) confirmed the activity of domestic bot farms, troll operations, and fake social media accounts. SRI highlighted that Georgescu was “massively promoted on . . . TikTok through coordinated accounts, recommendation algorithms and paid promotion,” and researchers attribute his win to his massive online presence.
Disinformation surrounding peace talks has the potential to combine the worst of these worlds. Social media can quickly advance a narrative that paints the parties as betraying their country or giving up too many concessions. Those seeking to derail negotiations can spread deepfakes purporting to show ceasefire violations. Conversely, social media can suggest a false consensus pushing for peace at any price, when the bots just scream the loudest.
In negotiations to end the war in Ukraine, Moscow will have every incentive to unleash its information war reserves. Bot farms are likely to start churning out propaganda that the United States and Europe have abandoned Zelenskyy, that the Ukrainian people have lost faith in him as a leader, or that those in the Donbas have decided to end the conflict and join with Russia. Fringe propaganda jumps to the mainstream when news services report what “people are saying” on social media, amplifying messages that may be fake and giving a veneer of credibility to nonsense.
Tune Out the Noise and Help the Truth Get Its Boots On
Zelenskyy must recognize this pressure as merely a distraction and tune it out. He should focus his efforts on playing to his astonishing strength: communicating. As he makes decisions about negotiating or not negotiating, he needs to aggressively communicate the “why” behind those decisions to his people and the world. As the adage goes, “A lie can get halfway around the world before the truth gets its boots on,” but Zelenskyy is better than most at breaking through the lies with his messages. Meanwhile, he should have trusted, hardened, skeptical staff serving as a filter for any incoming interpretations of public opinion. He should stay away from the news.
The United States has historically been terrible at fighting disinformation. Washington is too slow, too careful, and too over-lawyered to respond quickly to a sophisticated information campaign. (The one exception is the National Park Service, which is on point with its social media.) But the United States is far better at anticipating likely information campaigns and “pre-bunking.” Washington should focus tightly on getting ahead in the information game before negotiations begin, in close coordination with Zelenskyy. It should pre-bunk likely Russian information operations around a loss of Ukrainian public confidence in the government or, especially, a falloff in U.S. and European support for the Ukrainian war effort.
Once, the victor in a conflict got to write the history. Today, that history is written in real time, as events unfold, and it will shape events in a way it never has before. It should be a high priority of the United States, Europe, and Ukraine to drive the narrative, rather than be driven by it.
Emily Harding is the director of the Intelligence, National Security, and Technology Program and vice president of the Defense and Security Department at the Center for Strategic and International Studies in Washington, D.C. Julia Dickson is a research associate for the Intelligence, National Security, and Technology Program at CSIS.
Commentary is produced by the Center for Strategic and International Studies (CSIS), a private, tax-exempt institution focusing on international public policy issues. Its research is nonpartisan and nonproprietary. CSIS does not take specific policy positions. Accordingly, all views, positions, and conclusions expressed in this publication should be understood to be solely those of the author(s).
© 2024 by the Center for Strategic and International Studies. All rights reserved.