Case Study

AI has accelerated shifts in battlefield dynamics, and policymakers are playing catch-up

The war in Ukraine has become a testing ground for new technology, in particular demonstrating how artificial intelligence (AI) can be used to great effect. But it has also highlighted weaknesses in how governments and the defense industry adopt, deploy, and control AI-based technology.

AI has been used in several ways in the Ukraine war: in broad strategic decision-making, in acting on real-time or recent intelligence at the local level, and in more mundane tasks, such as predicting logistical challenges. A fourth use involves information warfare. This is a way of leveraging AI to, in the words of Matthew Ford of the Swedish Defence University in Stockholm, coauthor of a book on the battlefield digital explosion, Radical War: Data, Attention and Control in the 21st Century, "shape how narrative construction works."

Speed

But even as the war has shown that AI can help armies monitor enemy movements and deliver payloads remotely and autonomously, it has also accelerated shifts in battlefield dynamics. Forces quickly alter their tactics, techniques, and procedures, either to leverage the new technology or to mitigate its impact.

A failure to adapt quickly can be seized on by an agile foe. When Russian soldiers and pilots communicated without encrypting their conversations, Ukraine developed AI-based voice recognition and translation software to monitor these communications and extract actionable intelligence. And even when countermeasures are adopted, each side must be ready to rethink and enhance its technology as rapidly as the other. When Russia introduced electronic jamming to thwart Ukraine’s combat drones, for example, Kyiv’s cadre of programmers developed an AI tool to help its drones evade Russian jamming and stay locked on target.

Unpiloted drones

This technological arms race is strikingly different from how many military thinkers envisioned the deployment of AI. For one thing, the principles behind unpiloted aerial vehicles, or UAVs, have not changed significantly since the 1990s. But in Ukraine the range of drones, and their capabilities, has evolved rapidly, largely by coupling them with continuous advances in AI. While the military-grade Turkish-made Bayraktar TB2 drone played a key role in Ukraine's defense in the early months of the 2022 invasion, it became less useful as Russia upgraded its air defense and electronic warfare capabilities.

With more permanent battle lines drawn later in the year, Ukraine pushed its drone makers to adapt. The result has been a succession of improved and diverse devices. In September, for example, Kyiv approved the deployment of homegrown Saker Scout drones, which can detect enemy targets often missed by the human eye, even when hidden under camouflage.

This emphasis on rapid evolution has helped change thinking among military strategists, says Lauren Kahn, senior research analyst at Georgetown University's Center for Security and Emerging Technology (CSET). Despite excitement about AI in military circles since 2021, if not earlier, practical examples were either hypothetical or project based. "That changed after Ukraine," she says. Planners began to see that AI was not just a box to tick: it raised a series of searching questions about what would make it useful, from data to knowledge of one's own forces and the enemy's to testing and evaluation procedures. The creative way Ukraine has developed drone technology is something "no one could have imagined," she says.

Data

The Ukraine war has highlighted the importance of data—the fuel that powers AI—but has also raised troubling questions for policymakers and planners. Ukraine understood early that what constituted data in a war had shifted. It quickly reconfigured a government app for filing taxes to also allow citizens to upload photos, videos, and other details about Russian troops and positions to a database run by the military.

It combined commercially available satellite images with classified data from its allies, as well as from hacking into Russian surveillance cameras and from its own fleet of drones. But for all this data to become actionable intelligence, Kyiv had to turn to private tech companies—the most visible was Palantir, a US company specializing in big data analytics. Palantir’s involvement extended the role a private company might play in processing sensitive data, especially during a war. Its chief executive, Alex Karp, is on record as saying the company is responsible for most of the targeting in Ukraine. According to CSET’s Kahn, “It’s almost like a full service they provide, which I think has proved invaluable.”

What hasn’t been fully considered, at least publicly, are the implications. Private companies, says the Swedish Defence University’s Ford, are going to be crucial, because they are the only organizations that can develop the kind of AI armed forces can use. But, he asks, “Once it’s out there, where does it go next? How’s it going to be controlled, shaped, or directed?”

Digital battlefield

The war also introduces another aspect of AI and data. "The Ukraine-Russia war is the most documented war in history," says Andrew Hoskins, professor of global security at the University of Glasgow and Ford's Radical War coauthor. Telegram, the social media platform now used by three-quarters of Ukrainians and well over a third of Russians to share videos and photos as the war plays out in front of them, "is the digital battleground of this war," he says. That information is not being uploaded only to army and intelligence servers; NGOs and investigators are also mining it to catalog human rights abuses for future war crimes trials. AI, too, is improving what can be seen and extracted, says Hoskins. When you apply AI to these archives, "you start to find things you never anticipated."

JEREMY WAGSTAFF is a technology and media consultant and former journalist at the BBC, Reuters, and the Wall Street Journal.

Opinions expressed in articles and other materials are those of the authors; they do not necessarily reflect IMF policy.