Ukraine’s drone war accelerates AI weapons race: FT
AI-powered drones in Ukraine are transforming warfare, operating with limited human input amid ethical and regulatory concerns, according to the Financial Times.
Helsing's HX-2 electrically propelled X-wing precision munition, undated. (Helsing)
According to a recent Financial Times (FT) report, experts say the advancement of artificial intelligence (AI) is rapidly pushing warfare toward a future where machines could one day make lethal decisions on their own.
Earlier this month, Ukrainian forces claimed they took down a Russian fighter jet using missiles launched from an unmanned naval drone.
It was a striking example of how the Ukraine war has become a proving ground for advanced military technologies, especially drones powered by AI, according to the FT.
Today, Russian air defense systems intercepted and destroyed 99 Ukrainian fixed-wing drones in what the Defense Ministry described as a major escalation in drone warfare. In October, Russian air defenses shot down 110 Ukrainian drones across the Kursk, Lipetsk, Oryol, Nizhny Novgorod, Belgorod, Bryansk, and Moscow regions.
The Financial Times report argues that AI now plays a growing role in Ukraine’s war strategy. In 2024 alone, the country reportedly obtained nearly 2 million drones, of which 10,000 were AI-enabled, according to Kateryna Bondar, a fellow at the Center for Strategic and International Studies and former adviser to the Ukrainian government.
What makes AI drones different?
AI-enabled drones are built to operate with more independence. Some are improvised from consumer-grade parts and open-source AI software, while others are state-of-the-art platforms built by Western defense firms like Anduril, Shield AI, and the German startup Helsing, according to the Financial Times.
Despite the wide range in price and performance, the goal is the same: to give drones the ability to navigate, recognize targets, and act, even when communication with a human operator is lost.
As reported by the Financial Times, “in its simplest definition, an AI-enabled drone is one where certain core tasks have been handed off to artificial intelligence,” said Ned Baker, the UK managing director for Helsing, which recently sold 6,000 HX-2 attack drones to Ukraine after delivering 4,000 HF-1s.
AI has become especially important in Ukraine because electronic warfare systems frequently jam radio and GPS signals, making manual drone operation impossible. In such cases, AI steps in to replace human control, allowing drones to fly, detect, and even collaborate with other drones without a live connection to an operator, according to the Financial Times.
A leap toward battlefield autonomy
According to the report, these drones rely on technologies like computer vision, similar to what is used in commercial drones for sports or filming. But instead of following pre-set routes, these systems can detect and interpret their surroundings, enabling them to make decisions in real time.
Bondar notes that while no drones in Ukraine are fully autonomous yet, the underlying tech makes that possibility increasingly realistic, as reported by the FT.
The big difference from earlier "autonomous" weapons, like missiles that follow preprogrammed paths, is that AI drones can analyze, react, and adapt without needing fixed instructions. They can also be updated frequently; Helsing, for example, pushes software updates every two weeks to let battlefield users unlock new capabilities, according to the Financial Times.
Possible challenges
Not all experts are convinced the tech is ready. As reported by the Financial Times, Nick Reynolds, a defense researcher at the UK think tank RUSI, says that while manufacturers make bold claims, many still lack the real-world combat data needed to train and refine AI systems. And even if they have access to it, the cost and complexity of AI hardware make it hard to build reliable drones at scale, especially when most are expected to be disposable.
Bondar adds that the use of "primitive" tethered drones, which are connected by fiber-optic cables to bypass jamming, is a sign that AI software is still far from perfect and demands a lot of time, money, and technical expertise to get right, according to the Financial Times.
Ethical fears and regulatory gaps
Perhaps the most troubling questions are ethical. What happens when machines are allowed to kill without human approval? For now, Ukraine’s AI drones operate with a human “in the loop,” meaning a person must sign off on targeting decisions.
But both Bondar and Reynolds say we’re entering a “transition period,” where machines take on more responsibility, and oversight may slip.
“We’re going through a shift,” said Helsing’s Baker, as reported by the Financial Times. “It’s no longer 100% human, and not yet 100% AI, but it’s heading that way.”
Reynolds warns that the global debate on regulation hasn’t caught up with reality. The pace of innovation, and the urgency of war, often outstrip ethical concerns, and few international frameworks exist to limit or control autonomous weapons development, according to the Financial Times.
Despite these challenges, AI-powered drones are now on the agenda of nearly every major and mid-sized military power, according to Baker. And as both sides in Ukraine continue to test and deploy the latest tech, the results could shape the future of warfare far beyond this conflict.
As reported by the Financial Times, “AI on the battlefield,” Baker says, “will be as important as the invention of gunpowder or the tank, though in ways we’re only just beginning to understand.”