“This technology is our future threat,” warns Serhiy Beskrestnov as he examines a newly captured Russian drone. Unlike conventional weapons, it uses artificial intelligence to locate and strike targets without human input.
Beskrestnov, a consultant for Ukraine’s defence forces, has analysed countless drones since the invasion began. This model stands out. It neither sends nor receives signals, making it impossible to jam or detect.
Both Russian and Ukrainian forces now deploy AI on the frontlines. They use it to locate enemy positions, analyse intelligence, and clear mines faster than ever before.
AI becomes Ukraine’s force multiplier
Artificial intelligence has become essential for Ukraine’s military. “Our forces receive over 50,000 video streams from the front every month,” says Deputy Defence Minister Yuriy Myronenko. “AI analyses the footage, identifies threats, and maps them for commanders.”
The technology accelerates decision-making, optimises resources, and reduces casualties. Its most visible impact comes from unmanned systems. Ukrainian troops now operate drones that lock onto targets and fly autonomously during the final stage of attacks.
These drones are nearly impossible to jam and extremely difficult to shoot down. Experts predict they will evolve into fully autonomous weapons capable of finding and eliminating targets independently.
Drones that act independently
“All a soldier needs to do is press a button on a smartphone,” explains Yaroslav Azhnyuk, CEO of Ukrainian tech company The Fourth Law. “The drone finds its target, drops explosives, assesses the damage, and returns to base. No piloting skills are required.”
Azhnyuk believes these drones could dramatically strengthen Ukraine’s air defences against Russian long-range drones like the Shaheds. “A computer-guided system can outperform humans,” he says. “It reacts faster, sees more clearly, and moves more precisely.”
Myronenko admits fully autonomous systems are still in development but says Ukraine is close. “We have partly integrated it into some devices,” he adds. Azhnyuk predicts thousands of these drones could be operational by the end of 2026.
Progress carries risks
Full automation brings serious dangers. “AI might not distinguish a Ukrainian soldier from a Russian one,” warns Vadym, a defence engineer who requested anonymity. “Their uniforms often look identical.”
Vadym’s company, DevDroid, produces remotely controlled machine guns that use AI to detect and track targets. Automatic firing is disabled to prevent friendly fire. “We could enable it,” he says, “but we need more field experience and feedback before trusting it fully.”
Ethical and legal concerns remain. Can AI follow the laws of war? Will it recognise civilians or surrendering soldiers? Myronenko stresses humans must make the final decision, even if AI assists. Yet he warns that not all militaries will act responsibly.
The global arms race escalates
AI is driving a new type of arms race. Traditional defences—jamming, missiles, or tanks—struggle against swarms of intelligent drones.
Ukraine’s “Spider Web” operation last June, when 100 drones struck Russian air bases, reportedly relied on AI coordination. Many fear Moscow could copy the tactic, both on the frontlines and deeper inside Ukraine.
President Volodymyr Zelensky told the United Nations that AI is fuelling “the most destructive arms race in human history.” He called for urgent global rules on AI weapons, stressing the issue is “as urgent as preventing the spread of nuclear arms.”
