Top AI engineers from defense technology companies defended the need for autonomous weapons on Thursday amid a growing movement to ban so-called “killer robots.” More than 115 countries and 250 nongovernmental organizations are calling for an international treaty to ban weapons that use artificial intelligence to identify and attack human targets. U.N. Secretary-General Antonio Guterres has called the technology “morally abhorrent.”
“The battlefield is awash in data. Whoever can harness this data, understand it and make sense of it faster will have a tactical advantage,” Antoine Bordes, head of artificial intelligence at European defense startup Helsing, said at a conference at Sciences Po in Paris.
“The role of AI is not just to make faster decisions, but to make more accurate decisions,” said Megha Arora, responsible AI product lead at military software developer Palantir. “There are always ethical considerations on a case-by-case or customer-by-customer basis when we decide where to operate and what workflows to enable.”
European aerospace giant Airbus said on Wednesday it is partnering with Helsing GmbH to develop unmanned military aircraft.
Airbus senior engineer Bernhard Kraach praised EU defense agencies for their “heavy commitment” to working on the responsible use of AI. He pointed to existing safety measures, saying, “In the military domain, risk assessments are carried out.” Bordes agreed, saying weapons such as planes and drones “already have safety guidelines, standards and certification processes in place.”
Arora said existing international humanitarian law “provides a legal basis for the safe and appropriate use of autonomous weapons systems, but lacks implementation guidance.”
“When you think about proportionality, you can’t program it into code,” she said.
Weapons with autonomous capabilities are already being used in combat, including Ukraine’s Saker drone, Russia’s Lancet loitering munition and Turkey’s Kargu drone, according to the Autonomous Weapons Monitor.
Meanwhile, Israel is under investigation over its use of “Lavender,” software that reportedly uses AI to recommend suspected militants as targets for attacks in Gaza.
In January, Palantir signed a “strategic partnership” with the Israeli Ministry of Defense to “supply technology in support of the Israeli war effort.”
This week on Tech 24, we take a closer look at autonomous weapons, and in particular the EU’s appetite for regulation around them.