VATICAN CITY (CNS) — The use of “killer robots” and other lethal autonomous weapons systems violates international treaties because innocent civilians could be erroneously targeted, the Vatican said during a U.N. meeting in Geneva.
The prospect of “swarms of ‘kamikaze’ mini drones” and other advanced weaponry using artificial intelligence raises “serious implications for peace and security,” the Vatican permanent observer mission to U.N. agencies in Geneva said in a statement Aug. 3 to the 2021 Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS).
“The use of swarms in urban areas could lead to high risks for civilians,” the statement said. “If functioning without any direct human supervision, such systems could make mistakes in identifying the intended targets due to some unidentified ‘bias’ induced by their ‘self-learning capabilities’ developed from a limited set of data samples.”
For years, the Vatican, particularly the observer mission in Geneva, has warned against the use and development of LAWS, or so-called killer robots, which include military drones, unmanned vehicles, tanks and artificially intelligent missiles.
At the August meeting, the Vatican mission said lethal autonomous weapons systems could potentially violate current international humanitarian conventions and treaties, which emphasize the need for “interpretation, good faith and prudential judgment” during armed combat.
“These aspects are, in part, informed by and based on the evolving context of operations, for which the human person is irreplaceable,” the statement said.
The use of advanced weaponry devoid of human reason when applying the principles of “distinction, proportionality, precaution, necessity and expected military advantage” during combat could lead to violations of established rules of engagement, the Vatican said.
Lethal autonomous weapons systems, “equipped with self-learning or self-programmable capabilities, necessarily give way to a certain level of unpredictability, which could, for instance, ‘deviate’ into actions targeting non-combatants in order to maximize efficiency, thus flouting the principle of distinction,” it said.
Furthermore, the Vatican noted the concerns of scientists, engineers, researchers, military leaders and ethicists, as well as “employees and entrepreneurs objecting on ethical grounds to certain projects dealing with the weaponization of artificial intelligence,” which “attest to the far-reaching implications” of using such advanced weaponry.
The Vatican’s permanent observer mission said that while a lethal autonomous weapons system may be considered acceptable, there are “still behaviors that international humanitarian law prohibits, or that, although not explicitly prohibited, remain forbidden by the dictates of morality, by spiritual values, experience and soldierly virtues.”
Moreover, the Vatican said, “the end does not justify the means used to achieve it.”