Pope urges ban on 'lethal autonomous weapons' in G7 speech
The Pope has lambasted the arms industry and those who profit from war and murder.
Pope Francis called for a prohibition on "lethal autonomous weapons" on Friday, becoming the first pontiff to address the G7 with a speech on the dangers of AI.
The religious leader, who has consistently lambasted those "who profit from war like the arms industry," said it was "urgent to reconsider the development and use of devices like the so-called 'lethal autonomous weapons' and ultimately ban their use," adding that this begins with "effective and concrete" human control.
"No machine should ever choose to take the life of a human being," he told the summit in Puglia, southern Italy.
Francis was asked to speak at the Group of Seven summit by Italian Prime Minister Giorgia Meloni.
AI has been shown to be quicker, but not necessarily safer or more ethical, and the development of weapons systems capable of killing without human interaction raises ethical and legal concerns.
"Artificial intelligence is at the same time an exciting and fearsome tool," Francis stated, adding that humanity would be condemned to a future without hope if people's ability to make decisions were taken away, "dooming them to depend on the choices of machines."
In April, a group of UN experts discussed the intentional use of artificial intelligence (AI) by "Israel" in its war on Gaza and its genocidal campaign.
"Six months into the current military offensive, more housing and civilian infrastructure has now been destroyed in Gaza as a percentage, compared to any conflict in memory," they expressed in a statement.
According to a recent probe by +972 Magazine and Local Call, Unit 8200 is now utilizing a system known as "Lavender" to locate targets for bombing, which has been deployed in areas including densely populated residential ones, leading to mass civilian casualties.
Israeli utilization of sophisticated AI technology in its genocidal campaign in Gaza marks new territory in modern warfare, adding to the legal and ethical scrutiny and reshaping the dynamics between military personnel and automated systems.
Lavender analyzes the personal data of Gaza's residents and lists those suspected of ties to the Palestinian Resistance groups Hamas and Palestinian Islamic Jihad (PIJ).
One intelligence officer who used Lavender said, "The machine did it coldly. And that made it easier."
Israeli intelligence sources revealed that they used a previously undisclosed #AI-powered database, dubbed #Lavender, during the #Gaza genocide and admitted to targeting civilians to reach Resistance fighters.
— Al Mayadeen English (@MayadeenEnglish) April 7, 2024
The report describes how the IOF sifts through surveillance data provided by the NSA to generate long kill lists for targets.
Unit 8200's targeting of innocent Palestinians in the occupied territories became so harsh that in 2014, 43 veterans of the Israeli unit publicly accused the organization of shocking atrocities, saying they had a "moral duty" to no longer partake in the "political persecution" of innocent Palestinians.
In media interviews and personal accounts, they divulged that data on Palestinians' sexual orientations, infidelities, financial issues, family illnesses, and other private matters were collected to coerce Palestinians into becoming collaborators or cause divisions in their society.
The investigation found that the IOF intentionally bombed entire families after AI algorithms indicated a single targeted individual was present. "Thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program's decisions."