UN expresses fear that 'Israel' is using AI in airstrikes on Gaza
Dubbed ‘Lavender’, the system analyzes the personal data of Gaza’s residents and picks out targets.
At a press briefing at UN headquarters in New York, UN Secretary-General Antonio Guterres voiced concerns on Friday over recent reports that "Israel" is using artificial intelligence in its airstrikes on Gaza.
A report published earlier this week by the Israeli-Palestinian +972 Magazine, citing Israeli intelligence sources, revealed that the IOF used AI to locate targets for bombing, including in densely populated residential areas, leading to mass civilian casualties.
Dubbed ‘Lavender’, the system analyzes the personal data of Gaza’s residents and lists those suspected of ties to the Palestinian Resistance group Hamas and Palestinian Islamic Jihad (PIJ).
Israeli utilization of sophisticated AI technology in its genocidal campaign in Gaza marks new territory in modern warfare, adding to the legal and ethical scrutiny and reshaping the dynamics between military personnel and automated systems.
One intelligence officer who used Lavender said yesterday, "The machine did it coldly. And that made it easier."
“I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time,” said another soldier.
'Silence the guns' and use AI instead?
Marking six months since the start of the war, Guterres expressed being “deeply troubled” by the reports, adding: “No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms”.
“Six months on, we are at the brink: of mass starvation; of regional conflagration; of a total loss of faith in global standards and norms. It’s time to step back from that brink – to silence the guns – to ease the horrible suffering,” he said.
Read next: 'Israel' militarizes AI in the West Bank, Gaza, Syria: Bloomberg
“I have warned for many years of the dangers of weaponizing AI and reducing the essential role of human agency. AI should be used as a force for good to benefit the world; not to contribute to waging war on an industrial level, blurring accountability.”
"Israel" has not yet acknowledged or admitted the existence of ‘Lavender,’ but the system is notorious for being used in previous operations in Gaza.
In response to the report, the IOF claimed that it “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” asserting that the “system... is simply a database” used to cross-reference intelligence sources.
In its statement, the IOF sought to divert attention from the claims, saying that Hamas intentionally positions its members “in the heart of the civilian population,” using civilians “as a human shield.”
It further claimed that its strikes are directed solely at military targets, in an effort to demonstrate compliance with international law, but that given the density of the population, it is hard to avoid “exceptional incidents”.
Read more: Report details how every Israeli kill in Gaza was planned, intentional