HRW warns IOF use of AI, digital tools risks civilian lives in Gaza
The HRW report highlights that the AI tools implicated in the killing of civilians in Gaza rely on flawed data and imprecise methods, potentially violating international humanitarian law.
Human Rights Watch has raised concerns about the Israeli occupation's use of surveillance technologies, artificial intelligence (AI), and other digital tools that maximize civilian death and destruction during the war on Gaza.
These tools, which "Israel" claims are intended to assess civilian presence, guide attack timing, and differentiate between civilians and combatants, are increasing the risk of civilian harm.
The report highlights that the tools rely on flawed data and imprecise methods, potentially violating international humanitarian law. HRW criticizes both the design and the application of these tools as potential causes of unlawful civilian casualties. The tools also involve extensive surveillance and the use of personal data, raising further legal and ethical concerns.
Zach Campbell, senior surveillance researcher at Human Rights Watch, stated that "Israel" is using "incomplete data, flawed calculations, and tools not fit for purpose to help make life and death decisions in Gaza, which could be increasing civilian harm," pointing to problems in the design and use of the tools.
HRW assessed the four tools using Israeli officials' statements, previously unreported material, media reports, and expert and journalist interviews. This material, while partial, includes crucial facts regarding how these tools work, how they were designed, what data they utilize, and how they might help military decision-making.
The first is an evacuation monitoring tool based on mobile phone tracking to monitor Palestinian evacuations from northern Gaza. According to HRW, this tool may be compromised due to damage to Gaza's communications infrastructure by indiscriminate Israeli strikes, rendering it less reliable for military decisions.
The second tool, The Gospel, generates lists of buildings and structures to be targeted.
Lavender, the third tool, assigns ratings to individuals to assess their alleged involvement in the Palestinian Resistance movements. The fourth, Where's Daddy?, identifies when a target is at a specific location, often their family home, for potential attacks.
The Gospel and Lavender algorithms are prone to biases and inaccuracies, potentially leading to wrongful targeting of civilians. HRW highlights that algorithmic outputs often reflect programmer biases and can be excessively trusted despite the incomplete data they rely on.
Israeli utilization of sophisticated AI technology in its genocidal campaign in Gaza marks new territory in modern warfare, adding to the legal and ethical scrutiny and reshaping the dynamics between military personnel and automated systems.
The organization called for impartial investigations to assess whether these tools have unlawfully contributed to civilian harm.
Campbell emphasized that “the use of flawed technology in any context can have negative human rights implications, but the risks in Gaza couldn’t be higher. The Israeli military’s use of these digital tools to support military decision-making should not be leading to unlawful attacks and grave civilian harm."
Back in April, a group of UN experts discussed "Israel's" deliberate use of AI in its war on Gaza and its genocidal campaign.
The experts cited the UN Special Rapporteur on the Occupied Palestinian Territory, Francesca Albanese, who said in her recent report to the Human Rights Council that the systematic destruction of housing, services, and civilian infrastructure constitutes a crime against humanity and domicide, alongside numerous war crimes and acts of genocide.
They added that "Israel's" intentions clearly go well beyond eliminating the Palestinian Resistance movement Hamas, given the widespread attempts to force people to leave Gaza.
"If proven true, the shocking revelations of the use of AI systems by the Israeli military such as 'Gospel', 'Lavender' and 'Where's Daddy?', combined with lowered human due diligence to avoid or minimize civilian casualties and infrastructure, contribute to explaining the extent of the death toll and home destruction in Gaza," the experts continued.