How Israel is using AI to identify bombing targets in Gaza

An investigation by the Israeli publications +972 Magazine and Local Call revealed that Israel’s military has been using artificial intelligence to rapidly select bombing targets in Gaza, prioritizing speed over precision and contributing to the deaths of thousands of civilians.

The report details the development of the AI system, dubbed Lavender, in response to the Hamas attacks of October 7th. Lavender purportedly identified 37,000 Palestinians in Gaza as suspected “Hamas militants” and approved them for assassination.

While the Israeli military denied the existence of a kill list, it confirmed Lavender’s existence, describing it as merely a tool to assist analysts in the target identification process.

However, interviews with Israeli intelligence officers revealed a concerning lack of independent examination of targets: officers instead acted as a “rubber stamp” for the machine’s decisions.

Training data and target identification

Lavender’s training dataset included data on known Hamas and Palestinian Islamic Jihad operatives, as well as loosely affiliated individuals, such as employees of Gaza’s Internal Security Ministry.

The system identified features associated with Hamas operatives, such as membership in certain WhatsApp groups or frequent changes of phone or address. Its reported accuracy rate of 90 per cent, however, meant roughly one in ten targets was misidentified, with individuals who merely shared a name or connection with an operative mistakenly marked for strikes.

Intelligence officers were reportedly granted wide latitude on civilian casualties, with initial directives permitting the killing of up to 15 or 20 civilians for every lower-level Hamas operative targeted by Lavender.

Civilian casualties were exacerbated by a second system, “Where’s Daddy?”, used to strike suspected Hamas operatives while they were in their homes, often killing entire families.

Critics argue that Lavender is emblematic of Israel’s broader use of surveillance technologies on Palestinians, raising concerns about mass surveillance and civilian deaths during conflicts.

Similar technologies, including mass facial recognition and building identification systems, have also contributed to civilian casualties in Gaza.

Mona Shtaya, a non-resident fellow at the Tahrir Institute for Middle East Policy, warns that such technologies risk perpetuating collective punishment policies against Palestinians, emphasizing the need for accountability and scrutiny in wartime operations.

The Israeli military has yet to respond to these specific allegations. The revelations have sparked renewed debate over the ethical use of AI in warfare and the protection of civilian lives in conflict zones like Gaza.

Earlier, during a Security Council session, the head of the UN Relief and Works Agency for Palestine Refugees warned against the potential disbandment of the agency, stressing its critical role for millions of Palestinians.
