Breezy Explainer: What is Lavender? AI behind Israel’s controversial kill lists in Gaza

It has been six months since Israel launched its war against Hamas following the group's heinous attacks on October 7, 2023. In the days after the surprise attack, the Israel Defense Forces (IDF) launched a bombing campaign in the Gaza Strip, intending to eliminate Palestinian militants. Over 30,000 people have died as a result of this campaign, which has left the Gaza Strip in ruins. It is now claimed that these operations were carried out with the assistance of a secret artificial intelligence system called Lavender, which generated kill lists in Gaza.

Israel has denied the allegations, claiming that it does not use AI to identify people as targets for military operations. “Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” the IDF said in a statement. “Information systems are merely tools for analysts in the target identification process.”

As the topic gains traction, we take a closer look at the so-called AI system in question and how Israel allegedly employed it in its ongoing campaign against Hamas.

What is the Lavender AI program?

Israel’s suspected use of an AI program known as Lavender came to light through a joint investigation by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language site Local Call, which drew on the accounts of six Israeli intelligence officers involved in its use and development.

According to the investigation, Lavender used broad parameters to select potential targets, marking approximately 37,000 people as candidates for possible air strikes. Military sources said it employed machine learning to identify characteristics associated with militants, assigning individuals a score from 1 to 100 based on signals such as association with suspected militants and frequently changing phones.
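The reporting does not describe Lavender’s internals, only that it combined behavioral signals into a single 1-to-100 rating. Purely as an illustration of how a weighted scoring scheme of this general kind can work, here is a minimal Python sketch; every feature name and weight below is hypothetical and is not drawn from the investigation:

```python
# A generic weighted feature score scaled to a 1-100 range.
# All feature names and weights are hypothetical illustrations;
# the published reporting does not disclose how Lavender computes its ratings.

HYPOTHETICAL_WEIGHTS = {
    "ties_to_flagged_contacts": 0.5,   # association with suspected militants
    "frequent_phone_changes": 0.3,     # one signal named in the reporting
    "other_behavioral_signals": 0.2,   # placeholder for everything else
}

def score(features):
    """Combine normalized feature values (0.0 to 1.0) into a 1-100 score."""
    raw = sum(weight * features.get(name, 0.0)
              for name, weight in HYPOTHETICAL_WEIGHTS.items())
    return max(1, min(100, round(raw * 100)))

# Example: strong contact ties plus frequent phone changes -> score of 70
print(score({"ties_to_flagged_contacts": 0.8, "frequent_phone_changes": 1.0}))
```

The relevant point is that a system like this reduces noisy behavioral correlations to a single number per person, which is why the question of how thoroughly humans review each score, raised later in this piece, matters so much.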

Lavender and The Gospel: Israel’s AI for targeting in Gaza

The +972 Magazine-Local Call investigation also found that the Lavender program operated alongside another AI system, The Gospel. The primary distinction between the two lies in what they designate as targets: while The Gospel marks buildings and structures from which the army alleges militants operate, Lavender marks individuals, placing them on a kill list.

According to the joint investigation, the program was designed by Unit 8200, the Israel Defense Forces’ elite intelligence division, a body analogous to the US National Security Agency and the UK’s GCHQ.

According to officials, the demand for such a program arose immediately after the attacks of October 7, 2023. Citing intelligence officers, the investigation reported that commanders demanded a continuous supply of targets.

“We were constantly being pressured: ‘Bring us more targets.’ They shouted at us,” said one intelligence officer to The Guardian. “We were told: now we have to f**k up Hamas, no matter what the cost. Whatever you can, you bomb.”

To meet their commanders’ demands, intelligence officers came to rely on Lavender, which generated a database of people assessed as having the characteristics of militants.

How Israel used Lavender

First-hand testimony from Israeli intelligence personnel who worked with Lavender has exposed how Benjamin Netanyahu’s forces employed this technology in the ongoing conflict against Hamas.

According to the officers, Lavender played a central role in the campaign, analyzing massive amounts of data to rapidly identify “junior” operatives to target. A report in The Guardian cited four sources who said that, at one point during the war, Lavender designated as many as 37,000 Palestinians as affiliated with either Hamas or Palestinian Islamic Jihad.

Yuval Abraham, the author of the investigation, wrote, “During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based.

“One source stated that human personnel often served only as a ‘rubber stamp’ for the machine’s decisions, adding that, normally, they would personally devote only about ‘20 seconds’ to each target before authorizing a bombing.”

How Israel’s AI led to civilian losses in Gaza

The magazine also reported that the Israeli army “systematically attacked” targets in their homes, usually at night when entire families were present. “The result being that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions,” it wrote.

The investigation also reported that the IDF permitted the killing of 15 to 20 civilians for every junior Hamas operative identified by Lavender, and up to 100 civilians for each senior operative. “We had a calculation for how many [civilians could be killed] for the brigade commander, how many [civilians] for a battalion commander, and so on,” one insider told The Guardian.

IDF condemns the coverage as baseless

Israel, however, has rejected the findings of the +972 Magazine-Local Call investigation. The IDF denies using AI to create kill lists, though notably, it has not denied the existence of the Lavender program. In a lengthy statement, it said: “The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist.”

“Information systems are merely tools for analysts in the target identification process.”

It also asserted that Lavender was essentially a database whose function was to cross-reference intelligence sources to provide up-to-date layers of information on the military operatives of terrorist organizations. “This is not a list of confirmed military operatives eligible to attack.”

Lt Col Peter Lerner, an IDF spokesman, dismissed the coverage as false. Responding to reports that the IDF permitted the killing of 15-20 civilians for each junior Hamas operative, he said on X: “No Hamas individual was targeted with an expected 100 civilian casualties. No Hamas individual was automatically approved for attack with an expected 15-20 casualties.”
