Sputnik International

Israel Uses Military AI in Gaza: Tool of Genocide or 'Simply a Database'?

© AFP 2023 / A man walks past the rubble of a destroyed building in Khan Yunis
09.04.2024
Tel Aviv's special services and the army have for several months been letting an AI-based system decide the life and death of tens of thousands of people, according to a recent investigation based on interviews with Israeli military personnel.
The victims were Gazans whom “the machine” suspected of being jihadists, or their relatives. Those whom the AI program, codenamed Lavender, rated suspicious enough were blown up together with the buildings they were in – with huge human “collateral damage.”

“As we see, the Lavender AI program or some other targeting program the Israelis are using – its purpose is not to destroy enemies – the terrorists, Hamas operatives, etc. Instead, it has been designed to destroy blocks of real estate… In the worst case, it is a bloodthirsty confirmation of the worst kind of secret genocide. The Israeli government is trying to erase the Palestinian people and all memory of Gaza like chalk drawings from the blackboard,” former US State Department analyst Scott Bennett told Sputnik.

In several interviews with international media, journalist Yuval Abraham confirmed the worst suspicions about the program.

“What Lavender does is scan the information on the people inside a given building in Gaza. The artificial intelligence, i.e. a machine, gives every individual a rating (on a scale from 1 to 100) on the likelihood that this individual is a member of Hamas or the military wing of Islamic Jihad group,” he explained in an interview with Manhattan-based alternative news program Democracy Now.

“In the next step, the machine ordered the destruction of the building with people inside it, including children. There was minimal human supervision of the AI’s actions – one of my sources said he spent just 20 seconds before authorizing the bombing,” Abraham added.
While his investigation cited six high-ranking Israeli officers acquainted with the use of AI, the Israel Defense Forces (IDF) has yet to address the accusations directly. An IDF statement did not, however, deny the use of AI in the identification of Israel’s human targets.
“The system your questions refer to is not a system, but simply a database whose purpose is to cross-reference intelligence sources in order to produce up-to-date layers of information on the military operatives of terrorist organizations. This is not a list of confirmed military operatives eligible to attack,” the IDF said in a statement published in response to questions from the British daily The Guardian.
Reham Owda, an independent political analyst from Gaza, shared her opinion with Sputnik: “This means that Israel chose the easiest and the cheapest way to target Hamas militants in Gaza without considering the lives of innocent civilians, including women and children. This means that Israel deals with Gaza civilians as if they were numbers or just items that can be removed randomly.”

How Many People Have Been Killed by AI in Gaza?

In his interview with Democracy Now, Abraham revealed the grim statistics of Lavender’s operations:
“The military knew that approximately 10 percent of the people whom the machine marked to be killed – one-tenth of them – were not Hamas militants,” Abraham said in the interview.
“Some of them had a loose connection to Hamas, some had completely no connection. One source told me how the machine marked people who had the same name and nickname as another person, a Hamas operative… The military had what they called a collateral damage degree. Another source said that up to 20 innocent civilians could be killed in order to liquidate one low-ranking Hamas activist. For high-ranking officers, say, a Hamas brigade commander, the number could be in triple digits – for the first time in Israeli Defense Force’s history.”

“According to Abraham’s reporting, Israeli officials did not want to ‘waste’ more expensive precision-guided munitions on the many junior-level Hamas operatives identified by the Lavender program,” The Washington Post wrote. So “dumb” bombs were used instead, which the WP describes as “heavy, unguided weapons that inflicted significant loss of civilian life.”

The following question is inevitable: who is going to be held responsible for the use of an Israeli AI machine that, according to the officers interviewed by Abraham, marked 37,000 people in Gaza as targets for annihilation?
“Essentially, it is not the machine, but the human being that imagines, designs, creates and then triggers such a machine into action – that person is responsible for everything that results,” Bennett noted.
Meanwhile, Owda told Sputnik that ultimate responsibility lies with Israel’s special services.

“The Military Intelligence Directorate (Aman) and Israel’s Internal Security Service (Shabak) are the two security agencies responsible for the strikes made with the use of AI, because their members are responsible for the data loaded into the [IDF’s] computer system… And that system included a classification of each Palestinian by his or her age, education, profession, etc.”

The matter is serious enough that UN Secretary-General Antonio Guterres issued a special statement expressing “serious concern” over reports that Israel used AI to identify targets in Gaza, at least in the first weeks of its punitive operation there.
"No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms," Guterres is quoted by international news agencies as saying.