Israel-Palestine war: How the AI 'Habsora' system masks random killing with maths
Israel's war on Gaza has seen it strike the Palestinian enclave with new and deadly ferocity. That onslaught, according to a recent report, is being powered by an artificial intelligence system that experts warn is indiscriminate and inherently faulty.
In a joint investigation, Israeli outlets +972 Magazine and Local Call interviewed several former and current Israeli intelligence officials, revealing that the army had lowered its previous standards for limiting harm to civilians.
Loosened rules were combined with the use of "Habsora" ("The Gospel" in Hebrew), an AI system that can generate targets at faster rates than before, facilitating what one former intelligence officer called a "mass assassination factory".
Officials admitted to the outlets that the homes of lower-ranked members of Hamas and other Palestinian armed factions were purposefully targeted, even if it meant killing everyone in the building.
One case saw Israeli army intelligence approve the killing of hundreds of Palestinians in order to assassinate a single Hamas member.
"This is the first time that they're talking about how civilians are being targeted at scale just for hitting one military target based on AI technology," Anwar Mhajne, professor of political science at Stonehill College, in Massachusetts, told Middle East Eye.
'Not feasible at all'
When discussing the Habsora system, one source told the Israeli outlets that the focus is on quantity, not quality. They added that while a human eye reviews the targets before each attack, little time is spent on that review.
Mhajne said: "If you're waging war at a scale like the one you're waging right now on Gaza, how much can you review?"
An Israeli expert in military use of AI, who spoke to MEE on condition of anonymity, said that having a human review every AI-generated target in Gaza is "not feasible at all".
He added that the algorithm does not explain how it reaches its conclusions, making it difficult to check the validity of a strike's outcome.
With Israel estimating that Gaza holds about 30,000 Hamas members, experts worry about the mass civilian casualties that could result from relying on such systems.
The Israeli military reportedly believes it has killed 1,000-2,000 Hamas members in Gaza since 7 October. More than 15,000 Palestinians have been killed in that time, including at least 6,150 children.
"We're speaking about thousands of civilians who have been killed [due to the use] of such technology," Mona Shtaya, non-resident fellow at the Washington-based Tahrir Institute for Middle East Policy, told MEE.
'A bigger surveillance system'
According to Shtaya, Israel's use of AI as a military and surveillance tool is not new, nor is it unexpected.
"AI is part of a bigger surveillance system, where Palestinians are living under constant surveillance," she said.
In 2021, a Washington Post investigation revealed that Israeli soldiers used an extensive facial recognition programme to enhance their surveillance of Palestinians in the occupied West Bank city of Hebron. The army also set up face-scanning cameras across the city "to help soldiers at checkpoints identify Palestinians even before they present their ID cards".
That same year, Amazon Web Services and Google signed a $1.2bn deal with the Israeli government known as Project Nimbus. Employees at both companies warned that this cloud service "allows for further surveillance of and unlawful data collection on Palestinians, and facilitates expansion of Israel's illegal settlements on Palestinian land".
Israel also reportedly used AI during its previous major offensive in Gaza in 2021, in what it called "the world's first AI war". During that 11-day battle, drones reportedly killed civilians, damaged schools and medical clinics, and levelled high-rise buildings.
Now, more developed systems are being employed in the war in Gaza, going as far as predicting the number of civilian casualties a strike would cause.
"Nothing happens by accident," one source told +972 Magazine and Local Call. "When a three-year-old girl is killed in a home in Gaza, it's because someone in the army decided it wasn't a big deal for her to be killed - that it was a price worth paying in order to hit [another] target. We are not Hamas. These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home."
'They have a testing ground'
The present war started when Hamas-led Palestinian factions launched an assault on Israel, killing more than 1,200 Israelis and taking around 240 people captive. Israel responded by heavily bombing the Gaza Strip and invading the coastal enclave, destroying much of the civilian infrastructure in the process.
Sources in the investigation said they believe the widespread killing and destruction could be used to give the Israeli public an image of victory. Mhajne believes this goal extends to the image of Israeli technology as well.
"The Hamas attacks showed the weaknesses of AI when it comes to surveillance," she said.
According to her, Hamas's ability to break into Israel unnoticed after its fighters dismantled the signal towers around the Gaza Strip caused severe reputational damage.
Israeli spyware technology has notably been used in many countries to target journalists and activists.
Israel is also the world's 10th largest weapons exporter, with a particularly strong reputation for cybersecurity and AI weaponry.
"They test stuff on Palestinians. That is why Israel is leading when it comes to the development of cybersecurity and AI, because they have a testing ground," Mhajne said.
"Nobody is talking to them about how they're developing it and how they're testing it. I guarantee you that this technology, after the war, is going to be sold to every repressive regime that you know."
Shtaya agreed, saying AI warfare technology such as Habsora is "just used to impress, and make their job easier in destroying the Gaza Strip".
While this system remains strictly in the Israeli military's hands at present, the Israeli expert believes that will change.
"In the future, people who work there will go out to the private sector and make similar things and export them, for sure," he said, claiming Israeli arms sales have already skyrocketed. "This war already is great for the Israeli arms dealers and exports."
'There is no limit'
While many are calling for Israel to be held accountable for its actions in Gaza, with UN bodies warning they could lead to accusations of war crimes and genocide, holding it accountable for its use of AI may be more complicated.
While some governments and international organisations address the military use of AI by saying it should remain within the boundaries of international law, there are few to no AI-specific regulations relating to warfare.
Additionally, Israel so far shows no sign of regulating its use of this new technology, even if it means killing more civilians.
"Because Israel views Hamas now as an existential threat, there is no limit," the Israeli expert told MEE, suggesting it may go as far as killing Israeli captives if it means reaching Hamas's top commanders.
"AI for sure is giving the army an illusion of mathematical precision and analyses, which is false," he said. "All the human flaws that the algorithm learned from are automatic there."
The International Committee of the Red Cross believes AI can be a tool that enables better decisions in conflicts and helps to avoid civilian casualties. Shtaya also believes these technological advancements, when used correctly, can generally improve people's quality of life.
"It is painful and devastating to see this kind of technology used by the state to oppress people and make their lives harder, just to have this collective punishment," she said.
This article is available in French on Middle East Eye French edition.