Reports & Researches

AI-Plotted Genocide: How Corporations Facilitate Israel's AI-Enabled War on Gaza

Artificial intelligence and emerging technologies have shaped warfare since their earliest development. The use of artificial intelligence in armed conflict and in battlefield decision-making raises alarming questions about its humanitarian consequences and about adherence to basic principles of international law.

The use of artificial intelligence in the current war on Gaza concerns experts, with reports indicating that "untested" and "undisclosed" technology is being employed.

In the following report, we delve into the dilemma of Israel's use of artificial intelligence in the current war on Gaza, how such technology has facilitated the ongoing genocide, and the legal liability of corporations in enabling the means for international crimes.

Warfare Technology: the Gospel, Lavender, and Where's Daddy?

The reliance on AI and automation in warfare is not new. Israel claims that its first "AI war" was the 11-day war on Gaza in May 2021.

Since the Israeli military launched its large-scale war on the Gaza Strip in October 2023, signs of the use of artificial intelligence and advanced technology have been clear.

The use of AI systems in the current war has accelerated the speed and scale of targeting of what is claimed to be "Gazan military leadership and premises." Artificial intelligence tools allow faster tracking of targets and faster damage estimation, while constituting a major breach of basic IHL principles as a form of indiscriminate mass assassination machine.

AI systems acknowledged by Israel include the Gospel, Lavender, and Where's Daddy?


The Gospel

The Gospel, or "Habsora" in Hebrew, is an AI system that uses machine learning to interpret vast amounts of data and generate potential targets for military action by the IDF.

Reports point out the terrifying fact that the Gospel unit's focus is concentrated on the quantitative scale of targets rather than their qualitative aspect. It was in fact described by a former intelligence officer as a "mass assassination factory."

The investigation by +972 Magazine confirmed, through five different sources, that the number of civilians likely to be killed in attacks on private residences is known in advance to Israeli intelligence under the category of "collateral damage."


Lavender

The Lavender system is designed to mark suspected operatives in Hamas and Palestinian Islamic Jihad (PIJ). In the first few weeks of the war, the machine identified around 37,000 Palestinians, and their homes, as "suspects" who could be targeted in air strikes even if family members were at home.

According to the investigation carried out by +972 Magazine and Local Call, intelligence officers pointed out that "Lavender has played a key role in the unprecedented bombing," explaining the massive civilian death toll.

Reports indicate that during the early stages of the war, the IDF gave sweeping approval for officers to adopt Lavender's kill lists, without requiring them to thoroughly check the reasons behind these choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a "rubber stamp" for the machine's decisions, adding that they would normally devote only about "20 seconds" to each target before authorizing a bombing, just to make sure the Lavender-marked target was a male. This was the case even though the system's "errors" amount to approximately 10 percent of generated cases, and it evidently marks some individuals who have merely a loose connection to militant groups, or no connection at all.
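
To illustrate the scale these reported figures imply, a simple back-of-the-envelope calculation (purely illustrative, using only the numbers reported by +972 Magazine) shows how many marked individuals the system's own error rate suggests were wrongly flagged:

```python
# Illustrative arithmetic only, using the figures reported by +972 Magazine:
# ~37,000 people marked by Lavender, with a reported ~10 percent error rate.
marked_individuals = 37_000
error_rate = 0.10  # reported share of marked people with loose or no militant ties

wrongly_flagged = int(marked_individuals * error_rate)
print(wrongly_flagged)  # → 3700 people potentially misidentified
```

In other words, the reported figures imply that thousands of the people placed on kill lists, reviewed for roughly "20 seconds" each, may have had no meaningful connection to militant groups at all.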


Where’s Daddy?

This automated system is used specifically to track the targeted individuals and carry out bombings when they have entered their family’s residences.

Sources indicated that each time the pace of assassinations waned, more targets were added to systems like Where's Daddy? to locate individuals who had entered their homes and could therefore be bombed. The decision of whom to put into the tracking systems could be made by relatively low-ranking officers in the military hierarchy.


How do these systems work?

Targets are created using "probabilistic inference," a method of machine learning. The reliability of the generated output depends on the quality and scale of the input data the system processes. These systems look for patterns in order to generate predictions and suggestions about the likelihood of events or a person's status.

The error lies in the blind method itself: a civilian is easily mistaken for a combatant on the basis of statistical likelihood and shared characteristics.
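
That failure mode can be sketched in a few lines. The toy scorer below is purely hypothetical (the actual military systems are undisclosed; the feature names, weights, and threshold here are invented for illustration) and shows how pattern-based probabilistic scoring can flag a civilian who merely shares surface-level behavior with known suspects:

```python
# Purely hypothetical toy model: the feature names, weights, and threshold
# are invented for illustration and bear no relation to any real system.
WEIGHTS = {
    "changes_phone_often": 0.40,
    "in_group_chat_with_suspect": 0.35,
    "frequent_address_changes": 0.25,
}
THRESHOLD = 0.5  # score above which the toy model flags a person

def suspicion_score(features):
    # Sum the weights of whichever listed features a person happens to share.
    return sum(WEIGHTS[f] for f in features if f in WEIGHTS)

# A civilian displaced by war plausibly shares two of these behavior patterns.
civilian = ["changes_phone_often", "frequent_address_changes"]
print(suspicion_score(civilian) >= THRESHOLD)  # → True: flagged as a "suspect"
```

Nothing in such a score distinguishes a militant from a displaced civilian; only the statistical overlap of behavior patterns is measured, which is precisely the blindness described above.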

Another system, known as the "Fire Factory," is deployed to organize and schedule raids and attacks on military-approved targets.

Minimal due diligence in avoiding civilian casualties and damage to civilian structures, combined with the will to cause severe damage, has transformed the Gaza Strip into a massive graveyard.

The deployment of sophisticated artificial intelligence systems and automation accounts for the massive death toll among civilians and the destruction of civilian targets in the Gaza Strip.

The Target Administration Division, formed in 2019, states clearly on its website that it uses the Gospel AI system to speed up the target-determination process.


Mass targeting, carried out simultaneously and within a short timeframe, serves to deepen the shock effect among civilians and to generate pressure on armed resistance groups in Gaza.

The proportionality principle, codified in Article 51(5)(b) of Additional Protocol I to the Geneva Conventions, is called into question. Israel claims that it respects this principle, but this again depends on the vague and general definition that proportionality has in international law and in international practice.

The sheer volume of targets that these advancements make possible renders meaningful human oversight almost impossible.

The use of such technology hands life-or-death decisions to inhuman machines. The amount of data and information that AI systems analyze at tremendous speed would require thousands of hours and hundreds of human analysts to process manually.

This takes us back to the original dilemma: Gaza is under Israeli occupation, and Israel accesses and controls the flow of information across and beyond the Gaza Strip. Israel controls Gaza's airspace, electromagnetic spectrum, communications, and all of its territorial borders, even though, until the start of the land invasion, it had no military presence inside Gaza itself.

Israel's unhindered access to data and information leaves the Gaza Strip fully "exposed" and potentially accessible for automated target screening.


Project Nimbus

Project Nimbus is a $1.2 billion joint contract between Google, Amazon, and the Israeli government. The deal, signed in 2021, provides the Israeli government and its military with cloud computing infrastructure, artificial intelligence, and other technological services.

A 2021 report by The Intercept highlighted that Google is offering advanced AI capabilities to Israel, enabling it to harvest data for facial recognition, video analysis, sentiment analysis, and object tracking as part of Project Nimbus.

A report published by Brown University and authored by Professor Roberto J. González noted the involvement of other tech companies with Israel, including the publicly traded US company Palantir Technologies.

Google employees have been organizing protests and sit-ins led by No Tech for Apartheid against Project Nimbus since 2021.

Amid disturbing reports of reliance on AI in the current war on Gaza, and with Israel facing a genocide case before the ICJ, the opposition of Google employees has re-escalated.


AI as a tool of genocide

Artificial intelligence unleashed new military action scopes, and the evidence is clear in the current war on Gaza.

By the time of drafting this report, the death toll in Gaza had exceeded 37,000. The head of the UN's Mine Action Programme, Mungo Birch, indicated that the number of unexploded missiles and bombs lying under the rubble was "unprecedented" since World War II, and that Gaza was now the site of about 37 million tons of rubble (more than was generated across all of Ukraine during Russia's war) as well as 800,000 tons of asbestos and other contaminants.

A report by the Wall Street Journal estimated that on average, Israel hit every square kilometer of Gaza with 79 bombs, munitions or shells.
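
To put that density in context, a rough calculation (assuming the commonly cited area of the Gaza Strip of about 365 square kilometers; the per-kilometer figure is the Wall Street Journal's) yields the implied total:

```python
# Rough, illustrative arithmetic. The ~365 km² area is the commonly cited
# figure for the Gaza Strip; 79 munitions/km² is the WSJ's reported average.
area_km2 = 365
munitions_per_km2 = 79

total_munitions = area_km2 * munitions_per_km2
print(total_munitions)  # → 28835 bombs, munitions, or shells in total
```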

The scale and scope of operations and the intensity of the military operation is alarming.

UNOCHA estimated that the death rate in Gaza exceeded 250 people per day, based on the number of civilian deaths as of January.


Legal Liability of Corporations in enabling the means for international crimes

Corporations, as legal persons, should be held accountable for committing grave violations, including the crime of genocide.

The role of corporations in genocide is established through their criminal liability as enablers of criminal conduct. Indeed, corporations could be convicted as active participants in human rights violations and war crimes if proven to have engaged directly in the conduct.

The liability of corporations for actions constituting acts of genocide under international law derives from the Convention on the Prevention and Punishment of the Crime of Genocide of 1948. Although some scholars debate whether corporations are subject to the crime of genocide, we refer to Article 4 of the Convention: "Persons committing genocide or any of the other acts enumerated in article III shall be punished, whether they are constitutionally responsible rulers, public officials or private individuals."

The term "persons" was legally understood during the drafting of the Convention to refer to both natural and moral/artificial persons, which as a general rule includes corporations.[1]

The liability of corporations presumes a standard of knowledge. In the absence of command responsibility in a corporate setup, awareness of the criminal attributes and purposes of manufactured and retailed products is essential to determining the criminal responsibility, or accountability, of corporations.

Although corporations are subject to the provisions of international law, and specifically to the Convention, the challenge for prosecutors remains proving the specific intent to commit genocide. Participation in genocide can constitute complicity through knowingly aiding and procuring means that contribute to international crimes.

Despite the severity of the crime of genocide and the substantial responsibility corporate actors might bear in facilitating and enabling its commission, at the time of writing this report there is no tangible way for companies to be investigated for genocide by an international tribunal.

Genocide as an international crime is prosecutable before the ICC and special tribunals, including the ICTY, the ECCC, and the ICTR. However, the statutes of these courts do not vest them with jurisdiction to prosecute legal persons.

Although international criminal jurisdiction to prosecute corporate entities may be lacking, this shall not free companies from their legal obligations.

The initial draft of the Rome Statute included "legal persons, with the exception of states," but this language did not make it into the final statute.

The ICJ, on the other hand, has no jurisdiction over criminal cases and cannot hear claims against actors other than states.

Available judicial measures can therefore be realized only through domestic prosecution.

Both the Gospel and Lavender were developed by the IDF signals intelligence branch known as "Unit 8200."

Other outsourced technology could also be subject to liability. Large deals like Project Nimbus raise alarms about the criminal liability of major corporations like Google and Amazon.


Accessible Means of Support

In the absence of clear and accessible international judicial tracks to hold corporations accountable for facilitating and enabling genocide, reliance on other means of advocacy becomes necessary.

Although addressing corporate liability through competent tribunals is not currently possible, recourse can be had to advocacy efforts that pressure corporations and influence their policy and economic behavior.

Internal employee movements that influence leadership decisions, push for dropping concerning contracts, and call for divestment are essential. Staff awareness and collective action against deals that raise substantial human rights concerns give corporate management reasonable grounds to reconsider contracts and withdraw investments. Employee dissatisfaction and unproductive work environments tend to generate greater monetary and strategic losses for corporations than any profits these deals might promise.

External organized advocacy that calls for divestment, boycotts products, and damages the public image of these private actors also promises considerable impact.

As long as corporations are left unbothered, and in the absence of meaningful legal interventions, nothing can deter private actors from entering into relationships that constitute grounds for grave human rights violations.

United efforts that shed light on the enabling factors of these crimes, and on the involvement of corporate entities in their material element, are the starting point for ending this participation.

Other measures could be established through collective national lobbying campaigns that pressure national governments to adopt adequate international legal provisions, establish national corporate liability for enabling genocide, impose sanctions, and prosecute corporations before their national courts.




Huge tech companies, although not directly engaged in the actual criminal act, play a significant role as enablers of the commission of such crimes.

Corporate liability is intensified where there is clear criminal intent. Willingly providing systems and software to a legally recognized apartheid state that stands accused of severe crimes, while knowingly aware of the disturbing use of such technology in warfare, is sufficient ground to address corporate liability.

Holding corporations accountable for international crimes should not be a matter of debate. The real obstacle remains how the absence of active judicial measures diminishes the sense of corporate responsibility.

Refraining from severe international crimes should not rely only on companies' fear of damaging their public image, but rather on strict legal provisions.



We, at the Palestinian Association for Human Rights (Witness), condemn the unlawful killing of civilians and the destruction of the means of life in Gaza. In this regard, we warn of the devastating impacts of the use of automation and artificial intelligence in warfare.

We stand firm against active and passive participation in the ongoing genocide imposed on the people of Gaza, and we condemn all actions that enable the massive destruction of civilian lives and civilian populations, whether by states or non-state entities.

With that being said, we call on all concerned corporations and private actors to withhold their contributions and their supply of technology and services to the Israeli government that are deployed and used as mass killing machines. Divestment is necessary to ensure the genocide goes no further, as is an end to the military and intelligence supply, so as to ensure compliance with basic human rights and international humanitarian law principles.

26 June 2024

The Palestinian Association for Human Rights (Witness)

[1] During the drafting of the Genocide Convention (1946-1948), and in the absence of a contradicting text, the definition of "persons" in principle recognizes natural persons as well as artificial persons, including corporations, as subjects of international law.