Opinion

Israel's use of military AI systems in the occupied territories of Palestine

By A. El Johari, LL.M.

1 Introduction

The increasing use of Artificial Intelligence (AI) in military and surveillance applications raises serious concerns under international law, particularly in conflict zones. One context where these concerns have become especially pressing is the Israeli use of AI in the occupation of Palestine. Israel has implemented AI-driven technologies in its military operations in the occupied territories of Palestine, including facial recognition1 and data-driven airstrikes.2 Following Hamas' attack on October 7, 2023, AI-powered military technologies have reportedly been deployed in the Gaza war.

This article asks whether Israel's use of AI-driven technologies in the occupied Palestinian territories (OPT) complies with international legal standards. More specifically, it examines two prominent AI systems deployed in the OPT: the Lavender system, an AI-enabled decision-support system (AI-DSS) that generates human targets for aerial bombardment, and the Blue Wolf facial recognition system, used to identify and monitor Palestinians at checkpoints and through widespread surveillance. These technologies are assessed through two legal lenses: Lavender under international humanitarian law (IHL) and Blue Wolf under international human rights law (IHRL). While IHL and IHRL often apply concurrently in situations of occupation, this article analyzes each AI system through the legal framework most directly engaged by its function. Lavender is analyzed through an IHL lens, given its role in hostilities and the conduct of attacks. Blue Wolf, in contrast, is assessed under IHRL, as it operates within the civilian sphere.

To answer this question, the article proceeds in five parts. The first part outlines a factual framework of the two prominent AI technologies deployed by Israel in the occupied Palestinian territories: the Lavender targeting system and the Blue Wolf facial recognition system. The second part outlines the legal framework on military occupation, focusing on the application of IHL and IHRL. The third part analyzes the legal implications of Lavender under IHL, focusing on the principles of distinction, proportionality, and precautions in attack, including a discussion of the legal review established in Article 36 of Additional Protocol I. The fourth part analyzes Blue Wolf under IHRL, particularly in the context of the right to privacy. Finally, the article reflects on legal gaps and concludes with some observations on the vital need for stronger international regulation of AI-enabled military technologies.

2 Israel's use of Artificial Intelligence in Palestine

2.1 Automated target generation systems

Several reports have emerged showing that Israel has been using AI systems such as Lavender and Gospel to generate targets for aerial bombardment in Gaza since late 2023. According to the Israel Defense Forces (IDF), these systems generate targets based on infrastructure 'linked to Hamas'. Gospel reportedly also produces a collateral damage score that estimates how many civilians are likely to be killed in a strike. Lavender generates individual human targets for assassination: suspected operatives of Hamas and Palestinian Islamic Jihad, including low-ranking ones, are marked as potential bombing targets. Lavender is commonly used with additional automated systems such as 'Where's Daddy?' to track targeted individuals and carry out bombings when they enter their family residences.3

The Israeli-Palestinian +972 Magazine revealed that Israeli intelligence officers spend very limited time verifying targets generated by these systems: 'I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.' In the same article, sources said that Lavender erroneously marked civilians as fighters and that there was no supervisory mechanism in place to detect such mistakes. According to B.: 'If the target gave his phone to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender.'4

2.2 Facial recognition systems and biometric surveillance

Israel has expanded its biometric surveillance of Palestinians to Gaza, a technology that was already being deployed in East Jerusalem and the West Bank. These technologies are reportedly being used to conduct mass surveillance by collecting and cataloging the faces of Palestinians without their knowledge or consent.5 In the West Bank and East Jerusalem, a facial recognition system called Blue Wolf scans Palestinians with high-resolution cameras before they are allowed to pass checkpoints. Each scanned individual is added to a database. The Israeli military can access the information collected on Palestinians and stored in the 'Wolf Pack' database through a mobile phone app.6 In Gaza, the technology is being used to identify people for hit lists.7

3 Occupation and accountability under International Law

3.1 The legal definition of an occupation

Article 42 of the 1907 Hague Regulations (HR) states that a 'territory is considered occupied when it is actually placed under the authority of the hostile army. The occupation extends only to the territory where such authority has been established and can be exercised'. The HR are an important source for determining the legal framework governing the occupation of another state by a hostile army. In general, as will be explained in this section, occupation is governed by international humanitarian law; the Hague Regulations and the Fourth Geneva Convention contain its most important provisions on occupation. Importantly, occupation is regarded as a temporary situation, and the Occupying Power cannot behave as if the people of the occupied territory were its own subjects.8

The legal framework governing an occupation is formed by IHL and IHRL, which interact with each other. This relationship is especially important where the applicability of international humanitarian law is contested, where an exception can be applied to a certain right, or where IHL does not provide sufficient remedies for violations.9 Where the two regimes overlap and an IHL rule conflicts with a human rights rule, the principle of lex specialis requires a case-by-case assessment to determine which rule applies in the specific circumstances.10

3.2 Occupation under International Humanitarian Law

IHL is also known as the law of war and applies only in situations of armed conflict. Armed conflicts are categorized as international or non-international, each with its own rules.11 However, many core rules applicable to international armed conflicts also apply to non-international armed conflicts as customary law. According to common Article 2 of the 1949 Geneva Conventions, the Conventions also apply in cases of partial or total occupation, even when the occupation meets with no armed resistance.12

There are several rules of IHL that are specifically relevant during an occupation. One of these rules, as explained above, is that the Occupying Power cannot acquire sovereignty over the occupied territory. Occupation is also regarded as a temporary situation, in which the rights of the occupant are limited to the duration of that period.13 Inhabitants do not lose their nationality and owe no allegiance to the Occupying Power.14 Protected persons may not be subjected to physical or moral coercion, and the Occupying Power may not compel them to serve in its armed or auxiliary forces.15 The Occupying Power is also responsible for the treatment of protected persons by its agents.16 Deportations, as well as transfers of the Occupying Power's own civilian population into the territory it occupies, are forbidden.17 The Occupying Power is also prohibited from imposing collective penalties.18 This non-exhaustive list of IHL rules comes mainly from the Fourth Geneva Convention and the Hague Regulations. Israel ratified the Geneva Conventions on July 6, 1951, but did not sign or ratify the 1907 Hague Conventions. The Israeli High Court has, however, found that the Regulations are part of customary international law and binding on all states.19

Israel has consistently taken the position that it is not bound by the Fourth Geneva Convention with regard to the occupation of the OPT,20 although it states that it will voluntarily abide by the Convention's 'humanitarian provisions'. The international community, however, regards the Convention as applicable to this situation, as reflected in multiple resolutions of the UN Security Council and the General Assembly and in advisory opinions of the ICJ.21 The Government of Israel has adopted the position that the status of Gaza and the West Bank is unclear because the territory had no prior legitimate sovereign.22 In Israel's view, the laws of occupation do not apply where there was no sovereign power in the occupied territory or where the displaced power was not the lawful sovereign; the applicability of the Fourth Geneva Convention is therefore, according to Israel, questionable.23

In line with this position, Israel established the Levy Committee to explore the legalization of Israeli settlements in the occupied territories. The committee reached the same conclusion as the Israeli government: that the law of occupation does not apply to the special historical and legal circumstances of the Israeli presence in Judea and Samaria.24 On this view, Palestinian territory is considered terra nullius, open to colonial conquest.25 In a deviation from this previous line, the Israeli Attorney General published a position paper in 2019 stating that Palestinian sovereignty is in 'abeyance'.26

3.3 Occupation under International Human Rights Law

IHL has very different foundations from human rights law, which is primarily designed for peacetime. Under human rights law, the right to life cannot be compromised except in exceptional circumstances, such as self-defence and lawful arrest.27 In practice, other differences between IHL and IHRL can be noted. For example, IHL primarily addresses states in international armed conflicts and non-state actors in non-international armed conflicts, whereas IHRL applies to all parties to an armed conflict. Another major difference is that IHL provisions are not supported by comparable international compliance mechanisms.28

An important distinction between IHL and IHRL lies in the availability of enforcement mechanisms for individuals. Under IHRL, individuals in many cases have locus standi: they can bring claims before international bodies such as the Human Rights Committee (HRC), the European Court of Human Rights (ECtHR), or the Inter-American Court of Human Rights (IACtHR), provided that the state concerned is a party to the relevant treaties. In contrast, IHL does not provide individuals with a direct avenue for legal complaints or redress before international courts. This difference reinforces the complementary role of IHRL in situations of occupation, as it allows individuals to seek accountability for rights violations where IHL mechanisms fall short.29

The legal consequences of an occupation under IHL apply to the territory over which the Occupying Power exercises effective control. Separately, under IHRL, the exercise of effective control over a territory or population may also establish jurisdiction, thereby triggering the extraterritorial application of human rights obligations. While both legal frameworks can apply concurrently, the criteria for their application are distinct.30

IHRL is generally broader in scope than IHL, as it applies in both peace and armed conflict. However, its extraterritorial application depends on the existence of effective control, either over territory or over individuals. In international armed conflicts, IHL applies automatically from the outset of hostilities, but the specific rules governing military occupation apply only once a state exercises effective control over foreign territory. Once effective control is established, IHRL and IHL apply concurrently and can complement each other. IHRL can help fill normative gaps, particularly in the protection of individuals who may fall outside IHL's definition of protected persons. The Geneva Conventions, for example, do not extend protection to everyone in an occupied territory; non-protected groups include nationals of the Occupying Power and nationals of co-belligerent states.

4 The legal framework governing military Artificial Intelligence

4.1 Lavender under International Humanitarian Law

Lavender functions as an AI-DSS that identifies potential human targets for airstrikes. While not a weapon in itself, its integration into targeting processes makes it subject to the IHL rules applicable to methods of warfare. Military AI-DSS are not specifically regulated in IHL treaties, but their use must comply with the existing IHL framework, and responsibility for ensuring such compliance rests with the state deploying and using these systems. Unlike autonomous weapons, AI-DSS like Lavender do not execute force directly but instead support human decision-makers. Nevertheless, they can profoundly influence targeting decisions and raise distinct legal concerns, particularly around the fundamental principles of distinction, proportionality and precautions in attack.

These rules mean the following. The operator of a military AI-DSS must be able to distinguish between military objectives and civilian objects, and between combatants and civilians. The duty to take precautions means that constant care must be taken to spare the civilian population, civilians and civilian objects. Legal assessments such as proportionality require context-sensitive, qualitative and value-laden reasoning, which AI-DSS, based on pattern recognition and correlation, are ill-suited to provide.31 In addition, an attack must be cancelled or suspended if it becomes apparent that the target is not a military objective, that it enjoys special protection, or that the attack is expected to violate the rule of proportionality.

As Woodcock notes, decision-support systems do not simply support legal reasoning: they actively shape it by filtering, organizing, and weighing inputs, often in ways that obscure legal nuance. This means that even if a human operator remains formally in control, the system can influence how proportionality or distinction is understood in practice.32 Recent reports on Israel's Lavender system additionally show that human oversight may be reduced to a mere formality: in practice, commanders were reportedly approving AI-generated targets within seconds, without meaningful review. Such practices raise serious concerns under IHL, particularly in relation to the obligation to take all feasible precautions to verify that a target is a military objective before launching an attack.33 These concerns are amplified by the operational logic behind AI-DSS like Lavender, which are designed not only for precision but also for speed. As noted in recent commentary, the 'need for speed' in targeting risks marginalizing legal checks and reducing the time available for meaningful human assessment of proportionality and precaution.34

The introduction of new means and methods of warfare to the battlefield is itself subject to legal constraints: new means and methods of warfare must be legally reviewed. Article 36 of Additional Protocol I of 1977 states, as regards legal reviews of new weapons, means and methods of warfare: 'In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.'

Article 36 does not specify how the legality of weapons, means and methods of warfare is to be determined. It does state, however, that all new weapons, means or methods of warfare should be assessed in the light of Additional Protocol I and all other applicable rules of international law. In 2023, the UK Parliament appointed a committee of experts to examine the use of AI in weapons. During these hearings, Article 36 of Additional Protocol I was discussed at length. Experts giving evidence noted that only a small number of states are able to conduct Article 36 reviews, because of the need for empirical testing.35 The international community exercises no oversight over how such reviews are conducted, and states are not obliged to share their results.36 Effective testing of AI-enabled military systems may even be impossible. Stuart Russell, Professor of Computer Science at the University of California, stated the following:

'There are difficulties in testing for discrimination, but proportionality and necessity are things that are so context-specific and dependent on aspects of the overall military situation that it would be very difficult not only to design an AI system that could make that judgment reliably, but to develop any kind of testing in the lab for those conditions. I am not sure how you would design situations that are fully representative of the kinds of situations that could occur in the field, where there are difficult judgments to make.'37

As scholars have noted, the legal review of AI-enabled systems under Article 36 of Additional Protocol I requires reinterpretation and expansion. While Lavender is not an autonomous weapon system, it plays a critical role in the 'co-production of hostilities' by generating targeting recommendations, and must therefore be subject to the same scrutiny. Copeland, Liivoja, and Sanders argue that Article 36 reviews must go beyond static assessments and be applied iteratively across the system's lifecycle, especially where machine learning may affect performance over time. They also stress that the absence of transparency and shared review standards among states significantly undermines the legitimacy of such reviews.38 In the case of Lavender, there is no indication that either its evolving decision-making process or the human-machine interaction it entails has been subject to an adequate, ongoing legal assessment in line with these recommendations.

4.2 Blue Wolf under International Human Rights Law

The use of facial recognition technology (FRT) and biometric surveillance touches upon provisions of international human rights law such as the right to privacy. Article 17 of the ICCPR provides that 'No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation', and that 'Everyone has the right to the protection of the law against such interference or attacks'. It is much debated to what extent the use of public information constitutes an invasion of privacy within the meaning of Article 17.39 In several judgments, such as Peck v. The United Kingdom, the ECtHR has found that the right to private life extends to public information.40 The ECtHR's decision in Glukhin v. Russia underscores the necessity of legal clarity and proportionality in the use of facial recognition technologies. The Court held that employing such intrusive surveillance methods against peaceful protesters, without adequate legal safeguards, violates fundamental human rights. This precedent is pertinent when evaluating systems like Blue Wolf, where similar concerns about legality, necessity, and proportionality arise.41

For an Occupying Power to use facial recognition technology and biometric surveillance, it must comply with the cumulative requirements of legality, legitimate aims, necessity and proportionality.42 Legality refers to the requirement that national laws 'must be sufficiently accessible, clear and precise so that an individual may look to the law and ascertain who is authorized to conduct data surveillance and under what circumstances'.43 An Occupying Power is generally required to respect the existing laws in the occupied territory. However, under Article 43 of the 1907 Hague Regulations and Article 64 of the Fourth Geneva Convention, it may enact new legislation when this is necessary to maintain public order and civil life or to fulfill its obligations under international law. This includes measures that may restrict rights such as privacy, but only within strict limits and for legitimate purposes, even when a new military order is issued.44

As to legitimate aims, Article 17 of the ICCPR does not specify when the right to privacy may be limited. Possible limitations can be found in other articles of the ICCPR, such as the freedom of movement in Article 12. General Comment 27 of the Human Rights Committee, which discusses this article, states that limitation is possible 'to protect national security, public order, public health or morals and the rights and freedoms of others'. The Geneva Conventions also offer the Occupying Power the ability to take 'measures of control and security'. The Occupying Power could substantiate a legitimate aim by invoking its own legitimate security concerns. However, there are reports that Israel is using its surveillance practices for political persecution.45

As to necessity and proportionality, limitations must be 'necessary in a democratic society'.46 A balance must be struck between pressing social needs and the interference with the right to privacy.47 The limitations must be appropriate to achieve their function, and the least intrusive instrument that might achieve the desired result must be chosen.48 In the case of surveillance technologies, this means that it is not enough that the measures are applied to find certain needles in a haystack.49 Mass surveillance has been assessed by several UN special procedures as constituting a 'potentially disproportionate interference with the right to privacy'.50 The scanning of people's faces in a public space, whether they are on a watchlist or not, may be considered 'indiscriminate mass surveillance'.51 Some deem live facial recognition (LFR) systems to be 'inherently disproportionate'.52

While this article primarily assesses Blue Wolf through the lens of IHRL, its use in the OPT may also raise concerns under the Fourth Geneva Convention. It is possible for an Occupying Power to use FRTs to protect the security of the local population. FRTs could also fall within the 'measures of control and security' stipulated in Article 27 of the Fourth Geneva Convention: when used at checkpoints within occupied territories, they can help identify individuals who pose security risks, and in surveillance they can help identify threats. However, the protection of dignity and humane treatment under Article 27 of the Fourth Geneva Convention may also encompass digital privacy rights.53 The pervasive use of FRTs in the OPT, particularly when applied to entire populations regardless of suspicion, risks violating this provision by subjecting protected persons to ongoing forms of public intimidation. According to the ICRC, the deployment of FRTs can lead to arrest, targeting, and ill treatment. While any tool can theoretically be misused, the concern with FRTs lies in their systemic potential to normalize and automate repression, amplifying existing patterns of abuse rather than merely reflecting isolated misuse.54

5 Conclusions: applying the framework to the Israeli use of military AI in the occupied territories of Palestine

This conclusion offers (1) a summary of the findings and (2) a reflection on legal gaps, with brief thoughts on future regulatory pathways.

5.1 Summary of the legal findings

When the legal framework surrounding AI-driven target generation systems is applied to their actual use by Israel, several implications emerge. Under IHL, the use of such systems must meet the requirements of distinction and proportionality, and the user must take precautions before an attack is carried out. Several testimonies from Israeli intelligence officers indicate that they have limited time to select targets through the automated target generation systems. One officer even shared that he had only 20 seconds per target to determine whether it was legitimate, and that this human 'oversight' had no added value. Partly because of this, according to sources, officers using Lavender made mistakes and civilians were wrongly designated as military targets.

On this basis, Israel does not appear to meet the requirements of distinction and proportionality. The use of AI-DSS like Lavender without effective human oversight is likely to violate the principle of precaution under IHL: insufficient effort is made to actually distinguish military objectives from civilians and civilian objects and to minimize civilian harm.

For an Occupying Power to use facial recognition technology and biometric surveillance, it must comply with the cumulative requirements of legality, legitimate aims, necessity and proportionality. According to reports, Israel is using surveillance technologies such as Blue Wolf for political persecution. This would not meet the criterion of 'legitimate aim', which requires that the measure serve purposes such as protecting national security, public order, public health or morals, or the rights and freedoms of others. Because Blue Wolf is being used for mass surveillance, it constitutes a potentially disproportionate interference with the right to privacy and would thus also fail to meet the criteria of necessity and proportionality.

5.2 Regulatory gaps and future regulation

I encountered several gaps in international law regarding military AI. The first I would like to point out is the lack of transparency in the development and deployment of AI-driven military technologies. There are in fact very few 'hard' rules in this area, other than the requirement that states conduct their own legal reviews, for which no common framework exists. To increase transparency, an internationally recognized standard for developing and using new methods of warfare is needed. Such a standard could also make it possible to ban high-risk military AI. Europe is the pioneer in developing AI legislation, and the recent EU AI Act categorizes different classes of high-risk AI; the same could be done for military AI systems.

AI-supported decision-making systems pose growing legal and ethical challenges when deployed in contexts of occupation and armed conflict. As this article has shown, technologies like Lavender and Blue Wolf raise serious concerns about compliance with the most fundamental protections afforded under IHL and IHRL. While these legal frameworks remain applicable, they are currently ill-equipped to handle the unique features of AI technologies, particularly in terms of speed, scale, opacity, and autonomy.

There is an urgent need to revisit and expand the regulatory scope of Article 36 reviews, to establish internationally recognized standards for meaningful human control, and to create mechanisms for external oversight and public transparency. Drawing from models such as the EU AI Act, states and international bodies should work toward categorizing high-risk AI systems, including those used in warfare, as subject to stricter legal obligations or outright prohibitions. Without such reforms, foundational legal protections risk being eroded under the guise of technological innovation and military efficiency.
