Israel, AI and its Matrix of Control
On the Application of AI to Enhance the Acquisition of Outgroup Male Targets in Occupied Palestinian Territories
By Kinan Imseis, University of Toronto
Research Fellow, Clay-Gilmore Institute for Philosophy, Technology, and Counterinsurgency
It is both historically and presently evident that the use and testing of weaponry is often done at the expense of nonwhite populations. The scale may vary, but the directive remains the same: nonwhite populations are not treated with the same level of dignity, humanity or privacy that is often granted to Western, predominantly white countries. No better contemporary case exists than Israel and the Occupied Palestinian Territories (O.P.T.). A staple of the Israeli occupying state has long been its military capacity; few industries in Israel’s economy have been in such consistently high demand. This demand has allowed Israel to become a global leader within the broader military-industrial complex and, by extension, to exercise a level of control over the way war is waged globally. Jeff Halper, in his 2015 book “War Against the People”, describes this as the “global matrix of control”, a play on his earlier notion of an Israeli “matrix of control” over Palestinians. On this “matrix” he writes:
“The Occupied Palestinian Territory has been transformed into probably the most monitored, controlled and militarized place on earth. It epitomizes the dream of every general, security expert and police officer to be able to exercise total biopolitical control. In a situation where the local population enjoys no effective legal protections or privacy, they and their lands become a laboratory where the latest technologies of surveillance, control and suppression are perfected and showcased, giving Israel an edge in the highly competitive global market.” (Halper, 2015).
Halper later takes this example of the “laboratory of Palestine” and the level of authority Israel enjoys within the O.P.T. and expands it onto the world. These techniques and technologies are used in the O.P.T., stamped with Israeli seals of approval like “Tested in Gaza” or “Approved by the IDF”, and exported into the global market, where they become embedded into a variety of security and warfare systems. This dynamic ensures that Israeli military technology is a core element of security campaigns across the globe, whether military or municipal. It is characteristic of a global shift towards securitization as the primary objective, in which every facet of people’s lives is hyper-scrutinized to ensure “maximum security.” In a world saturated with technology and surveillance, Israel sits at the heart of the operation, with its backyard laboratory that is the O.P.T. (Halper, 2015). The cycle is self-sustaining: Israel has no impetus to quell resistance, as the conflict continually generates economic development and global authority. Halper describes this further: “The fact that Israel has failed miserably in both combating terror and resolving its conflicts does not seem to tarnish its reputation as a leading authority on the War on Terror.” Israel is able to turn this cycle of bloodshed and endless conquest into a market advantage, becoming a global byword for “cutting edge” technological development, all at the expense of people forced to live in what has been referred to as the “world’s largest open-air prison.”
This phenomenon is nowhere better seen than in Israel’s new AI surveillance and targeting system dubbed “Lavender”. Lavender, as described by +972 Magazine (an online non-profit publication focused on on-the-ground reporting from Palestine-Israel), is tasked with marking anyone it suspects of being a member of HAMAS or the Palestinian Islamic Jihad (P.I.J.). The software is designed to seek out targets of all ranks, and these people are evaluated as potential bombing targets, often with an assigned number of permitted civilian casualties (Abraham, 2024). An incredibly important qualification, however, is that this is not a list of confirmed militants or operatives. Lavender cross-references intelligence sources and, in doing so, generates a list of who could be a potential target (McKernan & Davies, 2024). In the early weeks of the war, the Israeli army had authorized strikes on even alleged junior members of HAMAS or the P.I.J. with an allowance of 15–20 civilian casualties per target. In a number of cases, upwards of 100 civilians were murdered for the goal of assassinating a single commander (Abraham, 2024). One of Lavender’s secondary elements is a program called “Where’s Daddy?”. This software is designed to identify where targets live and set their private homes as a marker, generally the place where they are most vulnerable. It draws on Lavender’s data and sends alerts in real time for when targets are home and bombings would be most opportune. “It’s much easier to bomb a family’s home. The system is built to look for them in these situations.” (McKernan & Davies, 2024). Where’s Daddy? tracks the tens of thousands of individuals deemed targets by Lavender, so that Israel can simultaneously link every target to their residence, creating a web of these hypothetically valid targets across Gaza and the West Bank (Abraham, 2024).
The bombings are generally carried out with what Israel refers to as “dumb bombs”, a term denoting unguided munitions. The devastation from these bombings was trackable in real time during the first 45 days of the war, and families were often the victims of these reckless and negligent attacks by virtue of the policy allowing a certain number of casualties per target. OCHA estimates that of the 11,078 reported fatalities, 6,120 came from a group of the same 1,340 families (OCHA, Day 45, 2023). These software programs create and foster an often imprecise and indiscriminate method of warfare, one that neither knows bounds nor cares for them. Israeli military and government officials are acutely aware of this; the following is an Israeli intelligence officer describing his experience using Lavender: “The machine did it coldly. And that made it easier.” (McKernan & Davies, 2024).
These weapons are not new; Israel has a history of testing both AI-driven and traditional kinetic weaponry in the O.P.T. For instance, Israel has long deployed AI-powered, robot-mounted guns in the West Bank that can track people and fire on the command of the Israeli soldiers manning them, potentially hundreds of kilometres away. The machines are omnipresent, often placed at high vantage points. Israel claims they are used to ensure precision, but they are typically used to quell acts of defiance (Min, 2022), and what an Israeli soldier sees as defiance is entirely subjective. Additionally, Israel has long incorporated facial recognition scans into its checkpoints, located all over the O.P.T. and even within Israel proper. The biometric data is retained and categorized as a means of tracking the movements of Palestinians, primarily throughout Gaza. This data is used in tandem with Lavender, which frequently produces false positives when determining who may or may not be connected to HAMAS or the P.I.J. It bears repeating that this software cannot determine whether the people scanned are combatants or non-combatants, which has given way to further civilian deaths, arrests and detainments even after the target was “positively” identified by Lavender (Adamson, 2025). These systems, paired with the cameras, checkpoints and soldiers stationed quite literally everywhere in the O.P.T., create a panopticon-style prison in which Palestinians are guinea pigs, subject to the whims of every military official or company in need of test subjects for new military technology, in effect stripped of any real right to privacy or agency.
They are systematically stripped of their dignity and humanity, seen only as a means to an end; if they show resistance, they are labelled terrorists, arrested on whatever manner of “charges” Israel can claim, and held without trial or due process, never informed of what they did wrong (Mandraud, 2025).
In the wake of this technology, there are far more questions left than answers. Broadly speaking, the emerging AI industry is poorly regulated and has caused more harm than good. Even at the civilian level, AI and the datacenters needed to run it consume fresh water at rates that compound projections that half the world’s population will face scarce clean drinking water by 2030 (Gordon, 2024). When the military sector is accounted for, the problem balloons out of control. International law did not anticipate technological developments this drastic, leaving civilians and combatants alike without answers. Much like chemical warfare in the First World War, the use of AI cyberweaponry is becoming increasingly difficult to navigate in terms of its legality and its application in military or security contexts. In its currently unregulated position, the military use of AI has gone without restriction, often leaving those in its crosshairs helpless, combatant or not. The Palestinian struggle is emblematic of this problem. Sarah Harrison, a former lawyer for the United States Department of Defense, told The Guardian that the application of Lavender in threat detection, target designation and the determination of “permissible” civilian casualties is not uniform: proportionality fluctuates, and the choices Israel makes demand, at minimum, far more caution in determining “justifiable” civilian casualties (McKernan & Davies, 2024). The international community has, over the last century, consistently let Palestinians suffer. From 1917, 1948, 1967, 1987, 2000, 2014, 2022 and now 2023, there has been no respite for Palestinians, no moment of peace.
The end of the Second World War marked a shift: the world ostensibly agreed to put an end to such undue suffering, creating international agencies like the United Nations and establishing international law as a guiding mechanism and set of principles. Now, Palestinians are left to fend for themselves in what has been referred to as the “world’s largest open-air prison”, trapped while Israel continues to experiment on them. At the moment the international community is needed most, no one seems able to address one of the greatest humanitarian abuses of the post-war era. Genocide, the “crime of crimes” as it is referred to, is being carried out with the help of those who purport themselves bastions of liberal-democratic values, those who hold in high regard the sanctity of human rights and institutions like the United Nations (Amnesty International, 2024). Yet still, in all of this, there is for Palestinians another set of human rights, another conception of dignity and, fundamentally, another meaning of what it means to be human.
References
Abraham, Y. (2024, April 3). “Lavender”: the AI Machine Directing Israel’s Bombing Spree in Gaza. +972 Magazine. https://www.972mag.com/lavender-ai-israeli-army-gaza/
Adamson, L. (2025, October 24). Military Use of Biometrics Series – Israel’s Use of AI-DSS and Facial Recognition Technology: The Erosion of Civilian Protection in Gaza. Lieber Institute West Point. https://lieber.westpoint.edu/israels-use-ai-dss-facial-recognition-technology-erosion-civilian-protection-gaza/
Amnesty International. (2024, December 5). Amnesty International Concludes Israel Is Committing Genocide against Palestinians in Gaza. https://www.amnesty.org/en/latest/news/2024/12/amnesty-international-concludes-israel-is-committing-genocide-against-palestinians-in-gaza/
Gordon, C. (2024, February 25). AI Is Accelerating the Loss of Our Scarcest Natural Resource: Water. Forbes. https://www.forbes.com/sites/cindygordon/2024/02/25/ai-is-accelerating-the-loss-of-our-scarcest-natural-resource-water/
Halper, J. (2015). War Against the People: Israel, the Palestinians and Global Pacification. Pluto Press.
Mandraud, I. (2025, October 10). In Israeli prisons, hunger and torture are ever-present. Le Monde. https://www.lemonde.fr/en/international/article/2025/10/10/in-israeli-prisons-hunger-and-torture-are-ever-present_6746283_4.html
McKernan, B., & Davies, H. (2024, April 3). “The machine did it coldly”: Israel used AI to identify 37,000 Hamas targets. The Guardian. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes
Min, R. (2022, October 17). AI-powered guns being deployed by the Israeli army in the West Bank. Euronews. https://www.euronews.com/next/2022/10/17/israel-deploys-ai-powered-robot-guns-that-can-track-targets-in-the-west-bank
United Nations Office for the Coordination of Humanitarian Affairs – occupied Palestinian territory. (2023, November 20). Hostilities in the Gaza Strip and Israel – reported impact | Day 45. https://www.ochaopt.org/content/hostilities-gaza-strip-and-israel-reported-impact-day-45
