- Six unnamed Israeli intelligence officials have admitted to using an artificial intelligence (AI) tool called “Lavender” to select targets in the ongoing conflict in Gaza.
- The AI tool allegedly has a 10% error rate.
The Lavender AI Program
According to +972 Magazine and Local Call, publications jointly run by Palestinian and Israeli journalists, six anonymous members of the Israeli intelligence community reportedly confirmed the use of artificial intelligence in the military’s arsenal. The AI tool, codenamed “Lavender,” was used to select bombing targets in conflict-ridden Gaza.
CNN said it confirmed the use of the technology in a separate inquiry into the Israel Defense Forces (IDF). However, the military denied relying entirely on the AI tool, which was known to have a 10% error rate.
In its official response to the news outlet, the IDF explained that Lavender was used merely to analyze and help identify targets. It added that it makes extensive efforts to reduce collateral damage to civilians wherever possible in the circumstances of a strike. The IDF further assured that human analysts conduct their own assessments and verify the targets the program identifies, in compliance with international law and the restrictions stipulated in IDF directives.
This apparently conflicted with the initial reports, which alleged that human personnel served merely as a “rubber stamp” for the computer’s decisions. There was also a narrow window between targets: each authorization reportedly took only about 20 seconds. The +972 article, citing an unnamed IDF official, said this thin time frame usually allowed the human reviewer only to confirm that the targets were male.
The chilling admission of Lavender’s existence comes amid a global outcry over the aggressive siege of Gaza by Israeli forces. According to figures from the Gaza Ministry of Health, the death toll has reached 32,916 since the start of the conflict last year. Meanwhile, a United Nations investigation found that around 75% of the region’s population is suffering from extreme levels of hunger.
AI Regulation
The heavy reliance on AI in human conflicts like the one in Gaza raises serious questions about the ethical and legal implications of the practice. If there is one lesson to be drawn from this chilling report, it is that war is changing, which calls for a re-evaluation or amendment of outdated international rules, such as the Geneva Protocols, concerning the use of existing technologies like AI in warfare.