Are Israeli maps and AI really not saving Palestinian lives? | Israel-Palestine conflict


News Team

The Israeli army’s Arabic-language spokesperson, Avichay Adraee, recently posted a map of Gaza, divided into numbered blocks, with instructions for Palestinians to evacuate to Rafah. Leaflets containing a QR code linking to the map on the Israeli army’s website were also dropped over Gaza. At the same time, Israeli fighter jets bombarded the south of the Strip, killing hundreds of Palestinians in 24 hours. The Israeli army proudly announced that it had hit “400 targets”. Media reports revealed that the Israeli army’s ability to intensify what it calls “precision” air strikes has been boosted by an artificial intelligence (AI) tool that generates “targets”.

The maps, leaflets, tweets, and claims of “precision” military technology all contribute to the narrative that Israel’s “most moral army” is taking care to protect civilians in Gaza. However, these are no more than a propaganda ploy to cover up what is really happening on the ground – an AI-assisted genocide.

Over the past two months of brutal war, Israel has constantly resorted to the use of “evacuation” maps and warnings issued on social media, calling on Palestinians to flee certain areas of Gaza. Yet the mounting death toll offers no evidence that Israel is concerned about the wellbeing of Palestinian civilians. What it is concerned about is growing condemnation abroad of what legal experts are calling genocide, and increasing pressure from the United States.

The “evacuation messaging” the Israeli army has been undertaking is more directed at Western audiences, seeking to assuage their fears about the civilian death toll, than the Palestinians in Gaza. The fact that it is delivered mostly on social media platforms indicates the intended audience is not the people in the Strip. The Israeli army has not only cut off electricity to Gaza but also targeted and damaged its already temperamental mobile network, thus leaving most of the people there with almost no access to the internet.

Discrepancies between the different maps shared by Israeli officials have also caused additional confusion. Areas marked for targeting in orange did not even correspond to the block numbers officials were telling people to evacuate from. Consequently, the overall impact of the maps has been to create “fear, panic and confusion”.

Gaza is 360 square kilometres and has a population of 2.3 million. The average size of each of the 620 blocks on the map is 0.58 square kilometres, which works out to approximately 3,700 residents per block. Ordering dozens of blocks, equating to tens of thousands of people, to move is hardly “precision”. It is mass displacement masquerading as careful precaution.
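The per-block figures above follow directly from the stated totals; a quick back-of-the-envelope check (using only the numbers given in this article) confirms them:

```python
# Back-of-the-envelope check of the figures cited above.
area_km2 = 360          # total area of the Gaza Strip
population = 2_300_000  # approximate population of Gaza
blocks = 620            # numbered blocks on the Israeli army's evacuation map

avg_block_area = area_km2 / blocks          # average block size in km2
avg_block_population = population / blocks  # average residents per block

print(f"{avg_block_area:.2f} km2 per block")              # 0.58 km2 per block
print(f"{avg_block_population:,.0f} residents per block")  # 3,710 residents per block
```

So even a single block holds several thousand people, and an evacuation order covering dozens of blocks displaces tens of thousands at once.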

Apart from using digital maps and QR codes to try to prove to its allies that its army is not reckless, Israel is also boasting about its “precision” military technologies. Among them is an AI targeting system called “Habsora” (“The Gospel”), which can identify targets automatically and far faster than older methods. Where in previous bombing campaigns the Israeli army would manually select 50 targets per day, the new system now generates 100.

According to one source quoted by +972 Magazine, this weapon has turned the Israeli army into a “mass assassination factory”, focusing more on “quantity and not quality”. The magazine reports that the Israeli soldiers using the AI targeting system are aware of the number of civilians they will kill; it is displayed in the “collateral damage” category of the target file.

The Israeli army has categorised thresholds of civilian deaths, ranging from five to the hundreds. The directive “collateral damage five”, for example, means Israeli soldiers are authorised to strike a target even if doing so will also kill five civilians. On the higher end, “the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas military commander”.

Given that Israel considers all 30,000 Hamas members in Gaza as potential targets, this means that “wiping out” the movement would entail a massive civilian death toll. If we use the lowest “collateral damage five”, the most conservative estimate amounts to 150,000 civilians. Another disturbing element of AI is that it reproduces biases it has been trained on. Historically, Israel has shown little regard for civilian life in its bombing. One has to wonder to what extent the secretive AI has learned to associate any Palestinian with “Hamas terrorist” based on past Israeli army behaviour. This might explain why it is able to generate so many new “targets” for bombing.
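The 150,000 figure is a straightforward multiplication under the article's most conservative stated assumption, that every one of the 30,000 targets is struck at the lowest authorised threshold:

```python
# Most conservative estimate cited above, using the lowest
# "collateral damage" threshold the article describes.
hamas_members = 30_000       # all considered potential targets, per the article
civilians_per_target = 5     # "collateral damage five" threshold

conservative_toll = hamas_members * civilians_per_target
print(f"{conservative_toll:,} civilians")  # 150,000 civilians
```

Any target struck under a higher threshold would only push that floor upwards.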

Israel likes to boast about its morality and its high-tech, precision-strike capabilities, ironically as a means of defending itself against claims of indiscriminately attacking civilians and allegations of war crimes. This characterisation of the Israeli army’s technological sophistication is also used by the US to help justify its support for Israel. US Secretary of State Antony Blinken, for example, has stated that “Israel has … one of the most sophisticated militaries in the world. It is capable of neutralising the threat posed by Hamas while minimising harm to innocent men, women and children.”

But the more the US and Israel promote the narrative of Israel’s technological prowess, the more legal jeopardy they create for themselves. As international law professor Michael Schmitt argues, “the greater the precision capabilities of an attacker, the more compelling the characterisation of an attack striking civilians or civilian objects as reckless”. In other words, a high-tech army bears a greater burden to “prove” that it is not being reckless. Given the scale of civilian deaths, the only remaining conclusion is that Israel has precision weapons but is still targeting people indiscriminately. Thus, sophisticated technology, rather than serving its ostensible purpose of precision and precaution, is instead weaponised as a tool of mass killing and destruction. In other words, what we are seeing in Gaza is an AI-assisted genocide.

