Israel’s Habsora AI system uses machine learning algorithms to pick bombing targets and precalculate likely civilian deaths.
By Bianca Baggiarini, ASIA TIMES 10 December 2023
In principle, machine learning systems may enable more precisely targeted attacks and fewer civilian casualties. Photo: AP
Last week, reports emerged that the Israel Defense Forces (IDF) are using an artificial intelligence (AI) system called Habsora (Hebrew for “The Gospel”) to select targets in the war on Hamas in Gaza. The system has reportedly been used to find more targets for bombing, to link locations to Hamas operatives and to estimate likely numbers of civilian deaths in advance.

What does it mean for AI targeting systems like this to be used in conflict? My research into the social, political and ethical implications of military use of remote and autonomous systems shows AI is already altering the character of war.
Militaries use remote and autonomous systems as “force multipliers” to increase the impact of their troops and protect their soldiers’ lives. AI systems can make soldiers more efficient, and are likely to enhance the speed and lethality of warfare – even as humans become less visible on the battlefield, instead gathering intelligence and targeting from afar.

When militaries can kill at will, with little risk to their own soldiers, will the current ethical thinking about war prevail? Or will the increasing use of AI also increase the dehumanization of adversaries and the disconnect between wars and the societies in whose names they are fought?
AI is having an impact at all levels of war, from “intelligence, surveillance and reconnaissance” support, like the IDF’s Habsora system, through to “lethal autonomous weapons systems” that can choose and attack targets without human intervention.