They were going to kill these people whether an AI was involved or not, but it certainly becomes a lot easier when you're just signing off on a decision someone (or something) else already made. That level of abstraction makes certain choices easier. After all, if the system is known to be occasionally wrong and everyone seems to know it, yet you keep using it anyway, is that not some kind of implicit acceptance?
One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.
It also doesn't surprise me that once you've demonized the opposition, it becomes a lot easier to be okay with "casualties" that have nothing to do with your war. How many problematic fathers out there are practically disowned by their children for their shitty beliefs? Even if there were none, it still wouldn't justify killing someone at home just because it's 'easier'.
Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.
All in all, this is great investigative reporting, and it's absolutely tragic that this kind of shit is happening in the world. This piece isn't needed to recognize that a genocide is happening, and it shouldn't detract from that in any way.
As an aside, I also hope it might get people to wake up and realize we need to regulate AI more. Not that regulation is likely to ever stop the military from using AI, but this kind of use should really highlight the potential dangers.