Above photo: Gazan street during Operation Swords of Iron. Yairfridman2003 / Wikimedia Commons.
Technological change, while it helps humanity meet the challenges nature imposes upon us, leads to a paradigm shift: It leaves us less capable, not more, of using our intellectual capacities. It diminishes our minds in the long run. We strive to improve ourselves while risking a regression to the Stone Age if our ever more complex, ever more fragile technological infrastructure collapses.
That is Hans Köchler, an eminent Viennese scholar and president of the International Progress Organization, a globally active think tank, addressing an audience here last Thursday evening, April 4. The date is significant: The day before Köchler spoke, +972 Magazine and Local Call, independent publications in Israel–Palestine, reported that as the Israel Defense Forces press their savage invasion of the Gaza Strip, they deploy an artificial intelligence program called Lavender that so far has marked some 37,000 Palestinians as kill targets. In the early weeks of the Israeli siege, according to the Israeli sources +972 cites, “the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based.”
Chilling it was to hear Köchler speak a couple of news cycles after +972 published these revelations, which are based on confidential interviews with six Israeli intelligence officers who have been directly involved in the use of AI to target Palestinians for assassination. “To use technologies to solve all our problems reduces our ability to make decisions,” Köchler asserted. “We’re no longer able to think through problems. They remove us from real life.”
Köchler titled his talk “The Trivialization of Public Space,” and his topic, broadly stated, was the impact of technologies such as digital communications and AI on our brains, our conduct, and altogether our humanity. It was sobering, to put the point mildly, to recognize that Israel’s siege of Gaza, bottomlessly depraved in itself, is an in-our-faces display of the dehumanizing effects these technologies have on all who depend on them.
Let us look on in horror, and let us see our future in it.
We see in the IDF, to make this point another way, a rupture in morality, human intelligence, and responsibility that opens when human judgment is mediated by the algorithms that run AI systems. There is a break between causality and result, action and consequence. And this is exactly what advanced technologies have in store for the rest of humanity. Artificial intelligence, as Köchler put it, is not intelligence: “It is ‘simulated intelligence’ because it has no consciousness of itself.” It isn’t capable, he meant to say, of moral decision-making or ethical accountability.
In the Lavender case, the data the program produced were accepted as if a human being had generated them, with no actual human oversight or independent verification. A second AI system, sadistically named “Where’s Daddy?”—and how sick is this?—was then used to track Hamas suspects to their homes. The IDF intentionally targeted suspected militants while they were with their families, using unguided missiles, or “dumb” bombs. This strategy had the advantage of enabling Israel to preserve its more expensive precision-guided weapons, or “smart” bombs.
As one of +972’s sources told the magazine:
We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity…. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.
Once Lavender identified a potential suspect, IDF operatives had about 20 seconds to verify that the target was a male before making the decision to strike. There was no other human analysis of the “raw intelligence data.” The information generated by Lavender was treated as if it were “an order,” sources told +972—an official order to kill. Given the strategy of targeting suspects in their homes, the IDF assigned acceptable kill ratios for its bombing campaigns: 20 to 30 civilians for each junior-level Hamas operative. For Hamas leaders with the rank of battalion or brigade commander, +972’s sources said, “the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.”
In other words, Israeli policy, guided and assisted by AI technology, made it inevitable that thousands of civilians, many of them women and children, would be killed.
There appears to be no record of any other military deploying AI programs such as Lavender and Where’s Daddy? But it is sheer naïveté to assume this diabolic use of advanced technologies will not spread elsewhere. Israel is already the world’s leading exporter of surveillance and digital forensic tools. Anadolu, Turkey’s state-run news agency, reported as far back as February that Israel is using Gaza as a weapons-testing site so that it can market these tools as battle-tested. Antony Loewenstein, an author Anadolu quotes, calls this the marketing of “automated murder.”
And here we find ourselves: Haaretz, the Israeli daily, reported on April 5 that “intelligent” weapons proven effective in Gaza were major attractions when Israel marketed them in February at the Singapore Airshow, East Asia’s biggest arms bazaar.
Hans Köchler, who has studied the impact of digital technologies for many years, did not seem to have read the +972 Magazine report before he spoke here last week. This made his remarks all the more disturbing. He was not describing—not specifically—the murderers operating Lavender and other such technologies in Gaza. We will all live and die by these Faustian technologies: This, our common fate, was Köchler’s topic. Over the past six months, this is to say, Israel has put on display the dehumanization that awaits all of us, for AI systems are technologies against which we have little defense. “Self-determination gives way to digital competence,” Köchler said. “We can’t distinguish between virtual reality and reality.”
Along with the +972 report on the use of AI came others in a week notable for its stomach-churning news of Israeli depravity. In its April 3 editions The Guardian revealed that the IDF intentionally deploys snipers and quadcopters—remotely controlled sniper drones—to target children. The evidence of this comes from U.S. and Canadian doctors who, while serving in Gaza, treated many children with wounds consistent with, and easily identified as caused by, snipers’ bullets. These bullets are larger than the ammunition generally used in combat because they are intended to kill rather than wound.
The Biden regime never addresses these barbaric developments, and our corporate media, with rare exceptions such as The Guardian piece just cited, tell us almost nothing of them. Official and media accounts of events in Gaza, their “narratives,” are utterly at odds with these realities. How, we are left to ask, do they get away with these day-in, day-out dishonesties? This was the obvious question last week, given the extremes to which the IDF’s criminality now extends.
If you Google “Lavender” and “The New York Times,” you get “Lavender Oil Might Help You Sleep” and similarly frivolous headlines. Nor has The Times made any mention of the +972 investigation. If you read detailed accounts of the April 1 air attacks on the World Central Kitchen’s three food-delivery vehicles, which killed seven aid workers, the conclusion is inescapable that the Israeli military systematically targeted them, one vehicle to the next, until all three were destroyed—this after WCK had carefully coordinated its deployment of the vehicles with Israeli authorities. These killings are entirely in line with the directive Yoav Gallant, Israel’s repulsive defense minister, issued Oct. 9: “There will be no electricity, no food, no water, no fuel, everything will be closed.”
And what did we read of this incident in mainstream media?
Per usual, the Israeli military was authorized to investigate the Israeli military—an absurdity no U.S. official and no media account questioned. On April 5 the IDF announced that two officers had been dismissed and three others reprimanded for “mishandling critical information.” President Biden declared he was “heartbroken.” The New York Times called the attack “a botched operation,” explaining that the IDF’s top officers “were forced to admit to a string of lethal mistakes and misjudgments.” Over and over we hear the refrain that Israel “is not doing enough to protect civilians.”
So it was a regrettable accident, we are led to conclude. Israel is doing its best. It has all along done its best. Put this against the raw statistic: The IDF has killed more than 220 humanitarian workers since it began its siege last October, to go by the U.N.’s count. How can one possibly believe that these were 220-plus accidents? “Let’s be very clear. This is not an anomaly,” an Oxfam official, Scott Paul, said after the WCK attack. “The killing of aid workers in Gaza has been systemic.”
There is reality and there is meta-reality, a term I have used previously in this space. How do the two stand side by side? How does the latter, the conjured “reality,” prove so efficacious? How do so many accept the 220-plus-accidents “narrative”? Why, more broadly, do so many accept propaganda and lies when they know, subliminally, that they are constantly propagandized and fed lies?
I go back once again to Hans Köchler. In his speech and in various of his many books, he argues that electronic media—television chief among these—have conditioned people to rely for information on pictures and images instead of reading. “They lose the ability to analyze text, and so the ability to understand problems,” he said here. “People come to live in virtual worlds.”
We cannot think of a better description of the “narratives” advanced by the Biden regime and disseminated in corporate media: Those who craft them present us with a virtual world—fully aware that, our minds habituated to pictures and images, most of us will mistake this virtual world for reality, just as Köchler warns. As a member of the audience here put it, “How is it possible to watch a genocide in real time and no one says anything? Knowledge no longer has any value. Anything goes, and if anything goes, nothing goes.”
The Biden regime supplies Israel with weaponry to prosecute its criminal siege of Gaza’s 2.3 million Palestinians. It gives the apartheid state diplomatic cover at the United Nations and legal cover at the International Court of Justice. It distorts and obscures the IDF’s “Stone Age” conduct. All of this requires us to speak now not of Israel’s genocide but of the Israeli–U.S. genocide.
But there is one other dimension, one we must not miss, to the Biden regime’s culpability in inflicting these multiple wounds on humanity. With its incessant attempts to suspend us in a virtual reality of its making, distant from what it is doing in our names, it leads us into the dehumanized, grotesquely technologized future Köchler describes just as surely as the Israelis do as they murder human beings wholesale with AI weapons and kill innocent children with remotely controlled sniper drones.