December 22, 2024
'Automated Assassination': Israel Lets AI Decide Who Dies In Gaza

Authored by Will Porter via The Libertarian Institute

The Israeli military has employed yet another AI-based system to select bombing targets in the Gaza Strip, an investigation by +972 Magazine has revealed. The new system has generated sweeping kill lists condemning tens of thousands of Palestinians, part of the IDF’s growing dependence on AI to plan lethal strikes.

Citing six Israeli intelligence officers, the Tel Aviv-based magazine said the previously undisclosed AI system, dubbed ‘Lavender,’ has played a “central role in the unprecedented bombing” of Gaza since last October, with the military effectively treating its output “as if it were a human decision.”

“Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets,” the outlet reported, adding that “during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants—and their homes—for possible air strikes.”

However, while thousands have been killed in the resulting air raids, the majority were “women and children or people who were not involved in the fighting,” the officers told the magazine, noting that Israeli field commanders often rely on the AI system without consulting more substantial intelligence.

“Human personnel often served only as a ‘rubber stamp’ for the machine’s decisions,” one source said, adding that many commanders spend a mere “20 seconds” reviewing targets before approving strikes—“just to make sure the Lavender-marked target is male.”

Human input has been relegated to such a minor role in the decision-making process that Lavender’s conclusions are often treated as “an order” by Israeli troops, “with no requirement to independently check why the machine made that choice.”

Such decisions are made despite known system errors that misidentify targets in at least 10% of cases. Nonetheless, the AI has “systematically” selected the homes of suspected militants for strikes, with IDF bombings frequently carried out late at night, when entire families are more likely to be present.
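For scale, those figures translate into a large absolute number of misidentified people. Below is a back-of-the-envelope sketch in Python using only the numbers from the +972 report; the calculation itself is ours and purely illustrative.

```python
# Back-of-the-envelope arithmetic using figures from the +972 report.
# Illustrative only -- this calculation is ours, not part of any system.

marked_targets = 37_000  # Palestinians reportedly flagged by Lavender
error_rate = 0.10        # reported misidentification rate ("at least 10%")

misidentified = int(marked_targets * error_rate)
print(f"Implied misidentified targets: {misidentified:,}")  # 3,700
```

At the reported 10% floor, roughly 3,700 of the people flagged by the system would have been misidentified, and the sources suggest the true rate could be higher.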

In targeting lower-level Hamas fighters in the early stages of the war, the military largely relied on unguided ‘dumb bombs,’ concluding it was permissible to “kill up to 15 or 20 civilians” in such operations, the intelligence sources added. Senior militants, meanwhile, could warrant the deaths of “more than 100 civilians” in some cases.

“You don’t want to waste expensive bombs on unimportant people,” one officer said.
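Taken together, the sources describe a blunt, rank-based proportionality rule. The sketch below is a hypothetical reconstruction of that logic; only the numeric ceilings come from the report, while the function name and structure are our own invention.

```python
# Hypothetical reconstruction of the rank-based casualty ceilings the
# sources describe -- not actual IDF code. Only the numbers come from
# the +972 report; names and structure are illustrative assumptions.

CIVILIAN_CEILING = {
    "low_level": 20,  # "up to 15 or 20 civilians" for junior operatives
    "senior": 100,    # "more than 100 civilians" in some cases
}

def within_reported_ceiling(rank: str, expected_civilian_deaths: int) -> bool:
    """True if the reported ceiling for this rank would not be exceeded."""
    return expected_civilian_deaths <= CIVILIAN_CEILING[rank]

print(within_reported_ceiling("low_level", 18))  # True
print(within_reported_ceiling("senior", 120))    # False
```

The point of the sketch is its bluntness: a fixed per-rank number standing in for the case-by-case proportionality analysis that the laws of war require.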

Automated Assassination

Lavender is far from the first AI program used to direct operations for Israel’s military. Another system revealed by +972 Magazine, known as ‘Where’s Daddy?’, has been used “specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.”

An unnamed intelligence officer told the outlet that homes are considered a “first option” for targeting, observing that the IDF is “not interested in killing [Hamas] operatives only when they [are] in a military building or engaged in a military activity.”

As of April 2024, Israeli bombings had damaged or destroyed a staggering 62% of all housing units in Gaza—or nearly 300,000 homes—leaving more than 1 million people internally displaced, according to United Nations estimates. The territory’s housing sector has borne the brunt of the Israeli onslaught, representing well over two-thirds of the destruction in Gaza to date.

Earlier reporting has shed further light on Israel’s AI-driven “mass assassination factory,” with another program, ‘the Gospel,’ used to automatically generate massive target lists at a rate vastly exceeding previous methods. Under the guidance of that tool, Israeli forces have increasingly struck what they call “power targets,” including high-rise residential structures and public buildings. Such attacks are reportedly part of an effort to exert “civil pressure” on Palestinian society—a tactic clearly prohibited under international law as a form of collective punishment.

The IDF has long relied on extensive “target banks” of suspected militant command posts and installations in planning operations in Gaza and the West Bank. In recent years, however, those lists have swelled to include thousands of potential targets as the military outsources decision-making to automated systems.

Adding to the litany of AI programs used to deliver death in Gaza and beyond, Israel’s ‘Fire Factory’ system helps to automatically calculate munitions payloads and assign targets to particular aircraft or drones once they are selected. “What used to take hours now takes minutes, with a few more minutes for human review,” an IDF colonel said of the system in comments to Bloomberg.
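Bloomberg’s description suggests a constraint-matching task: pair each approved target’s required payload with a platform that has the capacity to carry it. Below is a generic first-fit sketch of that class of scheduling problem; it is entirely hypothetical, and no detail reflects the actual system.

```python
# Generic first-fit payload assignment -- the class of scheduling problem
# Bloomberg's description suggests. Entirely hypothetical; no detail here
# reflects the actual 'Fire Factory' system.

from dataclasses import dataclass, field

@dataclass
class Platform:
    name: str
    capacity_kg: int
    assigned: list = field(default_factory=list)

    def load_kg(self) -> int:
        return sum(payload for _, payload in self.assigned)

def assign(targets: list[tuple[str, int]], fleet: list[Platform]) -> None:
    """Give each (target, payload_kg) pair to the first platform
    with enough remaining capacity."""
    for target, payload in targets:
        for platform in fleet:
            if platform.load_kg() + payload <= platform.capacity_kg:
                platform.assigned.append((target, payload))
                break

fleet = [Platform("drone-1", 500), Platform("jet-1", 2000)]
assign([("t1", 250), ("t2", 400), ("t3", 1000)], fleet)
for p in fleet:
    print(p.name, p.assigned)
# drone-1 [('t1', 250)]
# jet-1 [('t2', 400), ('t3', 1000)]
```

First-fit is the simplest possible heuristic; a real planner would also weigh munition type, timing and routing, which is presumably the optimization work the colonel says dropped from hours to minutes.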

AI-powered facial recognition and related technologies have similarly taken on a greater role in policing the border between the occupied territories and Israel proper, as well as West Bank checkpoints, with the IDF deploying an array of new systems to identify, surveil and arrest Palestinians in recent years.
