Sources: Israel's bombing campaign in Gaza used Lavender, an AI system that identified 37,000 potential human targets based on their apparent links to Hamas
Israeli intelligence sources reveal use of ‘Lavender’ system in Gaza war and claim permission given to kill civilians in pursuit of low-ranking militants
The Guardian
Related Coverage
- ‘Lavender’: The AI machine directing Israel's bombing spree in Gaza +972 Magazine · Yuval Abraham
- This AI is helping Israel in Gaza conflict and is more brutal than humans Business Today
- Israeli Military Using AI to Select Targets in Gaza With ‘Rubber Stamp’ From Human Operator: Report Gizmodo · Matt Novak
- Israel's secret ‘Lavender’ AI used for Gaza kill lists, report claims The i Paper · Kieron Monks
- IDF Allowed 100 Civilian Deaths for Every Hamas Official Targeted by Error-Prone AI System ScheerPost
- IDF denies it uses AI software to target individuals in Gaza bombing campaigns SiliconANGLE · James Farrell
- How Does AI Tech Influence Military Decision-Making? The Harsh Realities of the Israel-Palestine War Cryptopolitan · Aamir Sheikh
- Israeli military's use of AI to generate targets under spotlight after aid workers' killing in Gaza FRANCE 24 English on YouTube
- Israeli ‘AI secret weapon dubbed Lavender’ is revealed after ‘coldly identifying 37,000 Hamas targets to strike’ The Sun · Jerome Starkey
- IDF denies report that it's using AI to build list of 37,000 targets based on Hamas ties The Times of Israel · Jacob Magid
- Gaza Conflict: Israel used AI to strike thousands of Hamas targets Firstpost
- The Guardian: Israel used A.I. to target 37,000 people in Gaza. Daily Kos
- Israel Defence Forces' response to claims about use of ‘Lavender’ AI database in Gaza The Guardian
- “Israel's use of powerful AI systems in its war on Hamas has entered uncharted territory for advanced warfare, raising a host of legal and moral questions, and transforming the relationship between military personnel and machines. … @remixtures@tldr.nettime.org · Miguel Afonso Caetano
- Additional automated systems, including one called “Where's Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family's residences. — https://www.972mag.com/... @Khrys@mamot.fr
- Just a caution - 972's lavender scoop could still be used to shield political and military commanders from accountability. I would hate to see only devs and dogtags in the dock because they built the machine that directed the death. … @anilmc@hachyderm.io
- This is twice in a week that I've been shocked more than the Snowden docs: — https://www.972mag.com/... This is contact chaining from then, but with loosey-goosey statistics and a program called “Where's Daddy?” (more horrible than you can imagine). — All this was horrifying a decade ago and it has gotten worse. @seriouslyjeff@social.jeffl.es · Jeff Larson
- AI tech proponents are not asking clearly enough why these tools are being developed, what world they are trying to create. The Palestine / Lavender thing is the world they are trying to create: doing the same bad shit, but faster and worse, with less human compassion @makingarecord@friend.camp
- Like probably almost everyone else on here I'm just sitting in shock after reading the +972 piece on Lavender — It's not like I believed we'd never get here but I thought we were probably at least a couple years out from machine-learning-as-warcrimes- automator @left_adjoint@tilde.zone · Clarissa
- jesus h fucking christ — #lavender - the military targeting AI Israel's been using to identify suspected operatives, including “low level” ones, and bomb them /in their homes with their families/. — god almighty the previous system was called “The Gospel”??!?! — https://www.972mag.com/... @risottobias@tech.lgbt
- while we're talking about so-called Israel using AI to decide who to murder; i'd like to remind people that this project wasn't always called “Lavender”, it used to be called “The Gospel” ("Habsora" in Hebrew) and i can't tell you how sickening i find that @jesopo@chaos.social
- Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.” … @Khrys@mamot.fr
- There's a lot of talk on AI here on Mastodon. You think it'll affect your work. Other people lose their lives to it. — Is everyone following the story of Israel's AI-powered killing machinery for Palestinians? It literally doesn't get any more dystopian than that. … @yanone@typo.social
- “The result [of the deployment of Lavender], as the sources testified, is that thousands of Palestinians — most of them women and children … Ana Brandusescu
- “During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender's kill lists, with no requirement to thoroughly check … Sahar Aziz
- Israel's ‘Lavender’ AI used for Gaza kill lists, report claims Hacker News
- ‘Lavender’: The AI machine directing Israel's bombing in Gaza Hacker News
Discussion
-
@technicallymims
Christopher Mims
on threads
New investigation says Israel is using an AI-based program to rapidly target its bombs: “according to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine ‘as if it were a human decision.’” https://www.972mag.c…
-
@JapanProf@mastodon.social
on mastodon
I refuse to live in a dystopian world where AI constantly keeps an eye on all of us, drones circle above our heads and #lavender takes us out along with our friends and family at any moment. #Gazans have been living in that dystopia. It's already their reality. …
-
@ErikJonker@mastodon.social
Erik Jonker
on mastodon
Hair-raising article about the operational use of “AI” in Gaza by the Israeli army. Terrible and must-read. — https://www.972mag.com/... #Gaza #Israel #AI #Hamas #Lavender
-
@shentonfreude@mastodon.online
on mastodon
The scope of the wanton brutality in this is unbelievable, more like dystopian sci-fi. ‘Lavender’ #AI automates killing with no checks on veracity. Chilling. “for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians …
-
@bpettis@mastodon.benpettis.ninja
Ben Pettis
on mastodon
holy fuck — “The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead.” — “The sources said they did not know how many civilians were actually killed in eac…
-
@reedmideke@mastodon.social
Reed Mideke
on mastodon
The “AI” sound a lot like the ad targeting tech that lets you build “lookalikes” from your existing customer base: “it is fed data about existing Hamas operatives, it learns to notice their features, and then it rates other Palestinians based on how similar they are to the milita…
-
@65dBnoise@mastodon.social
on mastodon
“I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.” — 20 seconds to rubber-stamp the execution of a human. Thousands of them children and …
-
@reedmideke@mastodon.social
Reed Mideke
on mastodon
Also more confirmation that any fighting age male is de-facto assumed to be a fighter: “sources said that the only human supervision protocol in place before bombing the houses of suspected “junior” militants marked by Lavender was to conduct a single check: ensuring that the AI-…
-
@Mer__edith@mastodon.world
Meredith Whittaker
on mastodon
I have a lot more to say, but I'll hold it for now and simply wonder aloud... Which BigTech clouds are the “Lavender” & “Where's Daddy?” AI systems running on? What APIs are they using? Which libraries are they calling? — What work did my former colleagues, did I, did *you*…
-
@saddestrobots.bsky.social
Alex P.
on bluesky
back of the envelope time! — 37,000 targeted for execution by bombing — if IDF says it's ok to kill 15 to 20 civilians per attack, that's up to 750,000 dead if they bomb them all — the entire population of gaza is about 2,500,000 people [embedded post]
-
@nickkristof
Nicholas Kristof
on x
An extraordinary piece in Israel's @972mag reporting on the use by Israeli forces of deeply flawed AI processes to choose targets in Gaza and then destroy them with dumb bombs that pretty much guaranteed enormous numbers of civilian casualties.
-
@psychicyogamat
Bernard Keenan
on x
How to ‘proportionately’ kill everyone. Contact chaining, indexing the value of a target to the cost of a bomb, 20+ ‘collateral’ deaths per machine-generated target, targets produced to meet demand, an algorithmic genocide. https://www.theguardian.com/ ...
-
@mehdirhasan
Mehdi Hasan
on x
“Because we usually carried out the attacks with dumb bombs, & that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don't care - you immediately move on to the next target. You have another 36,000 waiting.” https://www.theguardian.…
-
@btselem
on x
Another striking part of the investigation deals with the extremely lenient policy re collateral Damage: “Every person who wore a Hamas uniform in the past year or two could be bombed with 20 [civilians killed as] collateral damage, even without special permission,” A. continued.
-
@paulbernaluk
Prof Paul Bernal
on x
Just to be clear: AI isn't going to *create* disasters, it's going to enable bad people to create disasters. And already is.
-
@dirkmoses
Dirk Moses
on x
The Where's Daddy? program that allowed missile operators to wait till targets arrived at home (where it is easier to kill them) entailed killing their families at a 15-20 to 1 ratio, far above usually accepted proportionality principles. 1/
-
@spectrathegame
on x
Really explosive and horrific beyond words- you should read this article [image]
-
@pixelatedboat
on x
When they say civilians die because Hamas is hiding behind civilians what they mean is Hamas members are living in houses with their families [image]
-
@mitchprothero
Mitchell Prothero
on x
I don't have a news reporting job _DMs are open_ so I'll just tweet that three NATO/EU intelligence/security officials confirm this story as ‘clearly the case’ on background. and they've been tracking/monitoring data sets for months. https://www.972mag.com/...
-
@bonzerbarry
on x
This is the end result of Israel's monstrous cyber program. “Cyber weapons have changed international relations more profoundly than any advance since the advent of the atomic bomb” - Ronen Bergman and Mark Mazzetti for the NYT. [image]
-
@ugarles
on x
This is the real promise of AI: displaced accountability for self-interested decisions that have negative consequences for others.
-
@hamzamsyed
Hamza M Syed
on x
Returning to this Intercept article about Google's role in Israel's AI programme; in light of “Lavender” and “The Gospel” https://theintercept.com/... [image]
-
@provisionalidea
James Rosen-Birch
on x
Reminder that even pieces published by +972 must be approved by the Israeli military censor, and must be read with the same critical eye. (quote from 2016) [image]
-
@masadfrost
Faris Masad
on x
Israel probably has pretty good AI, they don't have an accuracy problem. If they use it, they do for obfuscation/PR. They use massive bombs that destroy whole blocks and neighborhoods, or specifically target healthcare and aid facilities and workers, total civilian infra collapse
-
@bcmerchant
Brian Merchant
on x
The AI is not terrifying because it's too powerful, but because it lets operatives defer responsibility to the system, and lets leaders use it to justify nearly any level of violence they already desired to undertake—AI is terrifying because it's an enabler *of* the powerful.
-
@arictoler
Aric Toler
on x
https://www.972mag.com/... [image]
-
@brian_castner
Brian Castner
on x
This reporting would sound alarmist and hyperbolic except that this tracks exactly with the cases we've investigated over the last 6 months.
-
@heidykhlaaf
Dr Heidy Khlaaf
on x
This is exactly what I mean when I say that the use of AI in the military is no different from indiscriminate bombing (or worse). To exploit the bias embedded within these systems, to then subsequently execute thousands of families based on these very biases, is beyond immoral.
-
@evanhill
Evan Hill
on x
Lavender and the policies implemented around it led the IDF to target alleged Hamas fighters even at the lowest level, to specifically strike them after they'd entered their family homes, and to authorize 15-20 civilian deaths per strike as collateral damage, +972 reports: [image…
-
@evanhill
Evan Hill
on x
The IDF authorized Lavender two weeks into the war after a check showed 90% accuracy identifying Hamas affiliation, +972 reports. The “brutal” tactic of striking homes that used to be limited to senior commanders and require complex investigation was now delegated to AI. [image]
-
@mrjoncryer
Jon Cryer
on x
I'm guessing nobody told these guys about Skynet
-
@thegreenebj
Bryce Greene
on x
This is easily the most bone-chilling article I've read in a long time. Naming an AI system that kills people in their homes “Where's Daddy?” is just evil Nazi shit. This is the dystopia we all had nightmares about, but Israel embraced it and made it a reality.
-
@madamajab
on x
This entire article is a horror show. [image]
-
@jeremiahdjohns
Jeremiah Johnson
on x
This is absolutely horrifying. Knowingly killing hundreds of civilians to get a single target is a war crime. More disturbing is that the Israeli sources basically don't seem ashamed of it in the slightest. [image]
-
@hermit_hwarang
on x
“The army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians...the army authorized the killing of more than 100 civilians in the assassination of a single commander.”
-
@mer__edith
Meredith Whittaker
on x
AI IDing human targets in Gaza based on probabilistic stereotypes (37K people!). People familiar w/AI know how inaccurate such assessments are. Targets' homes are then bombed at night when they (their family, neighbors, pets) are most likely to be home https://www.972mag.com/... …
-
@rohantalbot
Rohan Talbot
on x
A few years ago we were talking about how Israel's use and abuse of AI-powered security tech was creating perpetual automated occupation. Then @amnesty evidenced Israel's automated apartheid. Now we are in the realms of automated genocide. The dystopia we all feared most.
-
@schock
Sasha Costanza-Chock
on x
This horrific article about the IDF kill list AI system called ‘lavender’ is an absolute must read for everyone who says they care about AI harms. Read the whole thing: https://www.972mag.com/...
-
@charlie533080
Charlie Herbert
on x
“The army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians.” And some people still think the conduct of this campaign is legitimate? Insane. https://www.972mag.com/...
-
@evanhill
Evan Hill
on x
The Israeli military is using an AI targeting program called Lavender that tagged around 37,000 Gazans as suspected militants, has around a 10% error rate, and led to systematic targeting of suspects in their family homes, +972 Magazine reports: https://www.972mag.com/...
-
@rnaudbertrand
Arnaud Bertrand
on x
This also raises immense questions on AI. We're having discussions on whether this or that AI may or may not be politically correct... and all the while Israel uses AI to automatize genocide. Sounds like the use of AI for military purposes is a much more urgent topic, and it...
-
@jakegodin
Jake Godin
on x
Dystopian read, from @yuval_abraham. “One source stated that human personnel often served only as a ‘rubber stamp’ for the decisions ... they would devote only about ‘20 seconds’ to each target before authorizing a bombing—just to make sure the Lavender-marked target is male.”
-
@tksshawa
Tariq Kenney-Shawa
on x
“Automated systems, including one called “Where's Daddy?” were used specifically to track the targeted individuals and carry out bombings when they had entered their family's residences.” Sickening. https://www.972mag.com/...
-
@yuval_abraham
Yuval Abraham
on x
I spoke with Israeli intelligence officers about the AI-based target machine they used which marked 37,000 Gazans as suspects for assassination. These whistleblowers expose numerous machines & policies that killed thousands of civilians since October. https://www.972mag.com/...
-
@rnaudbertrand
Arnaud Bertrand
on x
This is without a doubt one of the most important pieces of reporting on Gaza, and by far one of the most disturbing: https://www.972mag.com/... All by Israeli journalist @yuval_abraham based on whistleblower accounts from within the IDF and intelligence agencies. Israel has...
-
@bcmerchant
Brian Merchant
on x
This story is horrific, and basically confirms every one of the fears we had back in October when we knew the IDF was using AI—it's error-prone, there's scant human oversight, and it facilitates the mass rubber stamping of targets, of mass killing. https://www.972mag.com/...
-
@adhaque110
Adil Haque
on x
The nightmare of every international humanitarian lawyer come to life. [image]
-
@rosen_br
Brianna Rosen
on x
The IDF response to news about its AI “Lavender” system is hardly reassuring. Confirms Israeli intelligence is using AI tools for target identification, but not what review processes are in place to ensure thorough human vetting and accountability. https://www.theguardian.com/ ..…
-
@972mag
on x
‘Lavender’: The AI machine directing Israel's bombing spree in Gaza. An investigation by @yuval_abraham, in partnership with @mekomit. https://www.972mag.com/...
-
@robertdownen_
Robert Downen
on x
Stunning and must-read report on Lavender, an AI machine that is deciding where the Israeli military bombs. In the first weeks of the war, Lavender clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.
-
@ggreenwald
Glenn Greenwald
on x
Israel's “unprecedented” calculations for how many Palestinian civilians the IDF is authorized to kill in order to kill a single suspected Hamas operative: 10-15 civilians for every junior operative. 100 civilians for a single “senior” commander. https://www.972mag.com/...
-
@arictoler
Aric Toler
on x
I want to screenshot basically the entire article, but this part in particular, wow. [image]
-
@evkontorovich
Eugene Kontorovich
on x
Fascinating story about Israel's use of AI to select targets in Gaza battlefield - system has accuracy of over 90%, impressive given it is used against illegal combatants w/out uniforms. The article seems to suggest the system is problematic, but I would think 90% accuracy in...
-
@elivalley
on x
Israel used an automated system called “Where's Daddy?” to ensure it killed AI-designated targets while the targets were at home with their spouses and children: https://www.972mag.com/...
-
@yanisvaroufakis
Yanis Varoufakis
on x
Have they lost their minds, along with their humanity? “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.” https://www.theguardian.com/ ...
-
@arictoler
Aric Toler
on x
The AI system also targeted minors https://www.972mag.com/... [image]
-
r/Foodforthought
on reddit
‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
-
r/ABoringDystopia
on reddit
‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
-
r/technews
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
-
r/PrepperIntel
on reddit
" The machine did it coldly': Israel used AI to identify 37,000 Hamas targets
-
r/Israel_Palestine
on reddit
‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
-
r/DemocraticSocialism
on reddit
‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
-
r/boringdystopia
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
-
r/OrphanCrushingMachine
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
-
r/worldevents
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
-
r/internationalpolitics
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
-
r/technology
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets • Israeli intelligence sources reveal use of ‘Lavender’ system …
-
r/metalgearsolid
on reddit
War has changed. Israel used AI to identify 37,000 Hamas targets.
-
r/PLTR
on reddit
" The machine did it coldly': Israel used AI to identify 37,000 Hamas targets
-
r/stupidpol
on reddit
‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
-
r/france
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war
-
r/theworldnews
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war | The Guardian
-
r/LateStageCapitalism
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
-
r/singularity
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war | The Guardian
-
r/TrueAnon
on reddit
“Additional automated systems, including one called “Where's Daddy?” also revealed here for the first time, were used specifically to track …
-
r/Futurology
on reddit
" The machine did it coldly': Israel used AI to identify 37,000 Hamas targets
-
r/VaushV
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
-
r/InternationalNews
on reddit
‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
-
r/InternationalNews
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
-
r/worldnews
on reddit
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war | The Guardian