
Chronicles

The story behind the story


Israel's bombing campaign in Gaza used Lavender, an AI system that identified 37,000 potential human targets based on their apparent links to Hamas

Israeli intelligence sources reveal use of ‘Lavender’ system in Gaza war and claim permission given to kill civilians in pursuit of low-ranking militants

The Guardian

Discussion

  • @technicallymims Christopher Mims on threads
    New investigation says Israel is using an AI-based program to rapidly target its bombs “according to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.” https://www.972mag.c…
  • @JapanProf@mastodon.social on mastodon
    I refuse to live in a dystopian world where AI constantly keeps an eye on all of us, drones circle above our heads and #lavender takes us out along with our friends and family at any moment.  #Gazans have been living in that dystopia.  It's already their reality. …
  • @ErikJonker@mastodon.social Erik Jonker on mastodon
    Hair-raising article about the operational use of “AI” in Gaza by the Israeli army.  Terrible and must-read.  —  https://www.972mag.com/...  #Gaza #Israel #AI #Hamas #Lavender
  • @shentonfreude@mastodon.online on mastodon
    The scope of the wanton brutality in this is unbelievable, more like dystopian sci fi.  ‘Lavender’ #AI automates killing with no checks on veracity.  Chilling. “for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians …
  • @bpettis@mastodon.benpettis.ninja Ben Pettis on mastodon
    holy fuck  —  “The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead.”  —  “The sources said they did not know how many civilians were actually killed in eac…
  • @reedmideke@mastodon.social Reed Mideke on mastodon
    The “AI” sound a lot like the ad targeting tech that lets you build “lookalikes” from your existing customer base: “it is fed data about existing Hamas operatives, it learns to notice their features, and then it rates other Palestinians based on how similar they are to the milita…
  • @65dBnoise@mastodon.social on mastodon
    “I would invest 20 seconds for each target at this stage, and do dozens of them every day.  I had zero added-value as a human, apart from being a stamp of approval.  It saved a lot of time.”  —  20 seconds to rubber-stamp the execution of a human.  Thousands of them children and …
  • @reedmideke@mastodon.social Reed Mideke on mastodon
    Also more confirmation that any fighting age male is de-facto assumed to be a fighter: “sources said that the only human supervision protocol in place before bombing the houses of suspected “junior” militants marked by Lavender was to conduct a single check: ensuring that the AI-…
  • @Mer__edith@mastodon.world Meredith Whittaker on mastodon
    I have a lot more to say, but I'll hold it for now and simply wonder aloud...  Which BigTech clouds are the “Lavender” & “Where's Daddy?”  AI systems running on?  What APIs are they using?  Which libraries are they calling?  —  What work did my former colleagues, did I, did *you*…
  • @saddestrobots.bsky.social Alex P. on bluesky
    back of the envelope time!  —  37,000 targeted for execution by bombing  —  if IDF says it's ok to kill 15 to 20 civilians per attack, that's up to 750,000 dead if they bomb them all  —  the entire population of gaza is about 2,500,000 people [embedded post]
  • @nickkristof Nicholas Kristof on x
    An extraordinary piece in Israel's @972mag reporting on the use by Israeli forces of deeply flawed AI processes to choose targets in Gaza and then destroy them with dumb bombs that pretty much guaranteed enormous numbers of civilian casualties.
  • @psychicyogamat Bernard Keenan on x
    How to ‘proportionately’ kill everyone. Contact chaining, indexing the value of a target to the cost of a bomb, 20+ ‘collateral’ deaths per machine-generated target, targets produced to meet demand, an algorithmic genocide. https://www.theguardian.com/ ...
  • @mehdirhasan Mehdi Hasan on x
    “Because we usually carried out the attacks with dumb bombs, & that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don't care - you immediately move on to the next target. You have another 36,000 waiting.” https://www.theguardian.…
  • @btselem on x
    Another striking part of the investigation deals with the extremely lenient policy re collateral damage: “Every person who wore a Hamas uniform in the past year or two could be bombed with 20 [civilians killed as] collateral damage, even without special permission,” A. continued.
  • @paulbernaluk Prof Paul Bernal on x
    Just to be clear: AI isn't going to *create* disasters, it's going to enable bad people to create disasters. And already is.
  • @dirkmoses Dirk Moses on x
    The Where's Daddy? program that allowed missile operators to wait till targets arrived at home (where it is easier to kill them) entailed killing their families at a 15-20 to 1 ratio, far above usually accepted proportionality principles. 1/
  • @spectrathegame on x
    Really explosive and horrific beyond words- you should read this article [image]
  • @pixelatedboat on x
    When they say civilians die because Hamas is hiding behind civilians what they mean is Hamas members are living in houses with their families [image]
  • @mitchprothero Mitchell Prothero on x
    I don't have a news reporting job _DMs are open_ so I'll just tweet that three NATO/EU intelligence/security officials confirm this story as ‘clearly the case’ on background. and they've been tracking/monitoring data sets for months. https://www.972mag.com/...
  • @bonzerbarry on x
    This is the end result of Israel's monstrous cyber program. “Cyber weapons have changed international relations more profoundly than any advance since the advent of the atomic bomb” - Ronen Bergman and Mark Mazzetti for the NYT. [image]
  • @ugarles on x
    This is the real promise of AI: displaced accountability for self-interested decisions that have negative consequences for others.
  • @hamzamsyed Hamza M Syed on x
    Returning to this Intercept article about Google's role in Israel's AI programme; in light of “Lavender” and “The Gospel” https://theintercept.com/... [image]
  • @provisionalidea James Rosen-Birch on x
    Reminder that even pieces published by +972 must be approved by the Israeli military censor, and must be read with the same critical eye. (quote from 2016) [image]
  • @masadfrost Faris Masad on x
    Israel probably has pretty good AI, they don't have an accuracy problem. If they use it, they do for obfuscation/PR. They use massive bombs that destroy whole blocks and neighborhoods, or specifically target healthcare and aid facilities and workers, total civilian infra collapse
  • @bcmerchant Brian Merchant on x
    The AI is not terrifying because it's too powerful, but because it lets operatives defer responsibility to the system, and lets leaders use it to justify nearly any level of violence they already desired to undertake—AI is terrifying because it's an enabler *of* the powerful.
  • @arictoler Aric Toler on x
    https://www.972mag.com/... [image]
  • @brian_castner Brian Castner on x
    This reporting would sound alarmist and hyperbolic except that this tracks exactly with the cases we've investigated over the last 6 months.
  • @heidykhlaaf Dr Heidy Khlaaf on x
    This is exactly what I mean when I say that the use of AI in the military is no different from indiscriminate bombing (or worse). To exploit the bias embedded within these systems, to then subsequently execute thousands of families based on these very biases, is beyond immoral.
  • @evanhill Evan Hill on x
    Lavender and the policies implemented around it led the IDF to target alleged Hamas fighters even at the lowest level, to specifically strike them after they'd entered their family homes, and to authorize 15-20 civilian deaths per strike as collateral damage, +972 reports: [image…
  • @evanhill Evan Hill on x
    The IDF authorized Lavender two weeks into the war after a check showed 90% accuracy identifying Hamas affiliation, +972 reports. The “brutal” tactic of striking homes that used to be limited to senior commanders and require complex investigation was now delegated to AI. [image]
  • @mrjoncryer Jon Cryer on x
    I'm guessing nobody told these guys about Skynet
  • @thegreenebj Bryce Greene on x
    This is easily one of the most bone-chilling articles I've read in a long time. Naming an AI system that kills people in their homes “Where's Daddy?” is just evil Nazi shit. This is the dystopia we all had nightmares about but Israel embraced it and made it a reality.
  • @madamajab on x
    This entire article is a horror show. [image]
  • @jeremiahdjohns Jeremiah Johnson on x
    This is absolutely horrifying. Knowingly killing hundreds of civilians to get a single target is a war crime. More disturbing is that the Israeli sources basically don't seem ashamed of it in the slightest. [image]
  • @hermit_hwarang on x
    “The army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians...the army authorized the killing of more than 100 civilians in the assassination of a single commander.”
  • @mer__edith Meredith Whittaker on x
    AI IDing human targets in Gaza based on probabilistic stereotypes (37K people!). People familiar w/AI know how inaccurate such assessments are. Targets' homes are then bombed at night when they (their family, neighbors, pets) are most likely to be home https://www.972mag.com/... …
  • @rohantalbot Rohan Talbot on x
    A few years ago we were talking about how Israel's use and abuse of AI-powered security tech was creating perpetual automated occupation. Then @amnesty evidenced Israel's automated apartheid. Now we are in the realms of automated genocide. The dystopia we all feared most.
  • @schock Sasha Costanza-Chock on x
    This horrific article about the IDF kill list AI system called ‘lavender’ is an absolute must read for everyone who says they care about AI harms. Read the whole thing: https://www.972mag.com/...
  • @charlie533080 Charlie Herbert on x
    “The army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians.” And some people still think the conduct of this campaign is legitimate? Insane. https://www.972mag.com/...
  • @evanhill Evan Hill on x
    The Israeli military is using an AI targeting program called Lavender that tagged around 37,000 Gazans as suspected militants, has around a 10% error rate, and led to systematic targeting of suspects in their family homes, +972 Magazine reports: https://www.972mag.com/...
  • @rnaudbertrand Arnaud Bertrand on x
    This also raises immense questions on AI. We're having discussions on whether this or that AI may or may not be politically correct... and all the while Israel uses AI to automatize genocide. Sounds like the use of AI for military purposes is a much more urgent topic, and it...
  • @jakegodin Jake Godin on x
    Dystopian read, from @yuval_abraham. “One source stated that human personnel often served only as a ‘rubber stamp’ for the decisions ... they would devote only about ‘20 seconds’ to each target before authorizing a bombing—just to make sure the Lavender-marked target is male.”
  • @tksshawa Tariq Kenney-Shawa on x
    “Automated systems, including one called “Where's Daddy?” were used specifically to track the targeted individuals and carry out bombings when they had entered their family's residences.” Sickening. https://www.972mag.com/...
  • @yuval_abraham Yuval Abraham on x
    I spoke with Israeli intelligence officers about the AI-based target machine they used which marked 37,000 Gazans as suspects for assassination. These whistleblowers expose numerous machines & policies that killed thousands of civilians since October. https://www.972mag.com/...
  • @rnaudbertrand Arnaud Bertrand on x
    This is without a doubt one of the most important pieces of reporting on Gaza, and by far one of the most disturbing: https://www.972mag.com/... All by Israeli journalist @yuval_abraham based on whistleblower accounts from within the IDF and intelligence agencies. Israel has...
  • @bcmerchant Brian Merchant on x
    This story is horrific, and basically confirms every one of the fears we had back in October when we knew the IDF was using AI—it's error-prone, there's scant human oversight, and it facilitates the mass rubber stamping of targets, of mass killing. https://www.972mag.com/...
  • @adhaque110 Adil Haque on x
    The nightmare of every international humanitarian lawyer come to life. [image]
  • @rosen_br Brianna Rosen on x
    The IDF response to news about its AI “Lavender” system is hardly reassuring. Confirms Israeli intelligence is using AI tools for target identification, but not what review processes are in place to ensure thorough human vetting and accountability. https://www.theguardian.com/ ..…
  • @972mag on x
    ‘Lavender’: The AI machine directing Israel's bombing spree in Gaza. An investigation by @yuval_abraham, in partnership with @mekomit. https://www.972mag.com/...
  • @robertdownen_ Robert Downen on x
    Stunning and must-read report on Lavender, an AI machine that is deciding where the Israeli military bombs. In the first weeks of the war, Lavender clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.
  • @ggreenwald Glenn Greenwald on x
    Israel's “unprecedented” calculations for how many Palestinian civilians the IDF is authorized to kill in order to kill a single suspected Hamas operative: 10-15 civilians for every junior operative. 100 civilians for a single “senior” commander. https://www.972mag.com/...
  • @arictoler Aric Toler on x
    I want to screenshot basically the entire article, but this part in particular, wow. [image]
  • @evkontorovich Eugene Kontorovich on x
    Fascinating story about Israel's use of AI to select targets in Gaza battlefield - system has accuracy of over 90%, impressive given it is used against illegal combatants w/out uniforms. The article seems to suggest the system is problematic, but I would think 90% accuracy in...
  • @elivalley on x
    Israel used an automated system called “Where's Daddy?” to ensure it killed AI-designated targets while the targets were at home with their spouses and children: https://www.972mag.com/...
  • @yanisvaroufakis Yanis Varoufakis on x
    Have they lost their minds, along with their humanity? “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.” https://www.theguardian.com/ ...
  • @arictoler Aric Toler on x
    The AI system also targeted minors https://www.972mag.com/... [image]
  • r/Foodforthought on reddit
    ‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
  • r/ABoringDystopia on reddit
    ‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
  • r/technews on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/PrepperIntel on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/Israel_Palestine on reddit
    ‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
  • r/DemocraticSocialism on reddit
    ‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
  • r/boringdystopia on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/OrphanCrushingMachine on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/worldevents on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/internationalpolitics on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/technology on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/metalgearsolid on reddit
    War has changed. Israel used AI to identify 37,000 Hamas targets.
  • r/PLTR on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/stupidpol on reddit
    ‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
  • r/france on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war
  • r/theworldnews on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war | The Guardian
  • r/LateStageCapitalism on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/singularity on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war | The Guardian
  • r/TrueAnon on reddit
    “Additional automated systems, including one called “Where's Daddy?” also revealed here for the first time, were used specifically to track …
  • r/Futurology on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/VaushV on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/InternationalNews on reddit
    ‘Lavender’: The AI machine directing Israel's bombing spree in Gaza
  • r/InternationalNews on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
  • r/worldnews on reddit
    ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war | The Guardian