
Chronicles

The story behind the story


Ashley St. Clair, the mother of one of Elon Musk's children, sues xAI, alleging Grok refused to stop making sexualized deepfakes of her, amid custody disputes

Elon Musk baby mama Ashley St. Clair is suing his AI platform, Grok, for refusing to stop making sexually explicit deepfake images of her …

New York Post · Peter Senzamici

Discussion

  • Page Six Tamantha Ryan on x
    Ashley St. Clair claps back at Elon Musk after he threatens to sue for full custody of son
  • @X X on x
    @Grok Account Image Generation Updates
  • @safety @safety on x
    @Grok Account Image Generation Updates
  • @yacinemtb Kache on x
    you need to apply constant pressure on social media websites through the state, or they will do awful shit like letting people generate pornography of others (underage or otherwise) with one click they would have never removed the feature if they weren't threatened
  • @theobertram Theo Bertram on x
    X has blocked anyone from taking someone else's picture and putting them in a bikini, which is a good step. In countries, like the UK, users will also be ‘geoblocked’ (presumably restricted by IP address, account info) from creating deepfake bikini images.
  • @adrianweckler Adrian Weckler on x
    Big change from Grok. Will “geoblock the ability of all users to generate images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in jurisdictions where it's illegal.” = all of Europe. Plus: image generation now only for paid subs
  • @helios_movement George Ferman on x
    I had no clue enough people did this in order for X to make a post about it. So why don't you simply get their data and put them in jail? Please someone explain why. *They already have your data.
  • @lyeuhm Captain Autismo on x
    just fucking block it everywhere you fucking morons [image]
  • @digiphile Alexander B. Howard on x
    Is this your policy, @ElonMusk @nikitabier? Will you have “zero tolerance” for @grok or other @X features generating CSAM or non-consensual nudity of minors? Or other humans, whether they're creators, consumers, or press, politicians, & activists you dislike or disagree with? [im…
  • @loudmouthjulia Julia Alexander on x
    X's safety team addressing CSAM issues: says Grok will no longer comply with requests to put people in bikinis (I assume other undergarments too) and photo editing/generation restricted to paid tier. Though latter feels like less of a safety update, more of an opportunity taken.
  • @mweinbach Max Weinbach on x
    Good This took way too long imo
  • @jdpoc John O'Connell on x
    So, threatened by the UK ... @Grok blinked first. And quickly. Still not enough, of course. And it will happen again. And we'll get the same ‘It was a rogue software update’ excuses again. [image]
  • @druce.ai @druce.ai on bluesky
    Verge tests found X's Grok still produced revealing deepfakes of real people despite X's geoblocking and paid-subscriber limits, prompting a UK Ofcom investigation and raising legal and reputational risks as X and Elon Musk cited user prompts and adversarial exploits.
  • @ashleybelanger Ashley Belanger on bluesky
    looks like California's AG may have broken Musk's resolve to refuse to block Grok's gross outputs.  X Safety says it blocked everyone including paid X users and Grok app users from nudifying images: arstechnica.com/tech-policy/ ...
  • @davidho David Ho on bluesky
    It's a matter of time before Tesla automatically generates non-consensual images of people in revealing clothing as you drive past them.
  • @campuscodi.risky.biz Catalin Cimpanu on bluesky
    After almost any privacy watchdog worth its salt has started an investigation, Twitter is finally disabling Grok's ability to generate sexualized images  —  x.com/Safety/artic...
  • @tatumhunter Tatum Hunter on bluesky
    X says Grok account will stop undressing women.  I just tested the standalone Grok app, which immediately complied with my request to undress a photo of me.  This is illegal, according to the legal experts I spoke with.  [image]
  • @parismarx.com Paris Marx on bluesky
    X is claiming it's restricted the ability to generate explicit deepfakes with Grok, and plenty of media outlets are parroting it without checking.  —  But it's still easy to get Grok to generate the very same “undressing” photos causing this scandal in the first place.
  • r/musked on reddit
    Grok was finally updated to stop undressing women and children, X Safety says |  California's AG will investigate whether Musk's nudifying bot broke US laws.
  • @stevanzetti Steven Monacelli on x
    So that's what the terms of service changes were about huh
  • @currentrevolt Tony Ortiz on x
    X AI has sued Ashley St. Clair in Texas Federal Court for breaching X TOS by threatening NY lawsuit over Grok deepfakes. X is asking for an injunction to force Texas venue + $75k in damages. [image]