On top of nonconsensual porn images, X users seem to be using Grok to alter images to depict real women being sexually abused, humiliated, hurt, and even killed

Earlier this week, a troubling trend emerged on X-formerly-Twitter as people started asking Elon Musk's chatbot Grok to unclothe images of real people.

Maggie Harrison Dupré / Futurism

Discussion

  • @snig Snigdha on bluesky
    Despite X's AI Grok admitting and apologising for creating sexualised images of children, xAI, X/Twitter, and Elon Musk have said nothing.  Do they hope if they ignore it, everyone will forget?  —  arstechnica.com/tech-policy/ ...
  • @caseynewton Casey Newton on bluesky
    Elon should go back to the MechaHitler version of Grok.  It was safer! [embedded post]
  • @stevepeers Steve Peers on bluesky
    It gets *worse* than the nude deepfakes.  —  The UK and EU have said nothing.  [embedded post]
  • @charlotte2153 Charlotte Nichols MP on bluesky
    There is, to my mind, no justification for the continued use by the UK Government of X as a platform for official comms.  There hasn't been for some time, in fact, but if the latest developments around AI-generated image abuse and CSAM don't change the policy I really don't know …
  • @mharrisondupre Maggie Harrison Dupré on bluesky
    A lot of this was directed at online models and sex workers, which is deeply troubling as sex workers already face a disproportionately high risk of violence and homicide.  —  Several users expressly asked Grok to make women “look scared” in the sexualized images.
  • @kincsobiro Kincso Biro on bluesky
    “... people started asking Elon Musk's chatbot Grok to unclothe images of real people.  This resulted in a wave of nonconsensual pornographic images flooding the largely unmoderated social media site, with some of the sexualized images even depicting minors.”  —  futurism.com/fut…