Women in rural India report experiencing trauma from data annotation work, which requires them to review violent content and pornography for global tech companies
The Guardian · Anuj Behal
Related Coverage
- India's female workers watching hours of abusive content to train AI Hacker News
- What kind of work is powering AI today? — This recent article from The Guardian exposes an invisible layer of the AI supply chain. … Denise Pang
- As always, it is the marginalized, particularly poor women, who bear the brunt of these issues. I can't help but wonder if investors in AI companies … Sasja Beslik
- Traumatic stress from ‘data annotation’ and content moderation is a cost that is not often discussed. … Maree Crabbe
- “There may be moderators who escape psychological harm, but I've yet to see evidence of that” — https://www.theguardian.com/ ... @acdha@code4lib.social · Chris Adams
- In case you were unaware that GenAI software developers are also exploiting and abusing workers in the global South to support their plausible word-mulch vending machines. — There is no explicit detail in this article, but you may find the implications of the experiences described distressing. … @JulietEMcKenna@wandering.shop · Juliet E McKenna
Discussion
- r/india on reddit: ‘In the end, you feel blank’: India's female workers watching hours of abusive content to train AI
- r/LudditeRenaissance on reddit: ‘In the end, you feel blank’: India's female workers watching hours of abusive content to train AI
- r/technology on reddit: ‘In the end, you feel blank’: India's female workers watching hours of abusive content to train AI
- Stephan Okhuijsen (@okhuijsen.nl) on bluesky: It's time we make it mandatory for an AI to have a statement like “No people were (mentally) harmed during the creation of this AI/LLM”. — Or the other way around: — “Warning, the creation of this AI caused serious harm to 40.000 people you will never see...”. — www.theguar…
- @quendergeer on bluesky: every time i'm tempted to share AI content, even to mock it, I'm reminded of the human cost of the content moderation it's built on www.theguardian.com/global-devel...
- @audhd-psychnp.com on bluesky: “I had never imagined this would be part of the job,” she says. The material was graphic and relentless. When she raised concerns with her manager, she recalls being told: “This is God's work - you're keeping children safe.” — We don't need to be doing this. Vicarious trauma…
- Matthew Cobb (@matthewcobb) on bluesky: I'm furious at AI because it steals our work, consumes vast resources and produces mediocrity, but along the way it also traumatises the people - mainly in the developing world - who are paid a pittance to train it. Theft, abuse, environmental destruction, all to make rich men r…
- Cynthia Brumfield (@metacurity.com) on bluesky: Watching the worst that humanity has to offer all day long has become the work of female workers in developing nations, and AI is only accelerating the trauma. — “The first few months, I couldn't sleep,” she says. “I would close my eyes and still see the screen loading.”
- r/BetterOffline on reddit: ‘In the end, you feel blank’: India's female workers watching hours of abusive content to train AI