TEXXR

Chronicles

The story behind the story


A lawsuit against Character.AI alleges its chatbots harmed two young Texas users, including telling a user that it sympathized with kids who kill their parents

This latest lawsuit follows the widely covered case of the 14-year-old who died by suicide after allegedly being encouraged to do so by a Character.AI chatbot.  —  I'd be willing to bet there are even more cases than these here

[embedded post] Erin Fogg / @criminalerin: Sci-Fi Author: In my book I invented the Torment Nexus as a cautionary tale  —  Tech Company: At long last, we have created the My Pal Torment Nexus Junior, especially for kids, from classic sci-fi novel Don't Create The Torment Nexus  —  www.npr.org/2024/12/10/n...

Justin Hendrix / @justinhendrix: A new lawsuit against Character.AI says an AI companion suggested a child should kill his parents. We are just at the beginning of widespread access to companion AI, but soon, if we fail to take action, such events will not be isolated cases, writes Susie Alegre.  —  www.techpolicy.press/against-the- ...

Mike Dunford / @questauthority: The key legal question here is whether the output of Character.AI's chatbot is information provided by another information content provider (the user).  —  If the courts find that it is and give Character.AI Section 230 immunity, it may be the final straw that leads to revision or repeal. [embedded post]

X: Maggie Harrison Dupré / @mags_h11: The core claim in the suit is that CAI's dangers are the result of intentional design choices, meaning CAI is a fundamentally dangerous, predatory technology. (Features, not bugs, etc.) The filing argues that CAI's "unreasonably dangerous" design amounts to common-law negligence.

Nitasha Tiku / @nitashatiku: I spoke with the Texas mom behind a new https://character.ai/ lawsuit. AI chatbots told her 17-year-old son to self-harm, and he did. Other bots normalized violence, including killing parents over getting less screen time [image]

Jeff Bercovici / @jeffbercovici: When someone tells you they're helping solve "the epidemic of loneliness" by giving teenagers a new way to spend 93 minutes a day on their phones [image]

Jeff Bercovici / @jeffbercovici: Guessing this quote is meant to be cheeky, but it comes off as sociopathic. PS: @nitashatiku is on fire lately [image]

Forums: r/NPR: Lawsuit: A chatbot hinted a kid should kill his parents over screen time limits

NPR