A look at “slopsquatting”, a supply chain attack in which threat actors publish malicious packages to package indexes under AI-hallucinated names resembling popular libraries
arXiv:
We Have a Package for You! A Comprehensive Analysis of Package Hallucinations by Code Generating LLMs

Bluesky:
Laurie Voss / @seldo.com: Every time I read about a novel supply chain attack on npm, which is like every week, I thank the stars it's not my job anymore. www.bleepingcomputer.com/news/securit...
@spavel: Can vibe coding increase velocity? Yes, in fact it speeds up the entire product lifecycle, from first deployment to embarrassing hack that steals all your customers' money and leads to the company being shut down for gross negligence.
Ally Tibbitt / @allytibbitt.me: “What a world we live in: AI-hallucinated packages are validated and rubber-stamped by another AI that is too eager to be helpful.” www.theregister.com/AMP/2025/04/...
Andrey Sitnik / @en.sitnik.ru: Attacks on vibe coding have begun. LLMs sometimes hallucinate and install non-existent packages. As a result, attackers have started publishing malicious packages under these hallucinated names, which frequently appear in AI-generated suggestions. socket.dev/blog/slopsqu...
Seth Michael Larson / @sethmlarson.dev: Do I have to add “coined slopsquatting” to my resume now? 🤣

Forums:
r/technology: LLMs can't stop making up software dependencies and sabotaging everything
r/netsec: We Have a Package for You! A Comprehensive Analysis of Package Hallucinations by Code Generating LLMs
r/programming: AI code suggestions sabotage software supply chain
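The defense the linked posts imply can be sketched in a few lines: before installing dependencies suggested by an LLM, verify each name against a vetted source rather than trusting the suggestion. This is a minimal illustration, not code from any of the linked articles; the allowlist contents and the hallucinated-looking name `request-utils-pro` are hypothetical, and in practice the unvetted names would be checked against the package index and its publish history, not a hardcoded set.

```python
# Hypothetical allowlist standing in for a vetted lockfile or internal mirror;
# real tooling would consult the package index (e.g. PyPI) and audit results.
KNOWN_GOOD = {"requests", "numpy", "flask"}

def partition_suggestions(suggested: list[str]) -> tuple[list[str], list[str]]:
    """Split LLM-suggested package names into vetted and unvetted groups."""
    vetted = [name for name in suggested if name.lower() in KNOWN_GOOD]
    unvetted = [name for name in suggested if name.lower() not in KNOWN_GOOD]
    return vetted, unvetted

# An LLM suggestion may mix real packages with plausible-sounding hallucinations:
vetted, unvetted = partition_suggestions(["requests", "request-utils-pro", "numpy"])
# Anything in `unvetted` should be investigated before any `pip install`,
# since a slopsquatter may already have registered that exact name.
```

The point of the split is that hallucinated names are not random strings: they look like real libraries, so a human eyeballing `pip install` output will not catch them, but a mechanical check against a known-good source will.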