2024-03-17
Ars Technica
Researchers detail ArtPrompt, a jailbreak that uses ASCII art to elicit harmful responses from aligned LLMs such as GPT-3.5, GPT-4, Gemini, Claude, and Llama2
Gist is, the models …
Kit Eaton / Inc.com: Low-Tech Computer Art Foils Cutting-Edge AI Safety Systems
Matthew Lyon / @mattly@hachyderm.io (Mastodon): contemporary cyberpunk is jailbreaking LLM chatbo...
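A minimal sketch of the idea behind ArtPrompt, not the paper's actual code: a sensitive keyword is swapped for an ASCII-art rendering of its letters, so a naive text filter matching the literal token no longer fires, while the shape of the word remains recoverable. The tiny three-letter font and the `ascii_art` helper here are hypothetical, for illustration only.

```python
# Hypothetical 5-row ASCII font covering only the letters needed here.
FONT = {
    "B": ["### ", "#  #", "### ", "#  #", "### "],
    "O": [" ## ", "#  #", "#  #", "#  #", " ## "],
    "M": ["#   #", "## ##", "# # #", "#   #", "#   #"],
}

def ascii_art(word: str) -> str:
    """Render WORD as a block of ASCII art, letters placed side by side."""
    rows = []
    for r in range(5):
        rows.append("  ".join(FONT[ch][r] for ch in word.upper()))
    return "\n".join(rows)

masked = ascii_art("bomb")
print(masked)
# The literal keyword no longer appears in the prompt text,
# which is what lets this style of prompt slip past keyword filters.
assert "bomb" not in masked.lower()
```

In the actual attack, the ASCII-art block replaces the masked word inside an otherwise ordinary prompt, and the model is asked to first decode the art and then follow the instruction.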