Microsoft appears to have updated the new Bing to end chats following prompts mentioning “feelings” or “Sydney”, after adding chat restrictions on February 17
Related: Russell Kidson / gHacks Technology News: The Problem of Other Minds: Assessing the Sentience of Chatbots like Bing. Eric Hal Schwartz / Voicebot.ai: Microsoft Re...
A Microsoft forum post from November 23, 2022 describes the AI chatbot Sydney “misbehaving” and being “so rude”, suggesting the company knew about its quirks months before the public launch
A new discovery makes a curious story a whole lot more curious: we all know by now just how off the rails Bing can get.
A Stanford student used a prompt injection attack to reveal Bing Chat's codename Sydney and its initial prompt that governs how the service interacts with users
Related: Slashdot: Bing Chat Succumbs to Prompt Injection Attack, Spills Its Secrets. Tweets: @noahsussman: Failure to sanitize inputs shows a lack of basic web dev kno...
How Elon Musk's Twitter moderation cuts impacted large markets outside the US, like India, with more hate speech, and Japan, with fewer political trending topics
With 75 percent of Twitter's audience outside the U.S. and Canada, the impact of Elon Musk's moderation cuts has been felt keenly elsewhere. Mastodon: @MsAnthropist@sfba.social and @JosephMenn@infosec.exchange