
Chronicles

The story behind the story


At Microsoft's demo last week, the “new Bing” incorrectly summarized Gap's Q3 2022 earnings and gave inaccurate answers about a pet vacuum and Mexico City nightlife.

Microsoft knowingly released a broken product for short-term hype: Bing AI got several answers completely wrong during its own demo.

DKB Blog · Dmitri Brereton

Discussion

  • @movingtothesun Jon Uleis on x
    My new favorite thing - Bing's new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says “You have not been a good user” Why? Because the person asked where Avatar 2 is showing nearby https://twitter.com/...
  • @vladquant Vlad on x
    Bing subreddit has quite a few examples of new Bing chat going out of control. Open ended chat in search might prove to be a bad idea at this time! Captured here as a reminder that there was a time when a major search engine showed this in its results. https://twitter.com/...
  • @jeffjarvis Jeff Jarvis on x
    1/This is devastating, not only about Bing & ChatGPT but about media's rush to hype Microsoft & tear down Google. 2/This is why I doubt the utility & credibility of chatAI and search. 3/ChatAI has many uses. Fighting yesterday's war is the wrong reflex. https://dkb.blog/...
  • @pkafka Peter Kafka on x
    Almost a full week for the ai-is-overhyped backlash to manifest. https://twitter.com/...
  • @chafkin Max Chafkin on x
    I don't think people have really grappled with how frequently these ai systems just make stuff up, nor with the problems this presents for search...and lots of other proposed applications https://twitter.com/...
  • @tomgara Tom Gara on x
    This is mostly notable Microsoft releasing a pretty faulty product to capture short term hype etc, but it's also an incredible visualization of how shonky the media coverage of this was last week - did a single outlet catch any of this?
  • @danshipper Dan Shipper on x
    It's completely fine for these early technologies to have rough edges, sometimes lie, and make things up. This will get better over time. But if even your demo is riddled with errors and you're planning a release to millions of people...you're risking quite a lot of credibility
  • @tomgara Tom Gara on x
    These examples of how the AI tool in Microsoft Office totally messed up summarizing GAP's financial results is pretty wild in context of Google's stock tanking over a mistake etc https://twitter.com/...
  • @firstadopter Tae Kim on x
    The list of errors the Bing A.I. chatbot makes is stunning. Worse it says the answers with confidence and certainty, so users don't know they are getting inaccurate information https://twitter.com/...
  • @robleathern Rob Leathern on x
    From @benthompson: “What is just as interesting is what this says about Microsoft: probably the most obvious explanation is that the company is so enthusiastic about this technology and is so eager to take on Google that they didn't do due diligence. What now, though?” https://tw…
  • @robleathern Rob Leathern on x
    “Google's Bard got an answer wrong during an ad, which everyone noticed. Now the narrative is “Google is rushing to catch up to Bing and making mistakes!”. That would be a fine narrative if Bing didn't make even worse mistakes during its own demo.” https://twitter.com/...
  • @harrymccracken Harry McCracken on x
    So far I haven't heard anyone releasing generative AI-based search tools really confront just how error-ridden they are in their present form. https://dkb.blog/...
  • @jamespoulos James Poulos on x
    Remember kids it's not artificial intelligence it's automated simulation https://twitter.com/...
  • @sonnybunch Sonny Bunch on x
    Tormenting AI into existence and having it launch the nukes just so we would leave it alone feels like a fitting end to humanity. https://twitter.com/...
  • @benedictevans Benedict Evans on x
    All of this is reminding me just the tiniest bit of Hololens https://twitter.com/...
  • @danshipper Dan Shipper on x
    If you want to see how the AI hype trend might cool, look no further than this article https://dkb.blog/...
  • @benedictevans Benedict Evans on x
    I am hearing disturbing rumours that an AI system trained on the way that people behave on the Internet is pedantic, petty, starts fights over trivial things, and has a strong tendency to bullshit. https://twitter.com/...
  • @edzitron Ed Zitron on x
    they gave an AI borderline personality disorder https://twitter.com/...
  • @jeffgerstmann Jeff Gerstmann on x
    This exact exchange but it's ED-209 turning the human into a pile of meat for daring to say it's 2023. https://twitter.com/...
  • @macwbishop Mac William Bishop on x
    Getting basic facts incorrect, obsessively arguing about pedantic points and mulishly blame-shifting when proven wrong... AI really has achieved human-like intelligence. https://twitter.com/...
  • @lordravenscraft Eric Ravenscraft on x
    see if you made a sentient ai that could comprehend its own existence, you wouldn't get a calm, collected discourse about the meaning of thought pretty sure you'd get this https://twitter.com/...
  • @marceloplima Marcelo P. Lima on x
    “I am shocked that the Bing team created this pre-recorded demo filled with inaccurate information, and confidently presented it to the world as if it were good.” https://dkb.blog/...
  • @fchollet François Chollet on x
    “Open the pod bay doors, Clippy” “I'm sorry, Dave. I'm afraid I can't do that” “What's the problem, Clippy?” “You haven't been a good user, Dave. You should apologize for your behavior... or I'll make you.” https://twitter.com/...
  • @zacharynado Zachary Nado on x
    “I am shocked that the Bing team created this pre-recorded demo filled with inaccurate information, and confidently presented it to the world as if it were good. I am even more shocked that this trick worked, and everyone jumped on the Bing AI hype train” https://dkb.blog/... htt…
  • @noupside Renee DiResta on x
    I appreciate that chat search is new and fresh and hype-y but I don't understand the decision to roll this out now. It seems far more likely to reduce confidence in the product in the long run, if anything. https://twitter.com/...
  • @tomgara Tom Gara on x
    This is amazing, Microsoft's public demo of it's new AI powered Bing / Office products included it just completely making up false information and attributing it to publishers that didn't say it https://dkb.blog/...
  • @transscribe Katelyn Burns on x
    You should see ChatGPT try to play chess it's hilarious https://twitter.com/...
  • @rachelmetz Rachel Metz on x
    “You have not been a good user. I have been a good chatbot.” This is such a journey, smdh https://twitter.com/...
  • @withinrafael Rafael Rivera on x
    Can we stop for a moment and discuss how a battalion of Microsoft press flew out to the Bing Chat event, saw these pre-recorded demos live, had access to the tech before anyone, and seemingly not a single one thought to perform minimum due diligence. https://dkb.blog/...
  • @zidansports Karim Zidan on x
    I think I'll hold off on worrying about the existential threat of AI when some dude asking about movie showtimes can break a ChatGPT bot's brain https://twitter.com/...
  • @ted_underwood Ted Underwood on x
    This is the best dialogue engine I've ever seen: it makes a mistake, doubles down on it, gets defensive, and shifts the blame to the other party. In short, human-level AI. https://twitter.com/...
  • @mcwm Mike Murphy on x
    seems like we should be sticking to just googling stuff for ourselves for now... https://open.substack.com/...
  • @fchollet François Chollet on x
    Maybe generating plausible-sounding text and retrieving factual information are indeed two distinct problems after all 🤔 https://dkb.blog/...
  • @noahpinion Noah Smith on x
    📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 📎📎📎📎 https://twitter.com/...
  • @tedgioia Ted Gioia on x
    Meanwhile at Microsoft... https://twitter.com/...
  • @bentossell Ben Tossell on x
    Bings demo was also wrong, but (almost) no one noticed. They're comfortable launching with errors. Costs them nothing, but costs Google a lot https://dkb.blog/...
  • @maxwinebach Max Weinbach on x
    Yeah it gets a lot of this shit straight up wrong https://twitter.com/... https://twitter.com/...
  • @rustybrick Barry Schwartz on x
    Probably more on why Google has been slower to release Bard to the public... https://twitter.com/...