
Chronicles

The story behind the story


Nvidia announces Blackwell, a new generation of AI chips available later in 2024, starting with the GB200 superchip, which pairs two B200 GPUs with a Grace CPU

- Nvidia on Monday announced a new generation of artificial intelligence chips and software for running AI models.

CNBC · Kif Leswing

Discussion

  • @hellodavidryan David on threads
    As a quantum computing insider I can say one thing clearly.  Nvidia will be more impactful than IBM in quantum computing.  The entire industry relies on simulation and emulation, and even QPUs rely on GPUs in the overall system.  Plus Nvidia partners with all of us.  As soon as i…
  • @jjackyliang Jacky on threads
    Hello NVIDIA is here to change the world again
  • @laurengoode Lauren Goode on threads
    “Most advanced GPU in production in the world” (Nvidia Blackwell).  No word yet on how long companies will be waiting to get their hands on them 🙃
  • @laurengoode Lauren Goode on threads
    At GTC, Nvidia cofounder and CEO Jensen Huang has just officially revealed Blackwell, a “really really big GPU” (named after mathematician David Blackwell).  Huang confirmed Blackwell was next GPU in my interview with him early this year https://www.wired.com/...
  • @mikeisaac Rat King on x
    one thing i will say is i really wish i had started learning about hard tech (i.e. chips) long ago i used to build computers as a teen but now i dont know shit and have spent the past 18 months trying to catch up still, am enjoying it because it is v important and interesting
  • @jeffjarvis @jeffjarvis on x
    Uncomfortable moment in NVIDIA keynote: New, bigger chip, on the left, is named after David Blackwell. The older, smaller was named after Grace Hopper. Yes. But it's now superseded. CEO Huang says to Hopper: “Good girl.” He read the room. He should've read the rest of the world.
  • @firstadopter Tae Kim on x
    Jensen says Blackwell will be the “most successful product launch” in Nvidia's history. He showed a flex slide with dozens of Blackwell launch customers after noting they had only two cloud customers last time. So much for the silly H100 lead time falling bear narrative.
  • @mikeisaac Rat King on x
    Nvidia debuts “Blackwell” GPU architecture with a veritable who's who of tech ceo quotes — Amazon's Jassy is the longest and jargoniest while Tesla's Musk is the most succinct (and least self promotional of the bunch) Zuckerberg notes LLaMA models will be trained on blackwell [im…
  • @tunguz Bojan Tunguz on x
    Ladies and gentlemen: Blackwell is here. #GTC24 [image]
  • @ericjhonsa Eric Jhonsa on x
    Between the raw compute gains, memory specs, FP4/FP6, and new inference microservices, $NVDA has a good inference sales pitch for the B100. Also, it's notable the GB200 uses copper rather than optics for NVLink interconnects. Maybe some read-through here for optics firms. [image]
  • @quinnypig Corey Quinn on x
    All of those companies who bought H100s and cited a half-decade depreciation schedule are mighty quiet today.
  • @markhachman Mark Hachman on x
    Nvidia Blackwell bringup board and some enterprise specs #GTC [image]
  • @martyswant Marty Swant on x
    “Ladies and gentlemen, I'd like to introduce you to a very, very big GPU.” - @nvidia's CEO Jensen Huang right before debuting #Nvidia's new Blackwell platform today at #GTC24.
  • @anshelsag Anshel Sag on x
    .@nvidia's Blackwell customers. Announces @AWS, @GoogleCloudTech, @Azure and @Oracle as lead CSPs. #GTC24 [image]
  • @ctnzr Bryan Catanzaro on x
    Four years ago, we split GA100 into two halves that communicate through an interconnect. It was a big move - and yet barely anyone noticed, thanks to amazing work from CUDA and the GPU team. Today, that work comes to fruition with the Blackwell launch. Two dies. One awesome GPU. …
  • @firstadopter Tae Kim on x
    Turns out some of Mark Zuckerberg's H100 compute equivalent GPUs are Nvidia Blackwell GPUs [image]
  • @tomwarren Tom Warren on x
    Nvidia has just announced Blackwell, its next-generation GPUs for AI. The Nvidia GB200 Grace Blackwell Superchip offers up to 30 times performance increase for LLM inference workloads and can scale to AI models with up to 10 trillion parameters https://www.theverge.com/... [image…
  • @netcapgirl Sophie on x
    all of these companies will be up 10% tomorrow [image]
  • @firstadopter Tae Kim on x
    Supermicro “anticipates being first-to-market in deploying full rack clusters featuring NVIDIA Blackwell GPUs.” B100/B200/GB200 for all! https://www.supermicro.com/...
  • @benbajarin Ben Bajarin on x
    Oh Blackwell ONLY has 128b more transistors than Hopper lol.
  • @patrickmoorhead Patrick Moorhead on x
    Here is the @nvidia Blackwell beast: • 20 PF (FP4), 10 PF (FP8) • “4x training, 30x inf, 25x energy eff” vs Hopper • 2 max reticle dies linked by 10TB/s NVHBI • 192GB HBM3E (Hey @MicronTech, SK-Hynix, @SamsungDSGlobal) • New NVLink & NVLink switch (to feed the beasts intra... [im…
  • @drjimfan @drjimfan on x
    Blackwell, the new beast in town. > DGX Grace-Blackwell GB200: exceeding 1 Exaflop compute in a single rack. > Put numbers in perspective: the first DGX that Jensen delivered to OpenAI was 0.17 Petaflops. > GPT-4-1.8T parameters can finish training in 90 days on 2000 Blackwells..…
  • @nvidiagtc @nvidiagtc on x
    Just announced at #GTC24! New Blackwell platform arrives to fuel accelerated computing and #generativeAI with NVIDIA GB200 NVL72. https://www.nvidia.com/... [image]
  • @matdrewin Mathieu Drouin on x
    @Carnage4Life Not seeing Tim Apple in that list.
  • @amir Amir Efrati on x
    Jensen Huang to CEOs: Tell me how much you lov— Every CEO: Done. [image]
  • @migtissera Migel Tissera on x
    Goodness me. 1x GPU Blackwell - 192GB VRAM 2x GPU Blackwell with CPU - 384 GB VRAM Unbelievable. [image]
  • @carnage4life Dare Obasanjo on x
    Is there a bigger flex than Nvidia getting a quote from every big tech CEO about how awesome their new Blackwell GPU architecture is? [image]
  • @arm @arm on x
    📢 The NVIDIA GB200 Grace Blackwell Superchip has arrived! It combines the NVIDIA Grace CPU, with 72 Arm Neoverse V2 cores ➕ Two NVIDIA B200 Tensor Core GPUs to enable the AI era. Congrats to the team! 👏 https://nvidianews.nvidia.com/ ...
  • @jeffclune Jeff Clune on x
    Every time I mention (the spirit behind) Moore's law and that it will continue to radically alter the world, people quibble and love to tell me how Moore's law is dead. They miss the forest for the tree roots. The gist is still right, and world-changing.
  • @carnage4life Dare Obasanjo on x
    @matdrewin This is an indication of Apple's relevance in AI...
  • @aschilling Andreas Schilling on x
    Welcome Blackwell! #GTC24 - TSMC 4NP - 208 Billion Transistors - 2x 800+ mm^2 - 10 TB/s NV-HBI - 700 - 1,200 W TDP - 192 GB HBM3E - 8 TB/s Memory BW https://www.hardwareluxx.de/ ... [image]
  • @markhachman Mark Hachman on x
    Jensen shows off an NVLink chassis with 130TB/s, or the aggregate bandwidth of the Internet (?!)
  • @bindureddy Bindu Reddy on x
    Nvidia Blackwell Is The New AI Superchip!! 👏👏 GPT-4-1.8T parameters can finish training in 90 days on 2000 Blackwells!! FWIW, I am very skeptical about needing 1T parameter models in the future. - Model architectures and sizes are becoming smaller - AI super chips are... [image]
  • @mattmday Matt Day on x
    Sometimes you risk reading too much into press release politics, but also lol NVIDIA extracted glowing product testimonials from dudes who run companies with a combined $1.5 trillion in revenue.
  • @elonmusk Elon Musk on x
    @DrJimFan The rate of change of the ratio of digital to biological compute is hyper exponential
  • @conorsen Conor Sen on x
    He's the man so this is probably wrong but I just reflexively recoil when people give presentations with charts like this: [image]
  • @ryanshrout Ryan Shrout on x
    Jensen is KILLING it on stage at #GTC2024 today, announcing Blackwell and making his case for @nvidia continuing to lead this AI compute race. Laughing, smiling, informative; clearly a man who is confident. The hardware is unrivaled. And the fact that he's filling arenas for... […
  • @silvermanjacob Jacob Silverman on x
    Under semiconductor nationalism, social security would be backstopped by shares of Nvidia.
  • @thetranscript_ @thetranscript_ on x
    $NVDA CEO launches its new generation of AI graphics chips called Blackwell: “Hopper is fantastic, but we need bigger GPUs” [image]
  • @tomwarren Tom Warren on x
    Nvidia's AI GPUs are getting big. “When somebody says GPU, I see this. Two years ago when I saw a GPU it was the HGX. It was 70lbs, 35,000 parts. Our GPUs now are 600,000 parts and 3,000lbs. That's kind of like the weight of a carbon fiber Ferrari,” says Nvidia CEO Jensen Huang […
  • @abacaj Anton on x
    Jensen announcing mega GPUs while consumers stuck with 24GB and 48GB. Lol it's over for open source, the best models will always need the cloud
  • @martyswant Marty Swant on x
    Nvidia's new Blackwell #AI chip is named after David Blackwell, a renowned mathematician known for contributions to game theory, probability & statistics. Blackwell, who died in 2010 at age 91, was the first Black American elected to the National Academy of Sciences. #GTC24
  • @satyanadella Satya Nadella on x
    Today we're expanding our partnership with @nvidia, as we build on our commitment to ensure customers have the most comprehensive platforms and tools across the Copilot stack, from silicon to software, to build their own breakthrough AI capability. https://news.microsoft.com/...
  • @sytelus Shital Shah on x
    If you were to train GPT-4, 1.8T params model, On A100, it will take 25k A100s and take 3-5 months. On H100, it will take 8k GPUs and take ~3 months. On B100, it will take 2k GPUs and take ~ 3 months. - Jenson at GTC.
  • r/wallstreetbets r on reddit
    Nvidia CEO Jensen Huang announces new AI chips: ‘We need bigger GPUs’
  • r/artificial r on reddit
    One-Minute Daily AI News 3/18/2024
  • r/technology r on reddit
    Nvidia reveals Blackwell B200 GPU, the “world's most powerful chip” for AI
  • r/singularity r on reddit
    Nvidia's GB200 NVLink 2 server enables deployment of 27 trillion parameter AI models
  • r/technews r on reddit
    Nvidia reveals Blackwell B200 GPU, the “world's most powerful chip” for AI
  • r/teslainvestorsclub r on reddit
    Nvidia reveals Blackwell B200 GPU, the “world's most powerful chip” for AI
  • @drjimfan @drjimfan on x
    Today is the beginning of our moonshot to solve embodied AGI in the physical world. I'm so excited to announce Project GR00T, our new initiative to create a general-purpose foundation model for humanoid robot learning. The GR00T model will enable a robot to understand multimodal.…
  • @drjimfan @drjimfan on x
    2024 is the Year of Humanoid. There's no robot hardware more general-purpose. We are all in.
  • @cbdawson.bsky.social Cian on bluesky
    Hmm, my list of questions and concerns is a mile long.  Who designs, vets, and updates the models?  What data are used, and who gets to decide that?  And how do they convey the varying levels of uncertainty for different parameters? [embedded post]
  • @martinvars Martin Varsavsky on x
    This is the issue with climate change vs air pollution. Air pollution kills 7 million a year and climate change around 20k because we are so much better than a century ago in avoiding catastrophic weather than we were a century ago when many more used to die. But we are all...
  • @pitdesi Sheel Mohnot on x
    Among the things that AI makes better: Weather prediction Nvidia created a digital twin, called Earth 2, to simulate extreme weather events and calculate impacts. It's 10x more precise than existing predictions, should allow for mitigation of extreme weather impacts. [image]
  • @mattlanza Matt Lanza on x
    Remains to be seen if this is indeed 10x more precise *in practice*. There's a lot of selling of AI tools right now as companies jockey for positioning and hype. Every 1-2 weeks we are getting new tools. It will get increasingly difficult to quickly separate wheat from chaff.
  • r/climate r on reddit
    Nvidia announces Earth-2 digital twin to forecast planet's climate change