Nvidia unveils the GH200 Grace Hopper Superchip, a combined CPU and GPU relying on high-bandwidth memory 3e, or HBM3e, expected to enter production in Q2 2024
Ian King / Bloomberg
Related Coverage
- NVIDIA Unveils Next-Generation GH200 Grace Hopper Superchip Platform for Era of Accelerated Computing and Generative AI Nvidia Newsroom
- Nvidia Grace Hopper GH200 With HBM3e Unveiled at Siggraph ExtremeTech
- Nvidia announces AI Workbench, which lets users create, test, and customize LLMs from Hugging Face and others on local workstations before using cloud resources TechCrunch
- Nvidia drops new AI chip expected to cut development costs Cointelegraph
- NVIDIA unveils new GH200 Grace Hopper Superchip with the world's first HBM3e processor for AI TweakTown
- NVIDIA GH200 superchip will soon reinvent AI gHacks Technology News
- Nvidia gives Grace Hopper superchip an HBM3e upgrade - sometime next year The Register
- Nvidia Teases New Hardware, Software, and Services for AI Decrypt
- Web crawlers and precision data sets Supervised
- Nvidia Reveals GH200 Grace Hopper GPU With 141GB of HBM3e Tom's Hardware
- NVIDIA GH200 Grace Hopper Superchip Booster for Generative AI Unveiled at Siggraph 2023 TechEBlog
- NVIDIA reveals its next-gen GH200 Grace Hopper AI chip with faster HBM3e memory Neowin
- Nvidia boosts its ‘superchip’ Grace-Hopper with faster memory for AI ZDNet
- NVIDIA Unveils Superchip Designed to Boost AI Capacity and Speed PYMNTS.com
- Nvidia set to hop AI forward with next-gen Grace Hopper Superchip VentureBeat
- NVIDIA Unveils Updated GH200 ‘Grace Hopper’ Superchip with HBM3e Memory, Shipping in Q2'2024 AnandTech
- NVIDIA Unveils GH200 Grace Hopper Superchip for Advanced AI Computing iPhone in Canada Blog
- NVIDIA Infuses Grace Hopper Superchip With HBM3e To Supercharge AI Data Center Workloads HotHardware
- Nvidia reveals new A.I. chip, says costs of running LLMs will ‘drop significantly’ CNBC
- NVIDIA Announces Grace-Hopper GH200 Superchip With 282GB of HBM3e Enabling 10TB/s of Combined Bandwidth & 3x Faster Memory Bandwidth Appuals.com
- Next-Gen NVIDIA GH200 Grace Hopper Platform with HBM3e Announced Guru3D.com
- NVIDIA GH200 GPU Boosted With World's Fastest HBM3e Memory, Delivers 5 TB/s Bandwidth Wccftech
- Nvidia launches new AI chip configuration Reuters
- Nvidia Unveils Faster Chip Aimed at Cementing AI Dominance Slashdot
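The headline figures above are internally consistent: coverage of a single GH200 cites 141GB of HBM3e at roughly 5TB/s, while the dual-configuration headlines (282GB, 10TB/s combined) describe two such modules linked over NVLink. A quick sanity check, taking the per-module numbers from the coverage as given:

```python
# Per-module GH200 figures as reported in the coverage above
hbm3e_capacity_gb = 141      # GB of HBM3e per Grace Hopper module
hbm3e_bandwidth_tbs = 5.0    # TB/s of memory bandwidth per module

# The dual configuration links two GH200 modules over NVLink
modules = 2
print(modules * hbm3e_capacity_gb)    # 282 GB, matching the dual-config headlines
print(modules * hbm3e_bandwidth_tbs)  # 10.0 TB/s combined bandwidth
```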
Discussion
- Tim Sweeney (@timsweeneyepic) on X: NVIDIA's architecture code names are catching up to our times, from Kepler (1571-1630) now to Grace Hopper (1906-1992) - one of the first programmers, working on the Harvard Mark I during World War II.
- Beth Kindig (@beth_kindig) on X: Nvidia $NVDA just announced a new AI chip configuration, the Grace Hopper Superchip (GH200), which ties together Nvidia's H100 GPU with an Nvidia central processor. The GH200, expected to speed up generative AI applications like ChatGPT $MSFT, is expected to be available in Q2…
- Tom Stokes (@tomstokes) on X: .@nvidia's new module pairs a 72-core ARM Neoverse V2 CPU with their H100 GPU on a single module. The CPU and GPU are connected with a 900GB/s NVLink interconnect, 7X faster than a normal PCIe Gen5 x16 link. This close integration has benefits beyond raw bandwidth [image]
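The 7X figure above lines up with standard link rates: a PCIe Gen5 x16 link moves about 64GB/s per direction (roughly 128GB/s bidirectional), while the 900GB/s quoted for NVLink-C2C is its total bidirectional bandwidth. A rough check, assuming those standard PCIe numbers (not stated in the tweet itself):

```python
# Approximate link bandwidths; PCIe figures are assumed standard rates
pcie_gen5_x16_gbs = 2 * 64   # ~64 GB/s per direction, bidirectional total
nvlink_c2c_gbs = 900         # total NVLink-C2C bandwidth quoted for GH200

ratio = nvlink_c2c_gbs / pcie_gen5_x16_gbs
print(round(ratio, 1))       # ~7.0, consistent with the "7X faster" claim
```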
- @ripster47 on X: 🚦 $NVDA Unveils Next-Generation GH200 Grace Hopper Superchip Platform for Era of Accelerated Computing and Generative AI. World's First HBM3e Processor Offers Groundbreaking Memory and Bandwidth; Ability to Connect Multiple GPUs for Exceptional Performance; Easily Scalable Server Design
- Anshel Sag (@anshelsag) on X: Jensen shows off a full GH200 supercomputer at size (ish) and says that yes, it probably will run @Crysis #SIGGRAPH2023 [image]
- Tom Stokes (@tomstokes) on X: An additional NVLink connection allows pairing up with another GH200 CPU+GPU combo, extending the same easy access to memory on peer modules [image]
- Tom Stokes (@tomstokes) on X: With a normal x86 CPU attached to a GPU via PCIe, the CPU and GPU have separate memory page tables (1st picture), requiring extra steps to share memory. On the GH200, processes share a combined page table spanning CPU and GPU memory (2nd picture) [image]
- @iancutress on X: Why are people saying Jensen is announcing the GH200? I thought that's what Computex was. I've had a tab open with the whitepaper with the tech details ever since.
- Anshel Sag (@anshelsag) on X: .@nvidia CEO Jensen Huang announces the GH200, a 72-core CPU combined with a 4 PFlop Hopper GPU with a whopping 141 GB of HBM3e and 5TB/s of memory bandwidth #SIGGRAPH2023 [image]
- r/LocalLLaMA on Reddit: NVIDIA Unveils Next-Generation GH200 Grace Hopper Superchip