Amazon VP of EC2 Dave Brown says AWS is considering using AMD's new MI300 chips and declined to use Nvidia's DGX Cloud chips; Oracle is Nvidia's first partner
Amazon Web Services (AMZN.O), the world's largest cloud computing provider, is considering using new artificial intelligence chips …
Reuters · Stephen Nellis
Related Coverage
- AMD's New Chip Fails to Impress Investors. It's Still an AI Winner, Analyst Says. Barron's Online · Callum Keown
- AMD reveals new AI chip challenging Nvidia's dominance Cointelegraph · Amaka Nwaokocha
- With no big customers named, AMD's AI chip challenge to Nvidia remains uphill fight Reuters
- Advanced Micro Devices has outlined its new AI chips as it beefs up fight with Nvidia Silicon Valley Business Journal · Max A. Cherney
- Cloud Division Of Amazon is Investing in AMD's New AI Chips Analytics Insight
- AMD Gives Peek at Upcoming Line of AI Processors in Challenge to Rival Nvidia Bloomberg · Ian King
- A closer look at AMD Zen 4c-powered Bergamo Epyc Club386 · Tarinder Sandhu
- AMD Expands 4th Gen EPYC CPU Portfolio with Leadership Processors for Cloud Native and Technical Computing Workloads Advanced Micro Devices, Inc.
- AMD EPYC Bergamo CPUs With 128 Zen 4C Cores Guru3D.com · Hilbert Hagedoorn
- AMD Unleashes EPYC Bergamo And Genoa-X Data Center CPUs, AI-Ready Instinct MI300X GPUs HotHardware · Chris Goetting
- AMD Cloud-Native EPYC Bergamo, HPC Optimized Genoa-X CPUs And MI300 AI Accelerators Take Flight Forbes · Dave Altavilla
- AMD Details EPYC Bergamo CPUs With 128 Zen 4C Cores, Available Now Tom's Hardware · Paul Alcorn
- AMD Intros EPYC 97x4 “Bergamo” CPUs: 128 Zen 4c CPU Cores For Servers, Shipping Now AnandTech · Ryan Smith
- AMD announces the Ryzen Pro 7000 for desktop and Ryzen Pro 7040HS and 7040U for mobile, adding an integrated Ryzen AI block on select models and Zen 4 cores AnandTech · Gavin Bonshor
- The Third Time Charm Of AMD's Instinct GPU The Next Platform · Timothy Prickett Morgan
- AMD Expands Leadership Data Center Portfolio with New EPYC CPUs and Shares Details on Next-Generation AMD Instinct Accelerator and Software Enablement for Generative AI Advanced Micro Devices, Inc.
- AMD takes on Nvidia with MI300X AI GPU - could it land AWS as a client? Tech Monitor · Matthew Gooding
- AMD says Meta is using its cloud chip as it rolls out AI strategy update Reuters
- AMD Instinct MI300X accelerator for AI revealed, combines CPU, GPU, and 192 GB of HBM3 memory TweakTown · Kosta Andreadis
- AMD launches new Instinct MI300X AI chip to take on Nvidia's GPUs SiliconANGLE · Mike Wheatley
- AMD Eyes AI, Cloud Expansion With Instinct MI300X, EPYC 97X4 Chips CRN · Dylan Martin
- AMD unveils MI300x AI chip as ‘generative AI accelerator’ ZDNet · Tiernan Ray
- AMD Expands MI300 With GPU-Only Model, Eight-GPU Platform with 1.5TB of HBM3 Tom's Hardware · Paul Alcorn
- AMD makes its case for generative AI workloads vs. Nvidia Constellation Research · Larry Dignan
- AMD Instinct MI300 is THE Chance to Chip into NVIDIA AI Share ServeTheHome · Patrick Kennedy
- AMD Expands AI Product Lineup with GPU-Only Instinct MI300X with 192GB Memory Hacker News
Discussion
- r/AMD_Stock on Reddit: Exclusive: Amazon's cloud unit is considering AMD's new AI chips
- @amd on X: Do you use Instagram, Facebook or WhatsApp? @Meta VP @AlexisBjorlin shared how all your favorite apps are powered by hundreds of thousands of AMD servers, with the next generation of #EPYC beginning deployment this year. So go on and share that meme. #EPYC has you covered. [image…
- @dylanonchips on X: Here it is: fourth-gen EPYC Bergamo for cloud-native applications. Up to 128 Zen 4c cores, each of which are 35% smaller than Zen 4 cores. Aimed at leadership performance-per-watt versus the performance-per-core focus of general-purpose EPYC Genoa. [image]
- @iancutress on X: I'm the only one that has brought their Genoa delidded CPU with them. Oh well :) Here's @AMD #Bergamo. Left: 96 core (12*8) Genoa Right: 128-core (16*8) Bergamo. Same IO die. Same uArch, resized L3. Same power. Same socket. Same DRAM Support. Higher Efficiency. [image]
- Bob O'Donnell (@bobodtech) on X: Interesting discussion with @AMD's @LisaSu and @Meta regarding their planned use of Bergamo for #GenerativeAI apps and early tests showing a 2.5x improvement over Genoa. [image]
- @dylanonchips on X: AMD revealed the GPU-only Instinct MI300X, which has 192GB of HBM3 memory, beating the 80GB capacity of Nvidia's H100. This will allow users to run large language models on fewer GPUs, giving the MI300X leadership total cost of ownership, Su said. [image]
- @dylanonchips on X: Brad McCredie, AMD's vice president of data center and accelerated processing, explains the building blocks of the Instinct MI300X GPU in response to a question from @CDemerjian. [video]
- Sasa Marinkovic (@sasamarinkovic) on X: MI300X - a powerhouse for processing the largest and most complex of LLMs * 192GB * https://www.anandtech.com/...
- Ryan Smith (@ryansmithat) on X: While the chiplet-based design of the MI300 always left the door open to further CPU/GPU mix & matching, I wasn't sure if AMD was going to do it. But sure enough, they are, with a GPU-only MI300X. The 8 HBM stacks is going to be a big boon for the LLM/GenAI market they're after h…
- @phatal187 on X: What are the odds Nvidia will be pushing out a 144GB H100 in Q4? 🤔 https://twitter.com/...
- @anandtech on X: And finally, AMD will be producing a super-sized, GPU-only version of the MI300 accelerator, the MI300X. Using 24GB HBM3 stacks and 8 CDNA 3 GPU chiplets, the flagship chip will offer 192GB of local memory, ideal for AI/LLM training and other AI workloads https://www.anandtech.co…
- r/LocalLLaMA on Reddit: AMD Expands AI/HPC Product Lineup With Flagship GPU-Only Instinct MI300X with 192GB Memory