
Chronicles

The story behind the story


Meta unveils four new chips, the MTIA 300, MTIA 400, MTIA 450, and MTIA 500, set to launch by the end of 2027; the MTIA 300 is in production for content ranking

Meta Platforms Inc. plans to deploy four new generations of its in-house artificial intelligence chips by the end of 2027 …

Bloomberg

Discussion

  • @laurengoode Lauren Goode on x
    Meta is developing four new chips, part of its MTIA family of AI accelerators, and while they're still going to be used for ranking and recommendations within Meta apps, Meta says it has its eyes set on—you guessed it—inference https://www.wired.com/...
  • @theaustinlyons Austin Lyons on x
    Huge silicon roadmap announcement from $META. MTIA 300, 400, 450, 500. All optimized for inference. MTIA 300 for recommendations (money printer). MTIA 450, 500 for GenAI inference. Meta and Google have the cleanest ROIC story in custom silicon IMO. MTIA team made good [image]
  • @rihardjarc Rihard Jarc on x
    $META just dropped their custom ASIC MTIA roadmap (MTIA 300, 400, 450, and 500) One thing that clearly stands out, there is a lot of HBM in that roadmap. The number of different companies bidding for HBM capacity now means the 3 HBM providers have even more negotiating power. I […
  • @aiatmeta @aiatmeta on x
    Custom silicon is critical to scaling next-gen AI. We're detailing the evolution of the Meta Training and Inference Accelerator (MTIA), our homegrown silicon family designed to power the next era of AI experiences. Traditional chip cycles span years, but model architectures [imag…
  • @stocksavvyshay Shay Boloor on x
    $META plans to deploy four new generations of MTIA chips over the next two years showing it sees custom silicon as critical for scaling ranking, recommendations & GenAI workloads more efficiently. What stands out is that Meta is building for both ad-driven ranking & GenAI [image]
  • @danielnewmanuv Daniel Newman on x
    Remembering back to last week when people said $META was canceling its AI chip program. Glad some of us kept it real. 😬🙃
  • @metanewsroom @metanewsroom on x
    Our MTIA silicon remains central to our AI infrastructure strategy, with four new generations of MTIA chips forthcoming in the next two years to support ranking and recommendations on our apps, along with Gen AI workloads https://about.fb.com/...
  • @laurengoode Lauren Goode on x
    The most noteworthy thing about these chips might be the cadence with which Meta claims it's developing and shipping them. Co says models are advancing so fast that traditional chip production cycles won't cut it. It's giving chip engineers on adderall https://www.wired.com/...