
Chronicles

The story behind the story


OpenAI strikes a multibillion-dollar, three-year deal to buy 750 MW of computing capacity from Cerebras, which Sam Altman has backed; sources: it's a $10B+ deal

The ChatGPT-maker is racing to secure more computing power, especially for responding to user queries

Wall Street Journal

Discussion

  • @cerebras @cerebras on x
    OpenAI🤝Cerebras https://openai.com/... [image]
  • @ericvishria Eric Vishria on x
    .@benchmark co-led the initial round for Cerebras 10 years ago. Over the following 5 years, the team amazed, delivering the technological marvel of a wafer-scale chip, the system to heat and cool it, and more recently the software layer that allows giant fleets of Cerebras…
  • @benbajarin Ben Bajarin on x
    This is something. Huge win for @cerebras. Also a firm indication that we don't have enough compute, and filling that gap can come from many places! https://openai.com/...
  • @kakashiii111 @kakashiii111 on x
    Coincidence? [image]
  • @ericvishria Eric Vishria on x
    In tech, faster ultimately wins. And nothing is faster than Cerebras. Was only a matter of time... 10 years in this case 😂 https://openai.com/...
  • @draecomino James Wang on x
    ChatGPT is about to run a whole lot faster
  • @scaling01 @scaling01 on x
    Imagine 500 tokens/s GPT-5.5 [image]
  • @mweinbach Max Weinbach on x
    HUGE for @cerebras. OpenAI will be using them, starting in 2028, for inference capacity Can't wait for 750+ tok/s GPT models https://openai.com/...
  • @theo @theo on x
    Many years ago, OpenAI considered buying Cerebras. Elon was trying to push them to do it through Tesla. [image]
  • @yuchenj_uw Yuchen Jin on x
    OpenAI's multibillion-dollar partnership with Cerebras makes total sense.  Sam made the right call.  Cerebras chips are insanely fast at inference, sometimes 20x Nvidia GPUs, similar to Groq.  My biggest issue with ChatGPT and GPT-5.2 Thinking/Pro is latency.  Cerebras software s…
  • @matthewberman Matthew Berman on x
    Whoa! This is huge. I've always wondered why OpenAI didn't use Groq or Cerebras. They are SO fast. Now we know why Groq was bought by NVIDIA. Everything is moving to specialized chips. Revenue is made at inference. ChatGPT is about to be 100x faster. [image]
  • r/CerebrasSystems on reddit
    OpenAI Forges Multibillion-Dollar Computing Partnership With Cerebras
  • r/hardware on reddit
    OpenAI enters $10 billion partnership with Cerebras