OpenAI says GPT-5.3-Codex-Spark is its first AI model that runs on Cerebras chips, after they signed a $10B+ deal in January; Codex has 1M+ weekly active users
[video] @openaidevs: Introducing GPT-5.3-Codex-Spark, our ultra-fast model purpose-built for real-time coding, serving responses at 1,000 tokens/s. We're rolling it out as a research preview for ChatGPT Pro users in the ...
GPT-5.3-Codex-Spark is OpenAI's first AI model to run on chips from Nvidia rival Cerebras; OpenAI says Codex has more than 1M weekly active users
OpenAI is releasing its first artificial intelligence model that runs on chips from semiconductor startup Cerebras Systems Inc. …
Cognition releases SWE-1.5, a new coding model in Windsurf, saying it partnered with Cerebras to serve SWE-1.5 at speeds up to 13x faster than Claude Sonnet 4.5
lots of new paradigms/UX to figure out. @cognition : Today we're releasing SWE-1.5, our fast agent model. It achieves near-SOTA coding performance while setting a new standard for speed. Now available...
Cerebras announces the $50/month Code Pro and the $200/month Code Max plans, offering users access to Qwen3-Coder at speeds of up to 2,000 tokens per second
Two interesting examples of inference speed as a flagship feature of LLM services today. Bluesky: Tim Kellogg / @timkellogg.me : Cerebras Code — use models hosted on Cerebras with a dev-friendly subsc...
Mistral partnered with Cerebras to help its Le Chat app respond to user questions at 1,000 words per second, making Le Chat the world's fastest AI assistant
Cerebras Systems, an artificial intelligence chip firm backed by UAE tech conglomerate G42, said on Thursday it has partnered …
Cerebras launches the first of three supercomputers for Abu Dhabi-based G42 as part of its Condor Galaxy network; G42 trained an Arabic GPT model in April 2023
The new supercomputer, made by the Silicon Valley start-up Cerebras, was unveiled as the A.I. boom drives demand for chips and computing power.
Cerebras open sources seven GPT-based LLMs, ranging from 111M to 13B parameters and trained using its Andromeda supercomputer for AI, on GitHub and Hugging Face
Artificial intelligence chipmaker Cerebras Systems Inc. today announced it has trained and now released seven GPT-based large language models …
Cerebras announces its Andromeda supercomputer, combining 16 wafer-sized WSE-2 chips to create 13.5M AI-optimized cores, available to customers and researchers
Paul Alcorn / Tom's Hardware :
Cerebras says its “wafer-scale” chip set a record for the largest natural language processing AI model trained on a single device, at up to 20B parameters
Democratizing large AI models without HPC scaling requirements. — Cerebras, the company behind the world's largest accelerator chip …
Cerebras Systems announces CS-1, an AI compute system powered by the world's largest chip; Argonne National Lab is using the first systems for basic research
Cerebras' world's largest chip takes compute to a whole new level. — Cerebras Systems announced its new CS-1 system here at Supercomputing 2019.