Cerebras Systems claims its hardware can now run a neural network with 120 trillion parameters, targeting a nascent market for massive NLP AI algorithms
I can't stop writing about AI language models. The CEO of @CerebrasSystems says a cluster of his wafer-chips could run a 120 trillion parameter model (100x what we have today), and he claims OpenAI is planning this kind of scale for GPT-4. https://www.wired.com/...
Proud to unveil the first brain-scale #AI solution today @hotchipsorg 2021, made possible through our revolutionary Weight Streaming execution mode. Learn more about how we will enable the extreme-scale models of the future: https://cerebras.net/... #machinelearning #GPT3 #NLP