Ai2 launches Open Coding Agents, starting with SERA, an open-source family that includes 32B and 8B parameter models designed to adapt to private codebases
Artificial intelligence is moving swiftly, changing how developers write code as it flows ever faster into repositories such as GitHub …
An Ai2 research scientist says AGI may never emerge because such a concept ignores the physical realities and limits of computation, such as energy constraints
If you are reading this, you probably have strong opinions about AGI, superintelligence, and the future of AI. X: @scaling01, @sriramk, and @tim_dettmers. LinkedIn...
Chinese startup Moonshot releases Kimi K2 Thinking, an open-weight model it claims beats GPT-5 in agentic capabilities; source: the model cost $4.6M to train
Chinese startup Moonshot on Thursday released its latest generative artificial intelligence model, which it claims beats OpenAI's ChatGPT in …
The Allen Institute for AI releases Tulu 3 405B, an open-source model that it claims outperforms DeepSeek V3 and OpenAI's GPT-4o on certain benchmarks
Move over, DeepSeek. There's a new AI champion in town — and it's American. — On Thursday, Ai2, a nonprofit AI research institute, based …
DeepSeek releases DeepSeek-V3, an open-source MoE model of 671B total parameters, with 37B activated per token, claiming it outperforms top models like GPT-4o
Chinese AI startup DeepSeek, known for challenging leading AI vendors with its innovative open-source technologies, today released a new ultra-large model: DeepSeek-V3.
The Allen Institute for AI debuts Multimodal Open Language Model in 1B- to 72B-parameter sizes, the most capable open-source AI model with visual abilities to date
A compact and fully open source visual AI model will make it easier for AI to take control of your computer—hopefully in a good way.
HyperWrite CEO unveils Reflection 70B, based on Llama 3.1 70B Instruct and trained using reflection-tuning, and says it beats GPT-4o in all benchmarks tested
There's a new king in town: Matt Shumer, co-founder and CEO of AI writing startup HyperWrite, today unveiled Reflection 70B …