OpenAI unveils the Apps SDK in preview, built on MCP, to let developers build ChatGPT apps, and says that it will begin accepting app submissions later in 2025
Expedia, Spotify, Canva, Zillow, Others Now Available
Tech Xplore: OpenAI unveils ChatGPT app integration feature
OpenAI: App developer guidelines
Zillow MediaRoom: Zillow debuts the only real esta...
Alibaba debuts the Qwen3-Coder model for agentic coding, including a 480B-parameter MoE variant, and open sources Qwen Code, a CLI tool adapted from Gemini CLI
Coco Feng / South China Morning Post: Alibaba upgrades flagship Qwen3 model to outperform OpenAI, DeepSeek in maths, c...
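The 480B-parameter variant above is a mixture-of-experts (MoE) model: each token activates only a small subset of expert networks, so inference cost stays far below the total parameter count. A toy top-k routing sketch (all sizes, names, and the value of k are illustrative, not Qwen3-Coder's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_experts, top_k = 16, 8, 2  # illustrative sizes, not Qwen3-Coder's

router = rng.normal(size=(d, n_experts))      # maps a token to expert scores
experts = rng.normal(size=(n_experts, d, d))  # one toy FFN weight per expert

def moe_forward(x):
    """Route token x to its top_k experts and gate-mix their outputs."""
    scores = x @ router
    chosen = np.argsort(scores)[-top_k:]      # indices of the best experts
    gates = np.exp(scores[chosen])
    gates /= gates.sum()                      # softmax over chosen experts only
    return sum(g * (x @ experts[i]) for g, i in zip(gates, chosen))

y = moe_forward(rng.normal(size=d))           # only 2 of 8 experts ran
```

Only `top_k` of the `n_experts` weight matrices are touched per token, which is why a 480B-parameter MoE can be served with the compute of a much smaller dense model.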
Meta VP of Generative AI Ahmad Al-Dahle denies a rumor that the company trained Llama 4 Maverick and Scout on test sets, saying that Meta “would never do that”
but the EU doesn't get everything
Pascale Davies / Euronews: From a political shift to a more powerful AI: Everything to know about Meta's Llama 4 models
Jay Bonggolto / Android Central: Meta is com...
Alibaba releases 32.5B-parameter QwQ-32B-Preview under Apache 2.0 and claims the “reasoning” AI model beats OpenAI's o1-preview on the AIME and MATH tests
QwQ-32B-Preview is an experimental research model developed …
Ananya Gairola / Benzinga: Alibaba's New AI Model Outperforms OpenAI's o1 In Specific Benchmarks, Now Available For Free Dow...
Meta debuts “quantized” versions of Llama 3.2 1B and 3B models, designed to run on low-powered devices and developed in collaboration with Qualcomm and MediaTek
so today we're releasing new quantized versions of Llama 3.2 1B & 3B that deliver up to 2-4x increases in inference speed and, on average, 56% reduction in model size, and 41% reduction in memory foot...
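The size and memory reductions quoted above come from quantization: storing weights in low-precision integers plus a scale factor instead of 32-bit floats. A minimal symmetric per-tensor int8 sketch (illustrative only; Meta's on-device scheme is more sophisticated, e.g. quantization-aware training and per-channel scales):

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: w is approximated by scale * q."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

w = np.random.default_rng(0).normal(size=(256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = q.astype(np.float32) * scale   # dequantized approximation of w

print(q.nbytes / w.nbytes)             # 0.25: int8 takes 4x fewer bytes than fp32
```

The reconstruction error of each weight is bounded by about half the scale, which is why quantization trades a small accuracy loss for large size and bandwidth savings on low-powered devices.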
A study by Meta researchers suggests that training LLMs to predict multiple tokens at once, instead of just the next token, results in better and faster models
LLM approach to predict multiple tokens
KAN: Kolmogorov-Arnold Networks — "promising alternatives to Multi-Layer Perceptrons"
Ethan / @ethan_smith_20: it was only briefly touched upon, but is ...
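The multi-token idea can be pictured as one shared trunk feeding several independent output heads, where head k predicts the token k positions ahead instead of just the next one. A minimal NumPy sketch (shapes, names, and the lookup-table "trunk" are illustrative stand-ins, not the paper's transformer):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d_model, n_heads = 100, 32, 4  # predict 4 future tokens per position

# Shared trunk (here a toy embedding table) and one linear head per offset
trunk = rng.normal(size=(vocab, d_model))           # token id -> hidden state
heads = rng.normal(size=(n_heads, d_model, vocab))  # head k predicts token t+k+1

def predict_next_k(token_id):
    """Return logits for the tokens at offsets +1 .. +n_heads, from one state."""
    h = trunk[token_id]                       # shared representation, (d_model,)
    return np.stack([h @ W for W in heads])   # (n_heads, vocab) logits

logits = predict_next_k(7)
```

Because every head reuses the same trunk computation, training gets n_heads prediction signals per position for little extra cost, and at inference the extra heads can draft several tokens at once.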