Xiaomi open sources MiMo-V2.5 and MiMo-V2.5-Pro under the MIT License, saying both models are among the most efficient available for agentic “claw” tasks
Xiaomi, the Chinese firm best known for its smartphones and electric vehicles, has lately been shipping some incredibly affordable …
Carl Franzen / VentureBeat
Related Coverage
- Xiaomi MiMo-V2.5-Pro — Today, MiMo-V2.5-Pro enters public beta. Xiaomi MiMo Team
- Introducing MiMo-V2.5 — Today, we are releasing MiMo-V2.5, a major step forward … Xiaomi MiMo, Explore and Love
- MiMo-V2.5 model card — Hugging Face
Discussion
- Clem (@clementdelangue) on X: This is how it's done! Who else should we ask to release weights? [image]
- @artificialanlys on X: Xiaomi's MiMo V2.5 Pro has landed at 54 in the Artificial Analysis Intelligence Index, tied with Moonshot's Kimi K2.6 - the current top open weights model. MiMo V2.5 Pro's weights are expected to be released soon, which would make MiMo V2.5 Pro the first equal open weights model …
- Fuli Luo (@_luofuli) on X: Just dropped two open-source models: MiMo-V2.5-Pro (Code Agent, 1T total) and MiMo-V2.5 (Multimodal Agent, 310B total). Oh and one more thing — we're giving devs & creators 100T tokens on us. Go build something cool 🛠️ 🎁 100T Free Token Grant for Builders
- @xiaomimimo on X: SGLang and vLLM support for the MiMo-V2.5 series is here. 🙌 Huge thanks to SGLang project from @lmsysorg and @vllm_project for moving fast and helping developers get started with MiMo-V2.5 on day zero. [image]
- @stochasticchasm on X: interesting that the small model got 48T but the big one only got 27T. i guess a lot of that is probably just multimodal. [image]
- Elie (@eliebakouch) on X: xiaomi mimo v2.5 eval card, pro is 1T total 42B active, omni (video/image/audio) is 310B total 15B active, both have 1M context support they train in FP8, 27T tokens for pro and 48T for the smaller variant. interleaved SWA with an aggressive 6:1 ratio and 128 window size, still […
- Ahmad (@theahmadosman) on X: New Opensource SoTA contender enters the arena Xiaomi MiMo-V2.5 Pro - 1.02T Total Params / 42B Active Params - Base and Instruct versions Xiaomi MiMo-V2.5 - 310B Total Params / 15B Active Params - Base and Instruct versions MIT License Opensource AI just keeps getting better [image]
- @lmsysorg on X: 🎉 MiMo-V2.5 series is here, day-0 support is now live in SGLang! Two models to try: 1️⃣ MiMo-V2.5-Pro: 1.02T/42B MoE, hybrid attention, up to 1M context 2️⃣ MiMo-V2.5: full multimodal (text, image, video, audio), 310B/15B MoE, 1M context We also have day 0 support for this model …
- @xiaomimimo on X: MiMo-V2.5 achieved Day-0 adaptation across multiple chip platforms on the first day of open source release. Huge thanks to our hardware ecosystem partners for helping make MiMo-V2.5 easier to deploy and run efficiently across more environments: @awscloud, @AMD [image]
- @teortaxestex on X: ...It wasn't an intern's joke MiMo 2.5 (not Pro): > Trained on a total of ~48T tokens using FP8 mixed precision. The context window supports up to 1M tokens. We've got another 1M class, and the largest disclosed pretrain. Congrats Xiaomi. [image]
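Elie's architecture notes above mention "interleaved SWA with an aggressive 6:1 ratio and 128 window size." A minimal sketch of what that could look like, assuming "6:1" means six sliding-window-attention layers for every one full-attention layer (one common reading of such ratios, not a confirmed detail of MiMo-V2.5), with per-layer causal masks over a toy sequence:

```python
from typing import Optional

def layer_uses_swa(layer_idx: int, ratio: int = 6) -> bool:
    # With a 6:1 interleave, six of every seven layers use sliding-window
    # attention; every seventh layer keeps full (global) attention.
    return layer_idx % (ratio + 1) != ratio

def attention_mask(seq_len: int, window: Optional[int]) -> list[list[bool]]:
    # Causal mask: token i may attend to token j only if j <= i.
    # With a window, token i additionally sees only the previous
    # `window` tokens (sliding-window attention).
    return [
        [j <= i and (window is None or i - j < window) for j in range(seq_len)]
        for i in range(seq_len)
    ]

# Layers 0-5 use SWA with a 128-token window; layer 6 is full attention.
masks = [
    attention_mask(256, 128 if layer_uses_swa(layer) else None)
    for layer in range(7)
]
```

In a sketch like this, the full-attention layers preserve long-range access across the 1M-token context while the windowed layers keep most of the attention cost linear in sequence length.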