Paris-based Mistral AI raised €385M from a16z, Lightspeed, and others, sources say at a ~$2B valuation, and releases Mixtral 8x7B, an open-source mixture-of-experts model, alongside La plateforme, its API platform
The company has publicly released its latest technology so people can build their own chatbots. Rivals like OpenAI and Google argue that approach can be dangerous.
New York Times · Cade Metz
Related Coverage
- Mixtral of experts — A high quality Sparse Mixture-of-Experts. Mistral AI
- La plateforme — Our first AI endpoints are available in early access. Mistral AI
- Mistral AI, a Paris-based OpenAI rival, closed its $415 million funding round TechCrunch · Romain Dillet
- Mistral AI Raises $415 Million in Funding, Repositioning EU in AI Development Race Metaverse Post · Alisa Davidson
- AI start-up Mistral secures €385m to challenge OpenAI Silicon Republic · Leigh Mc Gowran
- AI FOMO Is Alive and Well in Paris and Berlin Bloomberg · Lionel Laurent
- Generative AI startup Mistral AI secures $415 million in second funding round TechStartups · Nickie Louise
- French startup Mistral AI closes $415M funding round Cointelegraph · Savannah Fortis
- Mistral AI raises $415 million and launches API service The Decoder · Matthias Bastian
- OpenAI Competitor Mistral AI Secures $414 Mln In Funding, Google & Microsoft Overshadowed? CoinGape
- Open-source generative AI startup Mistral AI raises $415M in funding SiliconANGLE · Mike Wheatley
- French AI start-up Mistral AI raises 385 mn euros Tech Xplore
- Investing in Mistral — AI should be open. — Most of the core systems powering modern computing … Andreessen Horowitz · Dharris
- Mistral AI launches beta access to API endpoints Stack Diary · Alex Ivanovs
- Mistral AI Valued at $2 Billion Following Funding Round PYMNTS.com
- Mistral AI Achieves $2 Billion Valuation in Momentous Funding Round Propelling Europe's AI Landscape New.blicio.us
- Paris-based Startup and OpenAI Competitor Mistral AI Valued at $2 Billion Unite.AI · Alex McFarland
- Mistral: Our first AI endpoints are available in early access Hacker News
- Paris-Based Startup and OpenAI Competitor Mistral AI Valued at $2B Hacker News
Discussion
- Matt Hodges (@MattHodges@mastodon.social) on Mastodon: “Mixtral has 45B total parameters but only uses 12B parameters per token. It, therefore, processes input and generates output at the same speed and for the same cost as a 12B model.” — https://mistral.ai/...
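The 45B-total / 12B-active figures can be sanity-checked with back-of-the-envelope arithmetic. A rough sketch, assuming Mistral 7B-like dimensions (model dim 4096, MLP hidden dim 14336, 32 layers, ~7.2B total parameters) and top-2 routing over 8 experts; the layer count and the 7.2B baseline are assumptions, not stated on this page:

```python
# Back-of-the-envelope check of the "45B total, 12B active" claim.
# Assumed dimensions: dim=4096, hidden=14336 (3.5x expansion),
# 32 layers, 8 experts with top-2 routing, Mistral 7B-like shared stack.

DIM = 4096
HIDDEN = 14336
LAYERS = 32
EXPERTS = 8
ACTIVE_EXPERTS = 2  # top-2 routing

# A SwiGLU-style MLP has three weight matrices: gate, up, down.
mlp_params_per_layer = 3 * DIM * HIDDEN
expert_params = mlp_params_per_layer * LAYERS  # one expert across all layers

# Everything that is not an expert MLP (attention, embeddings, norms):
# roughly Mistral 7B minus its MLP stack (assumption).
shared_params = 7.2e9 - expert_params

total = shared_params + EXPERTS * expert_params
active = shared_params + ACTIVE_EXPERTS * expert_params

print(f"total:  {total / 1e9:.1f}B")   # ~47B
print(f"active: {active / 1e9:.1f}B")  # ~13B
```

The estimate lands close to the quoted numbers, which is why the model runs at roughly the speed and cost of a ~13B dense model despite its much larger total parameter count.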
- @jcsamuelian on X: You can say you are a platform when you have 1m developers using it. and it's the case with Mistral
- Elad Gil (@eladgil) on X: 🔥 congrats!
- Martian (@space_colonist) on X: seems mistral has an even larger model (medium lmao) than what they just released (mixtral/small) that's a fair bit better than GPT3.5 [image]
- Damien C. Tanner (@dctanner) on X: Open source ChatGPT's moment has arrived: Mixtral of experts “matches or outperforms GPT3.5 on most standard benchmarks.” https://mistral.ai/...
- @dchaplot on X: Proud to announce: Mixtral 8x7B — Mixtral of Experts - Free to use under Apache 2.0 license - outperforms Llama 2 70B with 6x faster inference. - matches or outperforms GPT3.5 - masters English, French, Italian, German and Spanish. - seq_len = 32K https://mistral.ai/... 1/N [image]
- @alexsablay on X: Our latest release @MistralAI Mixtral 8x7B mixture of experts - performance of a GPT3.5 - inference cost of a 12B model - context length of 32K - speaks English, French, Italian, German and Spanish Blog post https://mistral.ai/... [image]
- Paul Murphy (@paulbz) on X: Not long ago, we announced @lightspeedvp led a seed investment round in @MistralAI. Today, we're thrilled to share we've increased our ownership in this incredible company as part of a new round of Series A funding led by @a16z with @generalcatalyst and a handful of others.
- @dchaplot on X: Excited to release @MistralAI La Plateforme! Three chat endpoints with competitive pricing: Mistral-tiny: Mistral 7B Instruct v0.2, upgraded base model with higher context length 8K —> 32K and better finetuning, 6.84 —> 7.61 on MT Bench. Mistral-small: Mistral 8x7B Instruct... [image]
- Philipp Schmid (@_philschmid) on X: We just got more details on Mixtral 8x7B from @MistralAI 🧠 Mixtral is sparse mixture of expert models (SMoE) with open weights outperforming existing open LLMs like Meta Llama 70B.🤯 💪🏻 TL;DR: ⬇️ [image]
- @tianle_cai on X: Exciting times with the new Mixtral model from @MistralAI! It's evident that they've fine-tuned the Mistral 7B model to an impressive 8x. The significant correlation between the weights of the two models is a testament to the successful reuse of models. This approach could... [image]
- Elvis (@omarsar0) on X: Mistral's first AI endpoints are here! Things are about to get super interesting in the ecosystem. https://mistral.ai/... [image]
- Lior (@alphasignalai) on X: Big. The @MistralAI API is out. “la plateforme” serves three chat endpoints for generating text following textual instructions and an embedding endpoint. : Mistral-tiny: Affordable, serves Mistral 7B Instruct v0.2, English only, scores 7.6 on MT-Bench. Mistral-small: New... [image]
- Arthur Zucker (@art_zucker) on X: 🤗@MistralAI's new MOE model (Mixtral, what a nice name) is now supported in the latest release of transformers (make sure you have 4.36.0) 🥳🤗 [image]
- Andrej Karpathy (@karpathy) on X: New open weights LLM from @MistralAI params.json: - hidden_dim / dim = 14336/4096 => 3.5X MLP expand - n_heads / n_kv_heads = 32/8 => 4X multiquery - “moe” => mixture of experts 8X top 2 👀 Likely related code: https://github.com/... Oddly absent: an over-rehearsed... [image]
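The “mixture of experts 8X top 2” line from params.json can be made concrete with a toy sketch: a per-token gate scores 8 expert MLPs, the top 2 are selected, and their outputs are mixed with softmax weights. This is an illustrative implementation under small toy dimensions, not Mistral's actual code; the ReLU expert MLPs stand in for the real SwiGLU blocks:

```python
# Toy top-2 mixture-of-experts layer (illustrative, not Mistral's code).
import numpy as np

rng = np.random.default_rng(0)
DIM, HIDDEN, EXPERTS, TOP_K = 16, 64, 8, 2

# One small two-matrix MLP per expert, plus a linear gating layer.
experts_w1 = rng.standard_normal((EXPERTS, DIM, HIDDEN)) / np.sqrt(DIM)
experts_w2 = rng.standard_normal((EXPERTS, HIDDEN, DIM)) / np.sqrt(HIDDEN)
gate_w = rng.standard_normal((DIM, EXPERTS)) / np.sqrt(DIM)

def moe_layer(x):
    """Route each token to its top-2 experts and mix their outputs."""
    logits = x @ gate_w                            # (tokens, EXPERTS)
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]  # top-2 expert ids per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                   # softmax over selected experts
        for w, e in zip(weights, top[t]):
            h = np.maximum(x[t] @ experts_w1[e], 0)  # ReLU stand-in for SwiGLU
            out[t] += w * (h @ experts_w2[e])
    return out

tokens = rng.standard_normal((4, DIM))
y = moe_layer(tokens)
print(y.shape)  # (4, 16)
```

Each token touches only 2 of the 8 expert MLPs per layer, which is why compute per token scales with the active parameters rather than the total.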
- George Hotz (@realgeorgehotz) on X: Google put out a press release and a fake demo. Mistral put out a torrent.
- @levelsio on X: Open source LLMs reaching GPT-4 levels way earlier than we thought is the most exciting thing now to me With open source RAGs (plugins for LLMs) we can even add stuff like GPT-4 Vision, web browsing and data analysis and reach features parity with GPT-4 and future versions This...
- @josephjacks_ on X: Incredible that @MistralAI just raised $400M~ on a $2B post-money valuation in this market with a purely open source product (so far), no revenue and < 30 employees. Well done!
- Anton (@abacaj) on X: > be mistral > drop not one but two torrents (probably SOTA again) > nobody knows how to run the new model (yet) > proceed to raise another $400M same day [image]
- Lulu Cheng Meservey (@lulumeservey) on X: Google vs. Mistral: a tale of two AI launches and a case study in knowing your audience Google announced their new model two days ago. It was named Gemini, and it wafted in with a blog post, brand guidelines, a press tour, and a polished sizzle reel that later turned out to be...
- Jack Morris (@jxmnop) on X: say what you will about mistral, tweeting exclusively download links to new models with no context is unbelievably cool [image]
- Andrea Panizza (@unsorsodicorda) on X: TL;DR: 1) sparse MoE, running 2 experts/token (we all knew that 😂) 2) marginally better than GPT-3.5 3) there's a Mixtral Instruct 4) there's also a Mistral Medium, which is probably way bigger than Mixtral, and way better than GPT 3.5 https://mistral.ai/... so much for 1/n
- Alex MacCaw (@maccaw) on X: Looks like we might see an open source GPT-4 level model much sooner than I thought. Get the Starling team fine tuning Mistral.
- Nico Ritschel (@nicoritschel) on X: So there's a 3rd @MistralAI model that outperforms Friday's mixtral (and GPT-3.5). Also hosted inference! https://mistral.ai/... [image]
- Anton (@abacaj) on X: After playing with this new mistral model for the last 24 hours, pretty sure if you go through fine tuning and rlhf you would get a > gpt-3.5 local model
- Arthur Mensch (@arthurmensch) on X: Announcing Mixtral 8x7B https://mistral.ai/... and our early developer platform https://mistral.ai/.... Very proud of the team!
- @basedbeffjezos on X: OSS LLMs competing with big centralized co models is the future we want.
- Vik (@vikhyatk) on X: Mistral's stated goal for this model (according to their pitch deck) was to beat ChatGPT 3.5 by a large margin. [image]