US-based AI startup Arcee releases Trinity Large, a 400B-parameter open-weight model that it says compares to Meta's Llama 4 Maverick 400B on some benchmarks
Many in the industry think the winners of the AI model market have already been decided: Big Tech will own it (Google, Meta …
TechCrunch · Julie Bort
Related Coverage
- Trinity Large — A deep dive into Trinity Large, covering architecture, sparsity … Arcee AI · Lucas Atkins
- 👨🔧 OpenAI's newest product lets you vibe your way through coding science Rohan's Bytes · Rohan Paul
- Arcee AI goes all-in on open models built in the U.S. Interconnects AI · Nathan Lambert
- Biggest moment in Arcee's history and I'm incredibly proud of this team. — We went all in on building a U.S. frontier open-weight lab … Mark McQuade
- Trinity Large: An open 400B sparse MoE model Hacker News
Discussion
- @arcee_ai on X: Today, we're releasing the first weights from Trinity Large, our first frontier-scale model in the Trinity MoE family. [video]
- Lucas Atkins (@latkins) on X: Today, we are releasing our first weights from Trinity-Large, our first frontier-scale model in the Trinity MoE family. American Made. - Trinity-Large-Preview (instruct) - Trinity-Large-Base (pretrain checkpoint) - Trinity-Large-TrueBase (10T pre Instruct data/anneal) [video]
- Jamieson O'Reilly (@theonejvo) on X: In a twist that perfectly illustrates the threat landscape I've been writing about, Clawdbot's creator Peter had to rename the project's accounts due to alleged @AnthropicAI trademark issues, and during that transition window crypto scammers immediately snatched the old handles […]
- @scaling01 on X: American open-weight LLMs are back! Arcee AI trained Trinity Large Preview, a 400B MoE model, in just over 30 days on 2048 Nvidia B300 GPUs. It is much faster and more efficient than comparable Chinese open-weights models like DeepSeek-V3 and GLM-4.7. Trinity Large is part of the […]
- Robert Sasu (@sasurobert) on X: Do not download skills from the internet. Write them yourself, manually. Take a little time to do it. Prompt injection and skill injection are super easy and can steal everything. Thousands of "devs" simply download without verifying. This will compromise everything.
- @kalomaze on X: oh nothing too crazy. just a 400 billion parameter western MoE model pretrained from scratch on 15T+ tokens at an even more aggressive sparsity ratio than any other model of comparable scale
- @beffjezos on X: American open source isn't dead. Kudos to the team.
- Nathan Lambert (@natolambert) on X: Here's my conversation with @latkins and the team at @arcee_ai on their path to training and releasing Trinity Large today. From going all in on open models built end to end in the US 6 months ago to having the model in hand is no easy feat. I loved this conversation on how to [v…]
- Judd Rosenblatt (@juddrosenblatt) on X: (early) Recursively self-improving AI systems are already here and the back door is open to anyone
- Paul Rietschka (@prietschka) on Bluesky: Going to note that it's pretty insane everyone is out there building and rebuilding the same transformer-based LLMs and calling it "innovation." It's not. This is akin to everyone spending billions to reinvent, over and over, Coca-Cola. "Look at my 400B-parameter spar…
- @cailen on X: Since everyone on this site takes a victory lap, so will I. Proud @arcee_ai investor since seed.
- r/LocalLLaMA on Reddit: Arcee AI releases Trinity Large: open-weight 400B-A13B
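The "400B-A13B" shorthand in the Reddit title means roughly 400B total parameters with only about 13B active per token: a sparse mixture-of-experts router picks a few experts per token, so most weights sit idle on any given forward pass. The sketch below illustrates the arithmetic behind that naming with made-up numbers (expert count, expert size, and shared-parameter size are assumptions for illustration, not Arcee's published architecture):

```python
def active_fraction(total_experts: int, experts_per_token: int,
                    shared_params: float, expert_params: float) -> float:
    """Fraction of a sparse MoE model's weights used for a single token.

    Shared parameters (attention, embeddings, router) always run;
    only `experts_per_token` of the `total_experts` FFN experts do.
    """
    total = shared_params + total_experts * expert_params
    active = shared_params + experts_per_token * expert_params
    return active / total

# Hypothetical split: ~3B shared params, 128 experts of ~3.1B each,
# 3 experts routed per token -> ~400B total, ~12.3B active.
frac = active_fraction(total_experts=128, experts_per_token=3,
                       shared_params=3e9, expert_params=3.1e9)
print(f"~{frac:.1%} of weights active per token")
```

Under these toy numbers only about 3% of the weights fire per token, which is what posts above mean by an "aggressive sparsity ratio": inference cost scales with the ~13B active parameters, not the 400B total.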