Arcee's new open-source Trinity-Large-Thinking is a rare, powerful U.S.-made AI model that enterprises can download and customize
Summary
U.S. startup Arcee AI has released Trinity-Large-Thinking, a 399-billion-parameter open-source reasoning model distributed under the Apache 2.0 license.
Key Points
- Arcee AI (30 employees) released Trinity-Large-Thinking, trained for 33 days using 2,048 NVIDIA B300 Blackwell GPUs, under the Apache 2.0 license
- Adopted a Mixture-of-Experts (MoE) architecture where only 1.56% (approx. 13 billion) of the total 399 billion parameters are active per token
- Inference is reportedly 2-3 times faster than comparably sized models, and the Apache 2.0 license permits full enterprise customization and commercial use
- Positioned as an 'American-made open-weight' alternative to Chinese open-source models
- A bold bet: roughly $20 million in training costs, about half of the company's total funding, committed to a single training run
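The sparse-activation claim above follows the standard Mixture-of-Experts pattern: a gating network scores many expert sub-networks per token, and only the top-k experts actually run, so most parameters sit idle on any given token. The sketch below is a generic, illustrative top-k MoE layer in NumPy, not Arcee's implementation; all dimensions, expert counts, and function names are assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions for illustration, far smaller than Trinity's).
d_model, n_experts, top_k = 8, 16, 2
n_tokens = 4

# One weight matrix per expert; a gating matrix scores experts per token.
experts = rng.standard_normal((n_experts, d_model, d_model))
gate = rng.standard_normal((d_model, n_experts))

def moe_layer(x):
    """Route each token to its top_k experts and mix their outputs."""
    logits = x @ gate                              # (tokens, experts) gate scores
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # top-k expert ids per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        scores = logits[t, top[t]]
        weights = np.exp(scores) / np.exp(scores).sum()  # softmax over chosen experts
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ experts[e])      # only top_k experts ever run
    return out, top

x = rng.standard_normal((n_tokens, d_model))
y, selected = moe_layer(x)

# Per-token active fraction of expert parameters in this toy: 2/16 = 12.5%.
active_fraction = top_k / n_experts
```

With a much larger expert pool and small k, the same mechanism yields the low active fractions reported for Trinity-Large-Thinking, which is why inference cost tracks the active parameters rather than the full 399B.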
Notable Quotes & Details
- Only 13B (1.56%) out of 399B total parameters are activated per token
- Invested $20M in training — about half of the company's total funding (under $50M)
- Hugging Face CEO Clément Delangue: "America's strength has always been startups. Arcee shows that it is possible"
Intended Audience
AI researchers, enterprise developers, AI policy stakeholders