The Open-Source Tipping Point: INTELLECT-3 Proves 100B+ MoE Models Can Outperform Corporate Giants
Prime Intellect releases a 100B+ parameter Mixture-of-Experts model that beats larger frontier models on reasoning, math, and coding benchmarks. And they're giving away the entire training recipe.