BANANDRE
NO ONE CARES ABOUT CODE

Tagged with #mixture-of-experts

2 articles found

The Open-Source Tipping Point: INTELLECT-3 Proves 100B+ MoE Models Can Outperform Corporate Giants
large-language-models · Featured

Prime Intellect releases a 100B+ parameter Mixture-of-Experts model that beats larger frontier models in reasoning, math, and coding, and they’re giving away the entire training recipe.

#large-language-models #mixture-of-experts #open-source...

MiniMax M2: The Open-Source Coding Agent That’s Actually Affordable
coding-agents

MiniMax’s 229B-parameter MoE model delivers Claude-level coding performance at 8% of the price, challenging the economics of agentic development.

#coding-agents #llm-efficiency #mixture-of-experts...