BANANDRE
NO ONE CARES ABOUT CODE

Tagged with #moe-models

2 articles found

Featured

Trillion-Parameter AI on Your Desktop: The Kimi K2 Thinking Revolution Hits Local Hardware

Moonshot AI’s trillion-parameter reasoning model achieves 30+ tokens/sec on consumer hardware through real-time GPU/CPU orchestration.

#kimi-k2 #local-inference #machine-learning...

Pruning MoE Models: The Art of Cutting Complexity Without Losing Brains

Cerebras releases REAP-pruned GLM-4.6 variants at 25%, 30%, and 40% sparsity with FP8 quantization – but do they actually work?

#cerebras #fp8 #llm-compression...