Nvidia Says Blackwell Is "Transforming the Economics of AI"
- By The Financial District
- 14 minutes ago
- 1 min read
Nvidia has released a report showing how its Blackwell AI systems can scale up artificial intelligence more efficiently, just as Big Tech’s data-center buildout faces a massive power shortage, Laura Bratton reported for Yahoo Finance.

Nvidia noted that leading frontier AI models now use what's called a "Mixture of Experts" (MoE) architecture, which is more efficient than previous models that activate all of their parameters to generate answers for AI tools and therefore require enormous computing power. Examples include OpenAI's GPT-OSS-120B model, DeepSeek's R1, and Mistral AI's Mistral Large 3.
In an MoE model, many specialized “experts” are trained for different types of tasks.
Instead of activating the entire model for every query, the model selects only the experts best suited to respond, reducing the computing power required while still improving performance.
However, today’s MoE models have limitations — they require significant memory and the use of multiple AI chips simultaneously.
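To make the routing idea concrete, here is a minimal sketch of top-k expert selection in Python. The expert count, layer sizes, and top-k value are illustrative assumptions, not details of the models named in the article; the point is that only a few experts run per token, so most of the model's parameters sit idle for any given query.

```python
# Minimal Mixture-of-Experts routing sketch (illustrative sizes, not from any real model).
import numpy as np

rng = np.random.default_rng(0)

D, H = 16, 32              # input and hidden sizes (assumed for illustration)
NUM_EXPERTS, TOP_K = 8, 2  # 8 experts, only 2 activated per token

# Each "expert" is a small feed-forward block with its own weights.
experts = [(rng.standard_normal((D, H)) * 0.1,
            rng.standard_normal((H, D)) * 0.1) for _ in range(NUM_EXPERTS)]
gate_w = rng.standard_normal((D, NUM_EXPERTS)) * 0.1  # router ("gate") weights


def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route a single token vector x through only its top-k experts."""
    scores = x @ gate_w                   # one routing score per expert
    top = np.argsort(scores)[-TOP_K:]     # indices of the best-suited experts
    weights = np.exp(scores[top])
    weights /= weights.sum()              # softmax over the selected experts only

    out = np.zeros_like(x)
    for w, idx in zip(weights, top):
        w1, w2 = experts[idx]
        out += w * (np.maximum(x @ w1, 0.0) @ w2)  # ReLU feed-forward expert
    return out


token = rng.standard_normal(D)
print(moe_forward(token).shape)  # (16,) -- only 2 of the 8 experts actually ran
```

Note that while only TOP_K experts compute per token, all NUM_EXPERTS sets of weights must still be held in memory, which is why MoE models demand significant memory and often span multiple AI chips, as the article describes.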