
Nvidia Says Blackwell Is "Transforming the Economics of AI"

  • Writer: The Financial District
  • 14 minutes ago
  • 1 min read

Nvidia has released a report showing how its Blackwell AI systems can scale up artificial intelligence more efficiently, just as Big Tech’s data-center buildout faces a massive power shortage, Laura Bratton reported for Yahoo Finance.


In an MoE model, many specialized “experts” are trained for different types of tasks. (Photo: Nvidia)

Nvidia noted that, unlike previous models that use all of their parameters to generate every answer and therefore require enormous computing power, leading frontier AI models use a more efficient architecture called a “Mixture of Experts” (MoE). Examples include OpenAI’s GPT-OSS-120B model, DeepSeek’s R1, and Mistral AI’s Mistral Large 3.


In an MoE model, many specialized “experts” are trained for different types of tasks.



Instead of activating the entire model for every query, the model selects only the experts best suited to respond, reducing the computing power required while still improving performance.
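A minimal sketch of what that routing step can look like, assuming a toy top-k gating setup in NumPy; the sizes, the tiny expert matrices, and names such as n_experts and top_k are illustrative assumptions, not details of any model named above.

```python
# Toy sketch of Mixture-of-Experts routing with top-k gating (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

n_experts = 8   # total specialized experts in the layer (assumed)
top_k = 2       # experts actually activated per token (assumed)
d_model = 16    # size of the token representation (assumed)

# Each "expert" here is a small weight matrix standing in for a feed-forward block.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
router_weights = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_layer(token: np.ndarray) -> np.ndarray:
    """Route one token through only its top-k experts instead of all of them."""
    logits = token @ router_weights            # router score for each expert
    chosen = np.argsort(logits)[-top_k:]       # indices of the best-scoring experts
    gates = np.exp(logits[chosen])
    gates /= gates.sum()                       # normalized weights for the chosen experts
    # Only the selected experts run, so compute scales with top_k, not n_experts.
    return sum(g * (token @ experts[i]) for g, i in zip(gates, chosen))

token = rng.standard_normal(d_model)
print(moe_layer(token).shape)  # (16,)
```

The key point the sketch illustrates is that per-query compute grows with the number of experts selected, not the total number of experts in the model.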


However, today’s MoE models have limitations: they require significant memory and must run across multiple AI chips simultaneously.
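A rough, back-of-the-envelope illustration of why that is, under assumed numbers (a 120-billion-parameter model, 16-bit weights, 80 GB of memory per accelerator): even though only a few experts run per token, every expert’s weights must stay resident in memory.

```python
# Rough estimate of MoE memory pressure; all figures are illustrative assumptions.
total_params = 120e9      # e.g., a 120B-parameter MoE model (assumed)
bytes_per_param = 2       # 16-bit weights (assumed; real deployments may use lower precision)
gpu_memory_gb = 80        # memory of one high-end accelerator (assumed)

weights_gb = total_params * bytes_per_param / 1e9
min_gpus = int(-(-weights_gb // gpu_memory_gb))   # ceiling division
print(f"Weights alone: ~{weights_gb:.0f} GB, needing at least {min_gpus} such chips")
# Weights alone: ~240 GB, needing at least 3 such chips
```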








