Former US State Department official Mike Benz recently shared an X post highlighting China's construction of a massive new mega-dam that is expected to significantly alter the flow of major rivers in Southeast Asia. In the post, he suggests that large-scale infrastructure projects not only benefit China economically and strategically but also serve to unify its population by fostering national pride. He wrote, "This project which will give China an unbelievable edge for decades in energy dominance costs $137 billion, which is just 20% of a single year of the Pentagon budget. Is there any reason we couldn't slash 20% of the Pentagon budget to create an annual 'Great Works' fund?"
The post caught Elon Musk's attention, and he replied: "As I said a few years ago, the AI scaling constraint will move from chips to voltage transformers to electricity generation. That is worrying for US leadership in AI long-term."
What Elon Musk’s reply means
Elon Musk's statement suggests that as AI technology advances, the biggest limitation (or "scaling constraint," as Musk put it) to its growth will shift over time. AI models require powerful GPUs and specialized AI chips (like NVIDIA's H100) to train and run efficiently. Early on, a limited supply of these chips slowed AI progress, as Musk noted in the post.
As companies scale AI infrastructure, they will need to distribute high amounts of power efficiently. Voltage transformers regulate and distribute electricity to data centers, and a shortage of these could hinder AI expansion.
Further, AI requires immense energy, particularly for training large models and running data centers. As AI scales, the demand for electricity will surge, and the ability to generate enough power (especially renewable energy) will become a major constraint, as highlighted by the tech CEO.
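To put the electricity constraint in perspective, here is a rough back-of-envelope estimate of the power a large GPU cluster would draw. The H100's roughly 700 W thermal design power is NVIDIA's published spec for the SXM variant; the cluster size and the power usage effectiveness (PUE) figure are illustrative assumptions, not data from any real deployment.

```python
# Back-of-envelope estimate of data-center power draw for a large
# AI training cluster. The ~700 W TDP matches NVIDIA's H100 SXM spec;
# the cluster size and PUE below are hypothetical assumptions.

GPU_TDP_WATTS = 700    # NVIDIA H100 SXM thermal design power (spec)
NUM_GPUS = 100_000     # assumed cluster size (hypothetical)
PUE = 1.3              # assumed power usage effectiveness (cooling, losses)

gpu_power_mw = NUM_GPUS * GPU_TDP_WATTS / 1e6   # convert watts to megawatts
facility_power_mw = gpu_power_mw * PUE           # add facility overhead

print(f"GPU load: {gpu_power_mw:.0f} MW")
print(f"Facility load (PUE {PUE}): {facility_power_mw:.0f} MW")
```

Under these assumed numbers, the GPUs alone draw about 70 MW and the facility roughly 91 MW, continuous load on the order of a small power plant, which is the scale of demand behind Musk's point about generation becoming the bottleneck.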
"As I said a few years ago, the AI scaling constraint will move from chips to voltage transformers to electricity generation. That is worrying for US leadership in AI long-term." https://t.co/eSv2yfAsJT
— Elon Musk (@elonmusk) April 2, 2025