What happened to China’s AI boom, which just a matter of months ago was in full tilt and bursting at the seams? Last year saw millions of high-end Nvidia (NASDAQ:NVDA) GPUs find their way across the country, wearily navigating tariffs and U.S. export restrictions.
But they braved this climate heroically. The AI hardware was in such high demand that specialized Nvidia H100 chips used to train AI models could sell for up to 200,000 yuan ($28,000) a pop on the black market. Meanwhile, hundreds of data centers were constructed to house the AI servers.
But now, a report by MIT Technology Review has revealed that China’s short-lived AI boom has gone bust and government funding has dried up, forcing project leads to sell off surplus GPUs and leaving newly built facilities sitting empty. China’s rapid expansion of AI infrastructure was monumental, but it didn’t come without severe challenges, most of which have been chalked up to poor planning and a misalignment between built capacity and actual demand. On top of that, it was a rush job, and many AI data centers were constructed without a clear understanding of the specific requirements of training versus inference workloads, leading to inefficiencies and what some observers, as reported by Barron’s, see as a big bubble.
Digging deeper, we learn that China focused on training capabilities (building models from massive datasets) over inference (running trained models to answer real-world queries), even though both are backbones of deep learning. China’s skewed focus means the market is now flooded with the high-end GPUs that the compute-hungry training phase demands, while data centers sit underutilized, as the sketch below illustrates.
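To make the training-versus-inference distinction concrete, here is a minimal PyTorch sketch of the two workloads; the library choice, toy model, and loop sizes are purely illustrative assumptions, not anything drawn from the MIT Technology Review report. Training repeats forward and backward passes over large batches, which is what soaks up fleets of high-end GPUs, while inference is a single gradient-free forward pass per request.

```python
import torch
from torch import nn

# Hypothetical tiny model, used purely to illustrate the two workloads.
model = nn.Linear(128, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training: forward + backward passes over large batches, repeated for many
# steps, which is why it demands dense clusters of high-end GPUs.
x = torch.randn(64, 128)
y = torch.randint(0, 10, (64,))
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()          # gradient computation dominates the cost
    optimizer.step()

# Inference: a single forward pass per request, no gradients kept, so it can
# run on cheaper or fewer accelerators.
with torch.no_grad():
    prediction = model(torch.randn(1, 128)).argmax(dim=-1)
```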
To make matters worse, according to MIT Technology Review, multiple reports and industry insiders have revealed that certain companies used AI data centers to qualify for government-subsidized green energy or land deals. In other cases, electricity dedicated to AI tasks was sold back into the grid at a profit. Other developers secured loans and tax incentives while leaving buildings unused. Indeed, most investors in China’s AI sector were looking to benefit from generous policy incentives rather than doing actual AI work, the report claims.
Last year, 144 companies registered with the Cyberspace Administration of China to develop their own large language models (LLMs). However, only about 10% of those companies were still actively investing in LLM training by the end of the year. “ ,” Jimmy Goodrich, senior advisor for technology at the RAND Corporation, told MIT Technology Review.
Ironically, China’s own AI lab DeepSeek has played a big role in its AI woes. Last month, DeepSeek rattled Silicon Valley after its large language model outperformed American AI leaders, defying U.S. attempts to curb China’s high-tech ambitions. According to published benchmarks, the R1 model outperformed models from OpenAI, Meta Platforms (NASDAQ:META) and Anthropic, which have spent billions of dollars building their models.
In a technical report, DeepSeek said its V3 model had a training cost of only $5.6 million, a fraction of the hundreds of millions or even billions of dollars U.S. AI labs spend on their large language models.
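As a rough sanity check on that headline number, the figure is commonly reconstructed from the GPU-hour accounting in DeepSeek’s technical report; the inputs below (roughly 2.79 million H800 GPU-hours at an assumed $2 per GPU-hour rental rate, excluding research and data costs) are treated here as assumptions rather than audited facts.

```python
# Back-of-the-envelope reconstruction of the ~$5.6 million training-cost figure.
# Inputs are assumptions taken from DeepSeek's widely cited V3 accounting,
# not independently verified numbers.
gpu_hours = 2_788_000        # ~2.79 million H800 GPU-hours for the final run
cost_per_gpu_hour = 2.0      # assumed USD rental price per GPU-hour

training_cost = gpu_hours * cost_per_gpu_hour
print(f"Estimated training cost: ${training_cost:,.0f}")  # -> $5,576,000
```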
The fact that the Chinese AI lab was able to accomplish this despite having to navigate U.S. semiconductor restrictions on China, which ban the export of powerful AI chips such as Nvidia’s (NASDAQ:NVDA) H100s, makes it all the more impressive. DeepSeek has forced many AI companies to rethink their requirements for hardware and scale. Still, Wall Street remains largely bullish on robust electricity demand growth despite the launch of cheaper and more power-efficient DeepSeek AI models.
“ ” said Bloomberg Intelligence utilities analyst Nikki Hsu. “ .” Indeed, DeepSeek’s efficiency could even lead to more widespread use of AI.
According to Carlos Torres Diaz, head of power markets research for Rystad Energy, data centers may end up simply processing more data if they become more efficient. According to the Electric Power Research Institute (EPRI), data centers will gobble up as much as 9% of total electricity generated in the United States by the end of the decade, up from current levels, thanks to the rapid adoption of power-hungry technologies such as generative AI. For some perspective, last year the U.S. industrial sector consumed 1.02 million GWh, good for 26% of U.S. electricity consumption.
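Those two data points imply some rough totals; the quick arithmetic below uses only the figures quoted above and, purely as a simplifying assumption, holds total U.S. electricity consumption flat through 2030.

```python
# Rough arithmetic from the figures above. The only added assumption is that
# total U.S. electricity consumption stays roughly flat through 2030.
industrial_gwh = 1_020_000      # U.S. industrial sector, last year (GWh)
industrial_share = 0.26         # 26% of total U.S. consumption

total_gwh = industrial_gwh / industrial_share        # ~3.9 million GWh
datacenter_2030_gwh = 0.09 * total_gwh               # EPRI's 9% upper bound

print(f"Implied total U.S. consumption: {total_gwh:,.0f} GWh")
print(f"Data centers at 9% of that:     {datacenter_2030_gwh:,.0f} GWh")
```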