Data Centers' Environmental Impact Set To Triple By 2030: Can New Computing Architectures Save The Day?

A new report by World Fund, Ignite, and Dealroom, turns the spotlight on new material innovations and next-generation computing paradigms.


Joel Kjellgren, Data Center Manager, walks in one of the server rooms at the Facebook Data Center on November 7, 2013 in Lulea, in Swedish Lapland.

The environmental impact of data centers, mainly due to the growing demands of AI computing, is already huge, and is only bound to increase. Major tech companies are investing in renewable energy sources, but due to the frantic pace of development – AI-related demand for computing power is doubling every three to four months – that alone won’t solve the problem.
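As a quick sanity check on the pace quoted above, a doubling every three to four months compounds dramatically over a year. A minimal sketch, using only the doubling periods from the article:

```python
# Back-of-envelope: if AI compute demand doubles every 3-4 months,
# how much does it grow over 12 months? (Illustrative arithmetic only.)

def yearly_growth(doubling_months: float) -> float:
    """Growth factor over 12 months given a doubling period in months."""
    return 2 ** (12 / doubling_months)

for months in (3, 4):
    print(f"doubling every {months} months -> ~{yearly_growth(months):.0f}x per year")
```

A three-month doubling period implies roughly 16x annual growth; even the slower four-month pace implies 8x, which is why renewable build-out alone struggles to keep up.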



Currently, estimates from the International Energy Agency hold data centers responsible for 1% of the world’s electricity consumption and 0.6% of global greenhouse emissions, figures that are projected to grow to 2.8% and 1.9% respectively by 2030. Could the solution lie in reimagining computing architectures themselves, to make them more efficient and environmentally friendly? A new report by World Fund, a European VC investing in clean tech companies, Intel’s startup program Ignite, and Dealroom turns the spotlight on new material innovations that could provide alternatives to traditional silicon-based semiconductors, as well as next-generation computing paradigms, such as quantum computing and neuromorphic systems, that could help mitigate the issue. The white paper focuses specifically on European startups and scaleups operating in these sectors and provides a comprehensive overview of the latest opportunities for clean tech investors.
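Restated as growth multiples, the IEA figures above are where the roughly threefold increase in the headline comes from. A minimal sketch using only the percentages quoted:

```python
# Data centers' projected share of world electricity use and of global
# greenhouse emissions, today vs. the 2030 projection (IEA figures
# as quoted in the article).
current = {"electricity_pct": 1.0, "emissions_pct": 0.6}
projected_2030 = {"electricity_pct": 2.8, "emissions_pct": 1.9}

for key in current:
    factor = projected_2030[key] / current[key]
    print(f"{key}: {current[key]}% -> {projected_2030[key]}%  (~{factor:.1f}x)")
```

Both shares grow roughly threefold (2.8x for electricity, about 3.2x for emissions).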

Funding in this sector is soaring: the white paper highlights 65 green computing startups that have already raised $1.5 billion. However, it should be noted that none of the solutions presented is in itself a silver bullet for AI’s energy consumption, and that additional technical and geopolitical considerations beyond the scope of the report should be taken into account when considering their implementation.

Take, for instance, Gallium Nitride, or GaN, a wide-bandgap semiconductor (simply speaking, a material with a wide energy gap between the valence band, where electrons normally reside, and the conduction band, where they can move freely), which has emerged as one of the most promising alternatives to silicon. GaN semiconductors promise significant efficiency gains – up to 40–50% compared to traditional silicon power transistors – and they also enable more compact designs; for instance, Texas Instruments demonstrated GaN-based adapters that are about 50% smaller, while still achieving high power conversion efficiency. European GaN startups have collectively raised an impressive $70 million over the last five years, with companies like GaN Systems, Cambridge GaN Devices and Hexagem leading innovation in this space.
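To see what an efficiency gain of this order means in practice, here is a hedged back-of-envelope calculation. The 5% silicon conversion loss and the assumption that GaN halves it are illustrative figures chosen for the sketch, not numbers from the report:

```python
# Illustrative only: GaN's advantage is usually framed as a reduction in
# power-conversion *losses*. Assume a 1 MW IT load, a 5% conversion loss
# with silicon, and that GaN cuts those losses in half.

def energy_lost(load_kw: float, hours: float, loss_fraction: float) -> float:
    """kWh dissipated as heat in the power-conversion stage."""
    return load_kw * hours * loss_fraction

SILICON_LOSS = 0.05            # assumed 5% conversion loss with silicon
GAN_LOSS = SILICON_LOSS * 0.5  # assumed: GaN halves conversion losses

annual_hours = 24 * 365
for name, loss in (("silicon", SILICON_LOSS), ("GaN", GAN_LOSS)):
    kwh = energy_lost(load_kw=1000, hours=annual_hours, loss_fraction=loss)
    print(f"{name}: {kwh:,.0f} kWh lost per year on a 1 MW load")
```

Under these assumptions a single 1 MW facility saves on the order of 200 MWh per year in conversion losses alone, before counting the knock-on savings in cooling.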

This clearly shows that there is a perceived market opportunity, as well as a potential benefit for the environment, as the widespread adoption of GaN could cut global CO2 emissions by up to 2.6 gigatons annually by 2050. However, scaling up GaN technology comes with manufacturing and supply-chain limitations.

Wide-bandgap semiconductors like GaN (or Silicon Carbide, another alternative to silicon mentioned in the report) are currently more expensive to produce, can be prone to crystal defects, and have a less mature supply chain​. In fact, the raw material gallium is a strategic choke point – China produces an estimated 98% of the world’s gallium supply​, raising concerns about resource constraints. These factors make it challenging to ramp up GaN production quickly, which in turn limits how rapidly the efficiency benefits can be realized at scale.

Of course, these challenges can, and probably will, be overcome over time. The question is: when? As things stand, rather than adopting innovations that might require rebuilding or restructuring entire supply chains, in the short to mid-term big tech companies are more likely to keep pace with the rushing advances in artificial intelligence by relying more on fossil-fuel or nuclear-generated energy. In the U.S., for instance, there are already cases of coal-fired power plants whose closure has been postponed, and there are plans to build more than 200 natural-gas power plants in the next few years. This is not to downplay the promise of new material innovations, but neither should we create overblown expectations.

Similar caveats apply to next-generation computing paradigms, such as quantum computing, in which Europe is a global leader. The continent’s quantum startups raised $781M in 2023 – triple the amount raised in North America. The efficiency gains are potentially revolutionary: quantum systems can solve certain problems 100 million times faster than classical computers, translating directly to massive energy savings.

Experts estimate that quantum-enabled innovations across various sectors could reduce global emissions by up to 7 gigatons annually by 2035, representing an 18% reduction in total global emissions. However, it’s unclear whether and when quantum computing could become a general-purpose technology ready to speed up arbitrary AI workloads. Generally speaking, today’s quantum computers are highly specialized machines that excel at specific optimization or simulation tasks; they also require complex cryogenics and control systems that themselves consume significant energy.

It is true, though, that as the report notes, “quantum simulations are already accelerating breakthroughs” in fields such as “battery chemistry”. “These advances lay the groundwork for next-generation high-performance batteries, which, if scaled effectively, could mitigate up to 14 gigatons of emissions annually by 2035,” the authors say. And of course, it’s always possible that new breakthroughs change the rules of the game and quantum computing quickly becomes an all-purpose solution, applicable to all sorts of situations.

I reached out to the report’s authors for comment on a possible timeline for implementation of the new materials and techniques highlighted in the report. “While it is true that some solutions explored in our report (e.g., biocomputing) are still in their early research and development phases, there are definitely solutions that can help limit the damage in the short to mid-term. Technologies such as GaN and SiC are already available today, and as they scale, increased usage will help drive immediate efficiency savings and decreased power consumption. Software efficiency improvements, such as the model optimization recently seen with DeepSeek, also have the potential to make a more immediate impact.

We also expect significant near-term positive impact from both quantum and optical computing,” a spokesperson for World Fund said. They also emphasized the need to deliver cleaner sources of energy alongside next-gen computing innovations, pointing to Google’s collaboration with Fervo to provide clean baseload power for its data centers, and to Amazon’s investment in X-energy, which aims to provide another source of clean baseload power from small modular reactors, as positive examples. There’s much more information in the white paper than can be covered here; for instance, a chapter discusses software-based approaches – like data compression and server virtualization – as ways to curb energy use.

An EPA report has found that aggressive virtualization and consolidation can reduce total server energy consumption by as much as 80% in certain data center setups. All in all, though, it seems fair to say that there’s no magic plan, no silver bullet, that will solve AI’s environmental footprint overnight; instead, a combination of hardware advances, software optimizations, and clean energy transitions offers the best way forward. It is also clear that, given the scale of the challenge and the pace at which data center energy consumption is increasing, even imperfect steps today are better than perfect solutions tomorrow.
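The intuition behind a figure like the EPA’s is consolidation arithmetic: many lightly loaded physical servers still draw substantial idle power, so packing the same work onto fewer, busier hosts cuts the total draw sharply. A sketch under assumed (not EPA-sourced) numbers:

```python
# Consolidation arithmetic, with illustrative assumptions: 100 servers at
# 10% utilization, each drawing 200 W idle and 400 W at peak, with power
# scaling linearly between the two.
import math

def consolidated_power(n_servers: int, util: float, target_util: float,
                       idle_w: float, peak_w: float) -> tuple[int, float]:
    """Hosts needed and total watts after consolidating work to target_util."""
    hosts = math.ceil(n_servers * util / target_util)
    watts = hosts * (idle_w + (peak_w - idle_w) * target_util)
    return hosts, watts

before_hosts, before_w = consolidated_power(100, 0.10, 0.10, 200, 400)  # status quo
after_hosts, after_w = consolidated_power(100, 0.10, 0.60, 200, 400)    # consolidated

print(f"before: {before_hosts} hosts, {before_w:,.0f} W")
print(f"after:  {after_hosts} hosts, {after_w:,.0f} W "
      f"({1 - after_w / before_w:.0%} reduction)")
```

With these assumed numbers, 100 mostly idle machines collapse to 17 busier ones and total draw falls by about three quarters, the same order of magnitude as the 80% figure quoted above.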
