ChatGPT’s resource demands are getting out of control

Using the chatbot to write a 100-word email consumes enough electricity to power more than a dozen LED lightbulbs for an hour.


It’s no secret that the growth of generative AI has demanded ever-increasing amounts of water and electricity, but a new study from The Washington Post and researchers at the University of California, Riverside shows just how many resources OpenAI’s chatbot needs in order to perform even its most basic functions. In terms of water usage, the amount ChatGPT needs to write a 100-word email depends on the state and the user’s proximity to OpenAI’s nearest data center. The scarcer water is in a given region, and the cheaper the electricity, the more likely the data center is to rely on electrically powered air conditioning units instead.

In Texas, for example, the chatbot consumes an estimated 235 milliliters of water to generate a single 100-word email. That same email drafted in Washington, on the other hand, would require 1,408 milliliters (nearly a liter and a half). Data centers have grown larger and more densely packed with the rise of generative AI technology, to the point that air-based cooling systems struggle to keep up.
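To put those per-email figures in perspective, here is a quick back-of-envelope sketch. The 235 mL and 1,408 mL estimates come from the study quoted above; the one-email-per-week usage pattern is purely an illustrative assumption.

```python
# Per-email water estimates quoted from the study (milliliters per 100-word email).
ML_PER_EMAIL = {"Texas": 235, "Washington": 1408}

EMAILS_PER_WEEK = 1   # assumed usage pattern, for illustration only
WEEKS_PER_YEAR = 52

for state, ml in ML_PER_EMAIL.items():
    # Convert milliliters/email into liters/year for one weekly email.
    liters_per_year = ml * EMAILS_PER_WEEK * WEEKS_PER_YEAR / 1000
    print(f"{state}: {ml} mL/email -> ~{liters_per_year:.1f} L/year")
```

Under those assumptions, a single weekly email works out to roughly 12 liters a year in Texas versus about 73 liters in Washington, a sixfold gap driven entirely by regional cooling strategy.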

This is why many AI data centers have switched over to liquid-cooling schemes that pump huge amounts of water past the server stacks to draw off thermal energy, then out to a cooling tower where the collected heat dissipates. ChatGPT’s electrical requirements are nothing to sneeze at, either. According to The Washington Post, using ChatGPT to write that 100-word email consumes enough electricity to power more than a dozen LED lightbulbs for an hour.

If even one-tenth of Americans used ChatGPT to write that email once a week for a year, the process would use the same amount of power that every single Washington, D.C., household does in 20 days.
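The scale of that claim can be sketched with rough arithmetic. Only the lightbulb comparison and the one-tenth-of-Americans scenario come from the article; the bulb count (14, for "more than a dozen"), the 10 W LED wattage, and the 330 million US population are assumptions made here for illustration.

```python
# Assumed values: "more than a dozen" LED bulbs taken as 14, at a typical 10 W each.
BULBS = 14
WATTS_PER_BULB = 10
HOURS = 1

# Implied energy per 100-word email, in kWh.
kwh_per_email = BULBS * WATTS_PER_BULB * HOURS / 1000

# Scale to one-tenth of Americans writing one such email per week for a year.
US_POPULATION = 330_000_000  # rough assumption
users = US_POPULATION // 10
emails_per_year = 52

total_gwh = users * emails_per_year * kwh_per_email / 1_000_000
print(f"~{kwh_per_email:.2f} kWh per email; ~{total_gwh:.0f} GWh per year at that usage")
```

Under those assumptions, each email implies on the order of 0.14 kWh, and the hypothetical weekly habit adds up to roughly 240 GWh a year, which gives a sense of why the comparison to weeks of household consumption in a city the size of D.C. is plausible.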

D.C. is home to roughly 670,000 people.

This is not an issue that will be resolved any time soon, and it will likely get much worse before it gets better. Meta, for example, needed 22 million liters of water to train its latest Llama 3.1 models.

Google’s data centers in The Dalles, Oregon, were found to consume nearly a quarter of all the water available in the town, according to court records, while xAI’s new Memphis supercluster is already demanding 150 MW of electricity — enough to power as many as 30,000 homes — from the local utility, Memphis Light, Gas and Water.